44071 1727204587.46748: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-MVC
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
44071 1727204587.47063: Added group all to inventory
44071 1727204587.47067: Added group ungrouped to inventory
44071 1727204587.47070: Group all now contains ungrouped
44071 1727204587.47073: Examining possible inventory source: /tmp/network-jrl/inventory-0Xx.yml
44071 1727204587.57757: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
44071 1727204587.57807: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
44071 1727204587.57825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
44071 1727204587.57872: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
44071 1727204587.57927: Loaded config def from plugin (inventory/script)
44071 1727204587.57929: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
44071 1727204587.57960: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
44071 1727204587.58026: Loaded config def from plugin (inventory/yaml)
44071 1727204587.58028: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
44071 1727204587.58096: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
44071 1727204587.58418: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
44071 1727204587.58420: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
44071 1727204587.58423: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
44071 1727204587.58427: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
44071 1727204587.58431: Loading data from /tmp/network-jrl/inventory-0Xx.yml
44071 1727204587.58482: /tmp/network-jrl/inventory-0Xx.yml was not parsable by auto
44071 1727204587.58528: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
44071 1727204587.58562: Loading data from /tmp/network-jrl/inventory-0Xx.yml
44071 1727204587.58623: group all already in inventory
44071 1727204587.58628: set inventory_file for managed-node1
44071 1727204587.58631: set inventory_dir for managed-node1
44071 1727204587.58632: Added host managed-node1 to inventory
44071 1727204587.58636: Added host managed-node1 to group all
44071 1727204587.58637: set ansible_host for managed-node1
44071 1727204587.58637: set ansible_ssh_extra_args for managed-node1
44071 1727204587.58640: set inventory_file for managed-node2
44071 1727204587.58642: set inventory_dir for managed-node2
44071 1727204587.58642: Added host managed-node2 to inventory
44071 1727204587.58643: Added host managed-node2 to group
all 44071 1727204587.58644: set ansible_host for managed-node2 44071 1727204587.58644: set ansible_ssh_extra_args for managed-node2 44071 1727204587.58646: set inventory_file for managed-node3 44071 1727204587.58647: set inventory_dir for managed-node3 44071 1727204587.58648: Added host managed-node3 to inventory 44071 1727204587.58649: Added host managed-node3 to group all 44071 1727204587.58649: set ansible_host for managed-node3 44071 1727204587.58650: set ansible_ssh_extra_args for managed-node3 44071 1727204587.58652: Reconcile groups and hosts in inventory. 44071 1727204587.58654: Group ungrouped now contains managed-node1 44071 1727204587.58656: Group ungrouped now contains managed-node2 44071 1727204587.58657: Group ungrouped now contains managed-node3 44071 1727204587.58720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 44071 1727204587.58814: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 44071 1727204587.58851: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 44071 1727204587.58873: Loaded config def from plugin (vars/host_group_vars) 44071 1727204587.58875: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 44071 1727204587.58880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 44071 1727204587.58886: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 44071 1727204587.58920: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 44071 1727204587.59186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204587.59260: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 44071 1727204587.59288: Loaded config def from plugin (connection/local) 44071 1727204587.59290: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 44071 1727204587.59745: Loaded config def from plugin (connection/paramiko_ssh) 44071 1727204587.59748: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 44071 1727204587.60401: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 44071 1727204587.60432: Loaded config def from plugin (connection/psrp) 44071 1727204587.60436: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 44071 1727204587.60925: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 44071 1727204587.60955: Loaded config def from plugin (connection/ssh) 44071 1727204587.60957: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 44071 1727204587.62472: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 44071 1727204587.62503: Loaded config def from plugin (connection/winrm) 44071 1727204587.62506: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 44071 1727204587.62531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 44071 1727204587.62586: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 44071 1727204587.62636: Loaded config def from plugin (shell/cmd) 44071 1727204587.62638: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 44071 1727204587.62658: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 44071 1727204587.62702: Loaded config def from plugin (shell/powershell) 44071 1727204587.62703: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 44071 1727204587.62748: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 44071 1727204587.62871: Loaded config def from plugin (shell/sh) 44071 1727204587.62873: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 44071 1727204587.62898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 44071 1727204587.62985: Loaded config def from plugin (become/runas) 44071 1727204587.62986: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 44071 1727204587.63111: Loaded config def from plugin (become/su) 44071 1727204587.63112: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 44071 1727204587.63222: Loaded config def from plugin (become/sudo) 44071 1727204587.63224: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 44071 1727204587.63254: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml 44071 1727204587.63501: in VariableManager get_vars() 44071 1727204587.63517: done with get_vars() 44071 1727204587.63625: trying /usr/local/lib/python3.12/site-packages/ansible/modules 44071 1727204587.65810: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 44071 1727204587.65899: in VariableManager get_vars() 44071 1727204587.65903: done with get_vars() 44071 1727204587.65905: variable 'playbook_dir' from source: magic vars 44071 1727204587.65906: variable 'ansible_playbook_python' from source: magic vars 44071 1727204587.65906: variable 'ansible_config_file' from 
source: magic vars 44071 1727204587.65907: variable 'groups' from source: magic vars 44071 1727204587.65907: variable 'omit' from source: magic vars 44071 1727204587.65908: variable 'ansible_version' from source: magic vars 44071 1727204587.65908: variable 'ansible_check_mode' from source: magic vars 44071 1727204587.65909: variable 'ansible_diff_mode' from source: magic vars 44071 1727204587.65909: variable 'ansible_forks' from source: magic vars 44071 1727204587.65910: variable 'ansible_inventory_sources' from source: magic vars 44071 1727204587.65910: variable 'ansible_skip_tags' from source: magic vars 44071 1727204587.65911: variable 'ansible_limit' from source: magic vars 44071 1727204587.65911: variable 'ansible_run_tags' from source: magic vars 44071 1727204587.65912: variable 'ansible_verbosity' from source: magic vars 44071 1727204587.65942: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml 44071 1727204587.66399: in VariableManager get_vars() 44071 1727204587.66412: done with get_vars() 44071 1727204587.66451: in VariableManager get_vars() 44071 1727204587.66461: done with get_vars() 44071 1727204587.66495: in VariableManager get_vars() 44071 1727204587.66504: done with get_vars() 44071 1727204587.66537: in VariableManager get_vars() 44071 1727204587.66548: done with get_vars() 44071 1727204587.66584: in VariableManager get_vars() 44071 1727204587.66594: done with get_vars() 44071 1727204587.66625: in VariableManager get_vars() 44071 1727204587.66635: done with get_vars() 44071 1727204587.66680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 44071 1727204587.66690: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 44071 1727204587.66876: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 44071 1727204587.66994: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 44071 1727204587.66997: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-MVC/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 44071 1727204587.67020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 44071 1727204587.67041: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 44071 1727204587.67155: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 44071 1727204587.67199: Loaded config def from plugin (callback/default) 44071 1727204587.67202: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 44071 1727204587.68148: Loaded config def from plugin 
(callback/junit) 44071 1727204587.68150: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 44071 1727204587.68191: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 44071 1727204587.68237: Loaded config def from plugin (callback/minimal) 44071 1727204587.68239: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 44071 1727204587.68270: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 44071 1727204587.68314: Loaded config def from plugin (callback/tree) 44071 1727204587.68316: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 44071 1727204587.68403: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 44071 1727204587.68405: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-MVC/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
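The callback section above shows ansible.builtin.debug being redirected to the collection's ansible.posix.debug, and then default, minimal, and oneline being skipped because a stdout callback is already active. As a rough illustration only (the plugin names are taken from this log, but the selection code below is a made-up sketch, not Ansible's plugin loader):

from dataclasses import dataclass

@dataclass
class CallbackPlugin:
    name: str
    is_stdout: bool  # whether the plugin wants to own stdout output

def select_callbacks(candidates):
    # Keep every non-stdout callback; keep only the first stdout-capable one.
    active, have_stdout = [], False
    for plugin in candidates:
        if plugin.is_stdout:
            if have_stdout:
                print(f"Skipping callback '{plugin.name}', as we already have a stdout callback.")
                continue
            have_stdout = True
        active.append(plugin)
    return active

plugins = [
    CallbackPlugin("ansible.posix.debug", True),        # configured stdout callback in this run
    CallbackPlugin("default", True),
    CallbackPlugin("minimal", True),
    CallbackPlugin("oneline", True),
    CallbackPlugin("junit", False),
    CallbackPlugin("tree", False),
    CallbackPlugin("ansible.posix.profile_tasks", False),
]
for kept in select_callbacks(plugins):
    print("active callback:", kept.name)

Running this prints skip messages in the same shape as the log lines above, which is the point: only one plugin is allowed to format the console output, while the rest (junit, tree, profile_tasks) run as side-channel callbacks.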
PLAYBOOK: tests_states_nm.yml **************************************************
2 plays in /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml
44071 1727204587.68432: in VariableManager get_vars()
44071 1727204587.68445: done with get_vars()
44071 1727204587.68450: in VariableManager get_vars()
44071 1727204587.68455: done with get_vars()
44071 1727204587.68458: variable 'omit' from source: magic vars
44071 1727204587.68487: in VariableManager get_vars()
44071 1727204587.68497: done with get_vars()
44071 1727204587.68514: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_states.yml' with nm as provider] ***********
44071 1727204587.70928: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
44071 1727204587.70992: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
44071 1727204587.71020: getting the remaining hosts for this loop
44071 1727204587.71022: done getting the remaining hosts for this loop
44071 1727204587.71024: getting the next task for host managed-node2
44071 1727204587.71027: done getting next task for host managed-node2
44071 1727204587.71029: ^ task is: TASK: Gathering Facts
44071 1727204587.71030: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
44071 1727204587.71032: getting variables
44071 1727204587.71032: in VariableManager get_vars()
44071 1727204587.71044: Calling all_inventory to load vars for managed-node2
44071 1727204587.71045: Calling groups_inventory to load vars for managed-node2
44071 1727204587.71047: Calling all_plugins_inventory to load vars for managed-node2
44071 1727204587.71058: Calling all_plugins_play to load vars for managed-node2
44071 1727204587.71073: Calling groups_plugins_inventory to load vars for managed-node2
44071 1727204587.71076: Calling groups_plugins_play to load vars for managed-node2
44071 1727204587.71103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
44071 1727204587.71142: done with get_vars()
44071 1727204587.71148: done getting variables
44071 1727204587.71202: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:6
Tuesday 24 September 2024 15:03:07 -0400 (0:00:00.028) 0:00:00.028 *****
44071 1727204587.71223: entering _queue_task() for managed-node2/gather_facts
44071 1727204587.71224: Creating lock for gather_facts
44071 1727204587.71531: worker is 1 (out of 1 available)
44071 1727204587.71545: exiting _queue_task() for managed-node2/gather_facts
44071 1727204587.71563: done queuing things up, now waiting for results queue to drain
44071 1727204587.71567: waiting for pending results...
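The _queue_task() entries just above ("worker is 1 (out of 1 available)", then "waiting for pending results...") show the linear strategy handing the Gathering Facts task to a forked worker and blocking on a results queue. A loose stand-in for that producer/worker/results shape, using only the standard library; the host name, task name, and forks=1 come from this run, everything else is invented for illustration:

import queue
import threading

def worker(task_q, result_q):
    # Each worker pulls (host, task) pairs, "executes" them, and reports back.
    while True:
        item = task_q.get()
        if item is None:                      # sentinel: no more work
            task_q.task_done()
            break
        host, task = item
        result_q.put((host, task, "ok"))      # stand-in for a real task result
        task_q.task_done()

def run_play(hosts, tasks, forks=1):
    task_q, result_q = queue.Queue(), queue.Queue()
    workers = [threading.Thread(target=worker, args=(task_q, result_q)) for _ in range(forks)]
    for w in workers:
        w.start()
    for task in tasks:                        # linear strategy: one task across all hosts, then the next
        for host in hosts:
            task_q.put((host, task))
        for _ in hosts:                       # wait for pending results before moving on
            print(result_q.get())
    for _ in workers:                         # shut the workers down
        task_q.put(None)
    for w in workers:
        w.join()

run_play(["managed-node2"], ["Gathering Facts"], forks=1)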
44071 1727204587.71702: running TaskExecutor() for managed-node2/TASK: Gathering Facts 44071 1727204587.71768: in run() - task 127b8e07-fff9-c964-7471-00000000001b 44071 1727204587.71780: variable 'ansible_search_path' from source: unknown 44071 1727204587.71811: calling self._execute() 44071 1727204587.71900: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204587.71905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204587.71914: variable 'omit' from source: magic vars 44071 1727204587.71995: variable 'omit' from source: magic vars 44071 1727204587.72016: variable 'omit' from source: magic vars 44071 1727204587.72047: variable 'omit' from source: magic vars 44071 1727204587.72087: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204587.72116: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204587.72132: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204587.72153: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204587.72160: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204587.72186: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204587.72189: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204587.72193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204587.72268: Set connection var ansible_connection to ssh 44071 1727204587.72277: Set connection var ansible_timeout to 10 44071 1727204587.72281: Set connection var ansible_pipelining to False 44071 1727204587.72287: Set connection var ansible_shell_type to sh 44071 1727204587.72293: Set connection var ansible_shell_executable to /bin/sh 44071 1727204587.72300: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204587.72317: variable 'ansible_shell_executable' from source: unknown 44071 1727204587.72320: variable 'ansible_connection' from source: unknown 44071 1727204587.72323: variable 'ansible_module_compression' from source: unknown 44071 1727204587.72326: variable 'ansible_shell_type' from source: unknown 44071 1727204587.72328: variable 'ansible_shell_executable' from source: unknown 44071 1727204587.72331: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204587.72337: variable 'ansible_pipelining' from source: unknown 44071 1727204587.72340: variable 'ansible_timeout' from source: unknown 44071 1727204587.72342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204587.72495: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=True, class_only=False) 44071 1727204587.72505: variable 'omit' from source: magic vars 44071 1727204587.72508: starting attempt loop 44071 1727204587.72511: running the handler 44071 1727204587.72525: variable 'ansible_facts' from source: unknown 44071 1727204587.72541: _low_level_execute_command(): starting 44071 1727204587.72549: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204587.73338: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 
1727204587.73357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204587.73383: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204587.73494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204587.75276: stdout chunk (state=3): >>>/root <<< 44071 1727204587.75582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204587.75586: stdout chunk (state=3): >>><<< 44071 1727204587.75588: stderr chunk (state=3): >>><<< 44071 1727204587.75591: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204587.75595: _low_level_execute_command(): starting 44071 1727204587.75598: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204587.754917-44077-186011612748606 `" && echo ansible-tmp-1727204587.754917-44077-186011612748606="` echo /root/.ansible/tmp/ansible-tmp-1727204587.754917-44077-186011612748606 `" ) && sleep 0' 44071 1727204587.76245: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204587.76270: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204587.76352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204587.76406: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204587.76434: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204587.76456: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204587.76572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204587.78575: stdout chunk (state=3): >>>ansible-tmp-1727204587.754917-44077-186011612748606=/root/.ansible/tmp/ansible-tmp-1727204587.754917-44077-186011612748606 <<< 44071 1727204587.78870: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204587.78875: stderr chunk (state=3): >>><<< 44071 1727204587.78879: stdout chunk (state=3): >>><<< 44071 1727204587.78881: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204587.754917-44077-186011612748606=/root/.ansible/tmp/ansible-tmp-1727204587.754917-44077-186011612748606 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204587.78883: variable 'ansible_module_compression' from source: unknown 44071 1727204587.79073: ANSIBALLZ: Using generic lock for ansible.legacy.setup 44071 1727204587.79076: ANSIBALLZ: Acquiring lock 44071 1727204587.79078: ANSIBALLZ: Lock acquired: 140077513493248 44071 1727204587.79080: ANSIBALLZ: Creating module 44071 1727204588.16156: ANSIBALLZ: Writing module into payload 44071 1727204588.16307: ANSIBALLZ: Writing module 44071 1727204588.16331: ANSIBALLZ: Renaming module 44071 1727204588.16338: ANSIBALLZ: Done creating module 44071 1727204588.16380: variable 'ansible_facts' from source: unknown 44071 
1727204588.16384: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204588.16391: _low_level_execute_command(): starting 44071 1727204588.16397: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 44071 1727204588.16910: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204588.16914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204588.16918: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204588.16921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204588.16979: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204588.16983: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204588.16985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204588.17068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204588.18869: stdout chunk (state=3): >>>PLATFORM <<< 44071 1727204588.18953: stdout chunk (state=3): >>>Linux <<< 44071 1727204588.18957: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 <<< 44071 1727204588.18960: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 44071 1727204588.19108: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204588.19184: stderr chunk (state=3): >>><<< 44071 1727204588.19188: stdout chunk (state=3): >>><<< 44071 1727204588.19203: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204588.19216 [managed-node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 44071 1727204588.19263: _low_level_execute_command(): starting 44071 1727204588.19267: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 44071 1727204588.19602: Sending initial data 44071 1727204588.19605: Sent initial data (1181 bytes) 44071 1727204588.20012: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204588.20040: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204588.20055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204588.20076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204588.20092: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204588.20102: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204588.20114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204588.20144: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204588.20256: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204588.20269: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204588.20373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204588.24047: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 (Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} <<< 44071 1727204588.24630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204588.24637: stdout chunk (state=3): >>><<< 44071 1727204588.24640: stderr chunk (state=3): >>><<< 
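Interpreter discovery above runs a single /bin/sh command that prints PLATFORM, the uname output, FOUND, the resolved candidate python paths, and ENDFOUND, after which the controller logs found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3']. A small sketch of parsing that marker-delimited block (the sample text is lifted from this log; the parser itself is illustrative, not Ansible's own code):

# Parse the PLATFORM / FOUND ... ENDFOUND block emitted by the discovery command.
SAMPLE = """PLATFORM
Linux
FOUND
/usr/bin/python3.12
/usr/bin/python3
/usr/bin/python3
ENDFOUND
"""

def parse_discovery(output: str):
    lines = [ln.strip() for ln in output.splitlines() if ln.strip()]
    platform = lines[lines.index("PLATFORM") + 1]
    start = lines.index("FOUND") + 1
    end = lines.index("ENDFOUND")
    return platform, lines[start:end]

platform, interpreters = parse_discovery(SAMPLE)
print(platform)        # Linux
print(interpreters)    # ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3']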
44071 1727204588.24775: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 (Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204588.25179: variable 'ansible_facts' from source: unknown 44071 1727204588.25183: variable 'ansible_facts' from source: unknown 44071 1727204588.25186: variable 'ansible_module_compression' from source: unknown 44071 1727204588.25189: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 44071 1727204588.25206: variable 'ansible_facts' from source: unknown 44071 1727204588.25637: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204587.754917-44077-186011612748606/AnsiballZ_setup.py 44071 1727204588.25967: Sending initial data 44071 1727204588.25978: Sent initial data (153 bytes) 44071 1727204588.26923: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204588.27053: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204588.27089: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204588.27199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204588.28837: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 44071 1727204588.28860: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204588.28950: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204588.29059: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpu5i4guvx /root/.ansible/tmp/ansible-tmp-1727204587.754917-44077-186011612748606/AnsiballZ_setup.py <<< 44071 1727204588.29084: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204587.754917-44077-186011612748606/AnsiballZ_setup.py" <<< 44071 1727204588.29122: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpu5i4guvx" to remote "/root/.ansible/tmp/ansible-tmp-1727204587.754917-44077-186011612748606/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204587.754917-44077-186011612748606/AnsiballZ_setup.py" <<< 44071 1727204588.30853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204588.30942: stderr chunk (state=3): >>><<< 44071 1727204588.30946: stdout chunk (state=3): >>><<< 44071 1727204588.30968: done transferring module to remote 44071 1727204588.30983: _low_level_execute_command(): starting 44071 1727204588.30988: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204587.754917-44077-186011612748606/ /root/.ansible/tmp/ansible-tmp-1727204587.754917-44077-186011612748606/AnsiballZ_setup.py && sleep 0' 44071 1727204588.31562: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204588.31569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204588.31572: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204588.31574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204588.31644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204588.31727: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204588.33678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204588.33683: stdout chunk (state=3): >>><<< 44071 1727204588.33807: stderr chunk (state=3): >>><<< 44071 1727204588.33812: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204588.33815: _low_level_execute_command(): starting 44071 1727204588.33817: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204587.754917-44077-186011612748606/AnsiballZ_setup.py && sleep 0' 44071 1727204588.34479: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204588.34496: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204588.34518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204588.34540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204588.34639: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204588.34667: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204588.34687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204588.34714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204588.34856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204588.37165: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 44071 1727204588.37209: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 44071 1727204588.37292: stdout chunk (state=3): >>>import '_io' # <<< 44071 1727204588.37320: stdout chunk (state=3): >>>import 'marshal' # <<< 44071 1727204588.37342: stdout chunk (state=3): >>>import 'posix' # <<< 44071 1727204588.37381: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 44071 1727204588.37403: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 44071 1727204588.37504: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 44071 1727204588.37515: stdout chunk (state=3): >>>import 'codecs' # <<< 44071 1727204588.37556: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 44071 1727204588.37589: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bd118530> <<< 44071 1727204588.37619: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bd0e7b30> <<< 44071 1727204588.37638: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 44071 1727204588.37669: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bd11aab0> import '_signal' # <<< 44071 1727204588.37696: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 44071 1727204588.37707: stdout chunk (state=3): >>>import 'io' # <<< 44071 1727204588.37745: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 44071 1727204588.37851: stdout chunk (state=3): >>>import '_collections_abc' # <<< 44071 1727204588.37875: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 44071 1727204588.37914: stdout chunk (state=3): >>>import 'os' # <<< 44071 1727204588.37942: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 44071 1727204588.37970: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' <<< 44071 1727204588.37987: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: 
'/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 44071 1727204588.38024: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 44071 1727204588.38041: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcf0d190> <<< 44071 1727204588.38133: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 44071 1727204588.38136: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcf0e090> <<< 44071 1727204588.38162: stdout chunk (state=3): >>>import 'site' # <<< 44071 1727204588.38191: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 44071 1727204588.38615: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 44071 1727204588.38644: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204588.38673: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 44071 1727204588.38703: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 44071 1727204588.38722: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 44071 1727204588.38755: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 44071 1727204588.38782: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcf4be60> <<< 44071 1727204588.38815: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 44071 1727204588.38847: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcf4bf20> <<< 44071 1727204588.38851: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 44071 1727204588.38886: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 44071 1727204588.38901: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 44071 1727204588.38969: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204588.38975: stdout chunk (state=3): >>>import 'itertools' # <<< 
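The long run of "import ... #" lines in these stdout chunks is produced by the PYTHONVERBOSE=1 environment set on the AnsiballZ_setup.py invocation earlier; it is the same trace CPython emits for its -v switch. A quick local reproduction of that kind of trace (the module imported here is chosen arbitrarily):

import subprocess
import sys

# Run a trivial import under -v (equivalent to PYTHONVERBOSE=1); CPython writes
# the verbose import trace to stderr, so pick the "import ..." lines out of it.
proc = subprocess.run(
    [sys.executable, "-v", "-c", "import json"],
    capture_output=True,
    text=True,
)
for line in proc.stderr.splitlines():
    if line.startswith("import "):
        print(line)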
44071 1727204588.39032: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcf83830> <<< 44071 1727204588.39035: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 44071 1727204588.39051: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcf83ec0> <<< 44071 1727204588.39060: stdout chunk (state=3): >>>import '_collections' # <<< 44071 1727204588.39104: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcf63b30> <<< 44071 1727204588.39115: stdout chunk (state=3): >>>import '_functools' # <<< 44071 1727204588.39147: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcf61250> <<< 44071 1727204588.39242: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcf49010> <<< 44071 1727204588.39289: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 44071 1727204588.39293: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 44071 1727204588.39321: stdout chunk (state=3): >>>import '_sre' # <<< 44071 1727204588.39325: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 44071 1727204588.39347: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 44071 1727204588.39379: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 44071 1727204588.39403: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcfa7830> <<< 44071 1727204588.39431: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcfa6450> <<< 44071 1727204588.39455: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcf62120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcfa4bf0> <<< 44071 1727204588.39520: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 44071 1727204588.39548: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcfd88c0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcf48290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from 
'/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 44071 1727204588.39598: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204588.39621: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcfd8d70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcfd8c20> <<< 44071 1727204588.39642: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcfd8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcf46db0> <<< 44071 1727204588.39667: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 44071 1727204588.39694: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 44071 1727204588.39739: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 44071 1727204588.39750: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcfd96a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcfd9370> import 'importlib.machinery' # <<< 44071 1727204588.39783: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 44071 1727204588.39828: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcfda5a0> import 'importlib.util' # <<< 44071 1727204588.39832: stdout chunk (state=3): >>>import 'runpy' # <<< 44071 1727204588.39852: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 44071 1727204588.39874: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 44071 1727204588.39926: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcff47d0> <<< 44071 1727204588.39930: stdout chunk (state=3): >>>import 'errno' # <<< 44071 1727204588.39955: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcff5f10> <<< 44071 1727204588.39989: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 44071 1727204588.40020: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 44071 1727204588.40046: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcff6db0> <<< 44071 1727204588.40076: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204588.40113: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcff73e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcff6300> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 44071 1727204588.40116: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 44071 1727204588.40164: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204588.40169: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcff7e60> <<< 44071 1727204588.40181: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcff7590> <<< 44071 1727204588.40217: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcfda600> <<< 44071 1727204588.40229: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 44071 1727204588.40254: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 44071 1727204588.40282: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 44071 1727204588.40293: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 44071 1727204588.40335: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcd2bdd0> <<< 44071 1727204588.40369: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 44071 1727204588.40405: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcd58830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcd58590> <<< 44071 1727204588.40414: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204588.40468: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcd587d0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcd589e0> <<< 44071 1727204588.40472: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcd29f70> <<< 44071 1727204588.40488: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 44071 1727204588.40589: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 44071 1727204588.40612: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcd5a0f0> <<< 44071 1727204588.40635: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcd58d70> <<< 44071 1727204588.40683: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcfdacf0> <<< 44071 1727204588.40686: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 44071 1727204588.40743: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204588.40762: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 44071 1727204588.40802: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 44071 1727204588.40829: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcd7e480> <<< 44071 1727204588.40889: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 44071 1727204588.40913: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 44071 1727204588.40936: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 44071 1727204588.41007: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcd9a5d0> <<< 44071 1727204588.41011: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 44071 1727204588.41047: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 44071 1727204588.41103: stdout chunk (state=3): >>>import 'ntpath' # <<< 44071 1727204588.41134: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcdcf380> <<< 44071 1727204588.41157: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 44071 1727204588.41184: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 44071 1727204588.41211: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 44071 1727204588.41251: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 44071 1727204588.41348: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcdf9b20> <<< 44071 1727204588.41417: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcdcf4a0> <<< 44071 1727204588.41483: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcd9b260> <<< 44071 1727204588.41505: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcbd4500> <<< 44071 1727204588.41525: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcd99610> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcd5b020> <<< 44071 1727204588.41695: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 44071 1727204588.41714: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7ff0bcd99730> <<< 44071 1727204588.41882: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_zgl39j0q/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 44071 1727204588.42039: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.42065: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 44071 1727204588.42110: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 44071 1727204588.42184: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 44071 1727204588.42226: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches 
/usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcc3e180> <<< 44071 1727204588.42229: stdout chunk (state=3): >>>import '_typing' # <<< 44071 1727204588.42429: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcc15070> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcc14200> # zipimport: zlib available <<< 44071 1727204588.42476: stdout chunk (state=3): >>>import 'ansible' # <<< 44071 1727204588.42483: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.42525: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 44071 1727204588.42529: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.44089: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.45389: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcc17590> <<< 44071 1727204588.45394: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204588.45460: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 44071 1727204588.45471: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204588.45499: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcc71be0> <<< 44071 1727204588.45503: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcc71970> <<< 44071 1727204588.45549: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcc71280> <<< 44071 1727204588.45553: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 44071 1727204588.45596: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 44071 1727204588.45599: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcc71ca0> <<< 44071 1727204588.45611: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcc3ee10> import 'atexit' # <<< 44071 1727204588.45635: stdout chunk (state=3): >>># extension module 'grp' loaded from 
'/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcc72960> <<< 44071 1727204588.45675: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcc72ba0> <<< 44071 1727204588.45693: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 44071 1727204588.45740: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 44071 1727204588.45744: stdout chunk (state=3): >>>import '_locale' # <<< 44071 1727204588.45792: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcc730b0> import 'pwd' # <<< 44071 1727204588.45812: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 44071 1727204588.45838: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 44071 1727204588.45883: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcad4e90> <<< 44071 1727204588.45910: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcad6ab0> <<< 44071 1727204588.45929: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 44071 1727204588.45949: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 44071 1727204588.45992: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcad73e0> <<< 44071 1727204588.46019: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 44071 1727204588.46043: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 44071 1727204588.46060: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcad85c0> <<< 44071 1727204588.46106: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 44071 1727204588.46113: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 44071 1727204588.46128: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 44071 1727204588.46180: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcadb020> <<< 44071 
1727204588.46225: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204588.46256: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcadb140> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcad92e0> <<< 44071 1727204588.46259: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 44071 1727204588.46284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 44071 1727204588.46314: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 44071 1727204588.46328: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 44071 1727204588.46374: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 44071 1727204588.46395: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcadef60> <<< 44071 1727204588.46409: stdout chunk (state=3): >>>import '_tokenize' # <<< 44071 1727204588.46488: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcadda30> <<< 44071 1727204588.46506: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcadd790> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 44071 1727204588.46583: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcadfd70> <<< 44071 1727204588.46617: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcad97f0> <<< 44071 1727204588.46638: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcb23020> <<< 44071 1727204588.46676: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 44071 1727204588.46729: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcb23290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 44071 
1727204588.46733: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py <<< 44071 1727204588.46747: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 44071 1727204588.46775: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcb28d40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcb28b00> <<< 44071 1727204588.46799: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 44071 1727204588.46887: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 44071 1727204588.46948: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204588.46952: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcb2b230> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcb293a0> <<< 44071 1727204588.46977: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 44071 1727204588.47006: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204588.47040: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 44071 1727204588.47051: stdout chunk (state=3): >>>import '_string' # <<< 44071 1727204588.47085: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcb329c0> <<< 44071 1727204588.47218: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcb2b350> <<< 44071 1727204588.47295: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcb33830> <<< 44071 1727204588.47319: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcb336b0> <<< 44071 1727204588.47391: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcb33b30><<< 44071 1727204588.47395: stdout chunk (state=3): >>> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcb23440> <<< 44071 1727204588.47441: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 44071 1727204588.47445: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 44071 1727204588.47486: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204588.47503: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcb37380> <<< 44071 1727204588.47677: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcb38770> <<< 44071 1727204588.47692: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcb35b20> <<< 44071 1727204588.47726: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcb36ed0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcb357f0> # zipimport: zlib available <<< 44071 1727204588.47755: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 44071 1727204588.47770: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.47851: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.47958: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.47992: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 44071 1727204588.48006: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 44071 1727204588.48137: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.48271: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.48860: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.49492: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 
'ansible.module_utils.six.moves' # <<< 44071 1727204588.49569: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204588.49574: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204588.49577: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bc9c07d0> <<< 44071 1727204588.49678: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bc9c1550> <<< 44071 1727204588.49682: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcb3bdd0> <<< 44071 1727204588.49780: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 44071 1727204588.49877: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.49880: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.49883: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 44071 1727204588.49937: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.50138: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bc9c1250> <<< 44071 1727204588.50142: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.50643: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.51147: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.51227: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.51310: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 44071 1727204588.51337: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.51360: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.51399: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 44071 1727204588.51413: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.51486: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.51598: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 44071 1727204588.51603: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.51628: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 44071 1727204588.51638: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.51672: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.51719: stdout chunk (state=3): >>>import 
'ansible.module_utils.parsing.convert_bool' # <<< 44071 1727204588.51729: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.51994: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.52347: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 44071 1727204588.52391: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bc9c3e60> <<< 44071 1727204588.52404: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.52476: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.52568: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 44071 1727204588.52594: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 44071 1727204588.52613: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 44071 1727204588.52873: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204588.52901: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bc9ca0c0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bc9ca990> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcdcf410> # zipimport: zlib available <<< 44071 1727204588.53026: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 44071 1727204588.53045: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.53141: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.53159: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.53281: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204588.53487: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bc9c9760> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bc9caae0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 
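The "# zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_zgl39j0q/ansible_ansible.legacy.setup_payload.zip'" record earlier, followed by the interleaved "# zipimport: zlib available" and "import 'ansible.module_utils...'" messages here, show the setup module's payload being imported straight out of a zip archive on the managed node. As a minimal sketch of that stdlib mechanism only (a toy package and made-up paths, not Ansible's real wrapper or payload layout): a zip archive placed on sys.path is served by CPython's zipimporter, so packages inside it can be imported without being unpacked as individual files.

import os
import sys
import tempfile
import zipfile

# Build a tiny package inside a throwaway zip archive.
payload = os.path.join(tempfile.mkdtemp(prefix="zip_payload_demo_"), "payload.zip")
with zipfile.ZipFile(payload, "w") as zf:
    zf.writestr("demo_utils/__init__.py", "")
    zf.writestr("demo_utils/text.py", "def shout(s):\n    return s.upper()\n")

# Putting the archive itself on sys.path activates zipimport for it.
sys.path.insert(0, payload)

from demo_utils.text import shout
print(shout("imported straight from the zip"))

Running this prints the module's result and, under python -v, produces "# zipimport:" messages of the same shape as the "found ... names" line in the trace above.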
44071 1727204588.53536: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.53621: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44071 1727204588.53672: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204588.53690: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 44071 1727204588.53710: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 44071 1727204588.53972: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 44071 1727204588.53999: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bca62d50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bc9d4b60> <<< 44071 1727204588.54028: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bc9d2c90> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bc9d29f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 44071 1727204588.54041: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.54069: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.54100: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 44071 1727204588.54162: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 44071 1727204588.54181: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.54209: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 44071 1727204588.54275: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.54339: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.54376: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.54437: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44071 1727204588.54478: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.54519: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.54664: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 44071 1727204588.54670: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44071 1727204588.54726: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.54755: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.54790: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 44071 1727204588.54871: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.54999: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.55196: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.55233: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.55302: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204588.55507: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bca65af0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 44071 1727204588.55511: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 44071 1727204588.55551: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 44071 1727204588.55554: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbf50290> <<< 44071 1727204588.55595: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204588.55610: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bbf505c0> <<< 44071 1727204588.55648: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bca452b0> <<< 44071 1727204588.55676: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bca441a0> <<< 44071 1727204588.55718: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bca641d0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bca64ce0> <<< 44071 1727204588.55837: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 44071 1727204588.55851: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 44071 1727204588.55884: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bbf535f0> <<< 44071 1727204588.55901: stdout chunk (state=3): >>>import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbf52ea0> <<< 44071 1727204588.55934: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bbf53080> <<< 44071 1727204588.55954: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbf522d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 44071 1727204588.56053: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 44071 1727204588.56078: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbf53680> <<< 44071 1727204588.56095: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 44071 1727204588.56151: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bbfbe1b0> <<< 44071 1727204588.56207: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbfbc1d0> <<< 44071 1727204588.56243: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bca65280> import 'ansible.module_utils.facts.timeout' # <<< 44071 1727204588.56370: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available <<< 44071 1727204588.56410: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 44071 1727204588.56435: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.56486: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.56541: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 44071 1727204588.56581: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 44071 
1727204588.56700: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 44071 1727204588.56718: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.56779: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 44071 1727204588.56797: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.56895: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.56914: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 44071 1727204588.56953: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.57023: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.57086: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.57155: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 44071 1727204588.57171: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.57725: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.58232: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 44071 1727204588.58298: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.58353: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.58443: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.58447: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 44071 1727204588.58550: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.58554: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 44071 1727204588.58581: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.58644: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 44071 1727204588.58902: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.58906: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available <<< 44071 1727204588.58984: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 44071 1727204588.59006: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 44071 1727204588.59033: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbfbe450> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 44071 1727204588.59061: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 44071 1727204588.59194: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbfbf0b0> import 'ansible.module_utils.facts.system.local' # <<< 44071 1727204588.59214: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 44071 1727204588.59285: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.59357: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 44071 1727204588.59377: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.59464: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.59562: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 44071 1727204588.59587: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.59646: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.59727: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 44071 1727204588.59738: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.59805: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.59827: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 44071 1727204588.59881: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 44071 1727204588.59953: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204588.60028: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bbfea5a0> <<< 44071 1727204588.60234: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbfd6c60> import 'ansible.module_utils.facts.system.python' # <<< 44071 1727204588.60246: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.60304: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.60354: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 44071 1727204588.60484: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.60502: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.60557: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.60681: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.60995: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available <<< 44071 1727204588.61041: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 44071 1727204588.61059: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 44071 1727204588.61127: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204588.61131: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bbe05ee0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbfea390> 
import 'ansible.module_utils.facts.system.user' # <<< 44071 1727204588.61133: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.61183: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 44071 1727204588.61216: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.61262: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 44071 1727204588.61273: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.61438: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.61608: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 44071 1727204588.61622: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.61726: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.61835: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.61884: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.61924: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 44071 1727204588.61945: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 44071 1727204588.62073: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.62088: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.62155: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.62313: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 44071 1727204588.62331: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.62463: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.62600: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 44071 1727204588.62620: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.62687: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44071 1727204588.63329: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.63909: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 44071 1727204588.64096: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.64099: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.64144: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 44071 1727204588.64161: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.64388: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 44071 1727204588.64548: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.64721: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 44071 1727204588.64756: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 44071 1727204588.64773: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.64812: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.64861: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.base' # <<< 44071 1727204588.64883: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.64984: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.65089: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.65317: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.65538: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 44071 1727204588.65561: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 44071 1727204588.65600: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.65635: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 44071 1727204588.65654: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.65678: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.65713: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 44071 1727204588.65728: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.65801: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.65886: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 44071 1727204588.65897: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.65909: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.65949: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 44071 1727204588.65952: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.66013: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.66079: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 44071 1727204588.66093: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.66156: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.66223: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 44071 1727204588.66226: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.66518: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.66825: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 44071 1727204588.66828: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.66895: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.66981: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 44071 1727204588.67032: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.67054: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # <<< 44071 1727204588.67077: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.67119: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.67153: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 44071 1727204588.67230: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 44071 1727204588.67315: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.67596: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 44071 1727204588.67647: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.67704: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.67776: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.67851: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 44071 1727204588.67876: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 44071 1727204588.67906: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.68017: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.68045: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 44071 1727204588.68225: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.68446: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 44071 1727204588.68473: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.68601: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 44071 1727204588.68614: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.68664: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 44071 1727204588.68685: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.68763: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.68851: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 44071 1727204588.68873: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 44071 1727204588.68888: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.68974: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.69076: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 44071 1727204588.69198: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204588.69362: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 44071 1727204588.69382: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 44071 1727204588.69427: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 44071 1727204588.69458: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bbe2eae0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbe2d250> <<< 44071 1727204588.69572: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbe2cb60> <<< 44071 1727204588.81449: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 44071 1727204588.81472: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 44071 1727204588.81483: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbe74ef0> <<< 44071 1727204588.81510: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 44071 1727204588.81519: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 44071 1727204588.81551: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbe74ec0> <<< 44071 1727204588.81607: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 44071 1727204588.81609: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204588.81639: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 44071 1727204588.81642: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbe76480> <<< 44071 1727204588.81688: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbe75f40> <<< 44071 1727204588.81940: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 44071 1727204589.06258: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", 
"ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3050, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 666, "free": 3050}, "nocache": {"free": 3497, "used": 219}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 935, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [<<< 44071 1727204589.06290: stdout chunk (state=3): >>>{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251310804992, "block_size": 4096, "block_total": 64479564, "block_available": 
61355177, "block_used": 3124387, "inode_total": 16384000, "inode_available": 16301246, "inode_used": 82754, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", 
"tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]",<<< 44071 1727204589.06301: stdout chunk (state=3): >>> "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", 
"ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_is_chroot": false, "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "09", "epoch": "1727204589", "epoch_int": "1727204589", "date": "2024-09-24", "time": "15:03:09", "iso8601_micro": "2024-09-24T19:03:09.058482Z", "iso8601": "2024-09-24T19:03:09Z", "iso8601_basic": "20240924T150309058482", "iso8601_basic_short": "20240924T150309", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.958984375, "5m": 0.66162109375, "15m": 0.4033203125}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 44071 1727204589.06901: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 44071 1727204589.06916: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path <<< 44071 1727204589.06931: stdout chunk (state=3): >>># restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # 
cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal <<< 44071 1727204589.06941: stdout chunk (state=3): >>># cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath <<< 44071 1727204589.06978: stdout chunk (state=3): >>># cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler <<< 44071 1727204589.06991: stdout chunk (state=3): >>># cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib <<< 44071 1727204589.06997: stdout chunk (state=3): >>># cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math <<< 44071 1727204589.07028: stdout chunk (state=3): >>># cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # 
cleanup[2] removing ansible <<< 44071 1727204589.07048: stdout chunk (state=3): >>># destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale <<< 44071 1727204589.07056: stdout chunk (state=3): >>># cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors <<< 44071 1727204589.07070: stdout chunk (state=3): >>># cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback <<< 44071 1727204589.07091: stdout chunk (state=3): >>># cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes <<< 44071 1727204589.07125: stdout chunk (state=3): >>># destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings <<< 44071 1727204589.07130: stdout chunk (state=3): >>># destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters <<< 44071 1727204589.07144: stdout chunk (state=3): >>># destroy ansible.module_utils.common.text.formatters # cleanup[2] 
removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale <<< 44071 1727204589.07164: stdout chunk (state=3): >>># cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro <<< 44071 1727204589.07169: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction <<< 44071 1727204589.07188: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing <<< 44071 1727204589.07198: stdout chunk (state=3): >>># cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other<<< 44071 1727204589.07250: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips <<< 44071 1727204589.07266: stdout chunk (state=3): 
>>># cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux <<< 44071 1727204589.07270: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos <<< 44071 1727204589.07273: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl<<< 44071 1727204589.07293: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg <<< 44071 1727204589.07310: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly <<< 44071 1727204589.07320: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd <<< 44071 1727204589.07335: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly <<< 44071 1727204589.07349: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy 
ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 44071 1727204589.07719: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 44071 1727204589.07724: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 44071 1727204589.07747: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 44071 1727204589.07775: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma<<< 44071 1727204589.07780: stdout chunk (state=3): >>> # destroy zipfile._path <<< 44071 1727204589.07797: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 44071 1727204589.07838: stdout chunk (state=3): >>># destroy ntpath <<< 44071 1727204589.07843: stdout chunk (state=3): >>># destroy importlib # destroy zipimport <<< 44071 1727204589.07869: stdout chunk (state=3): >>># destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 44071 1727204589.07882: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 44071 1727204589.07906: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings # destroy _locale <<< 44071 1727204589.07917: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal <<< 44071 1727204589.07936: stdout chunk (state=3): >>># destroy _posixsubprocess # destroy syslog # destroy uuid<<< 44071 1727204589.07943: stdout chunk (state=3): >>> <<< 44071 1727204589.07984: stdout chunk (state=3): >>># destroy _hashlib <<< 44071 1727204589.07990: stdout chunk (state=3): >>># destroy _blake2 # destroy selinux # destroy shutil <<< 44071 1727204589.08012: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 44071 1727204589.08064: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector<<< 44071 1727204589.08085: stdout chunk (state=3): >>> # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle <<< 44071 1727204589.08089: stdout chunk (state=3): >>># destroy _compat_pickle <<< 44071 1727204589.08105: stdout chunk (state=3): >>># destroy _pickle <<< 44071 1727204589.08114: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue <<< 44071 1727204589.08131: stdout chunk (state=3): >>># destroy multiprocessing.reduction # destroy selectors <<< 44071 1727204589.08142: stdout chunk (state=3): >>># destroy shlex # destroy fcntl <<< 44071 1727204589.08162: stdout chunk (state=3): >>># destroy datetime <<< 44071 1727204589.08169: stdout chunk (state=3): >>># destroy subprocess # destroy base64 <<< 44071 1727204589.08191: stdout chunk (state=3): >>># destroy _ssl <<< 44071 1727204589.08222: stdout chunk 
(state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd <<< 44071 1727204589.08225: stdout chunk (state=3): >>># destroy termios <<< 44071 1727204589.08242: stdout chunk (state=3): >>># destroy json <<< 44071 1727204589.08269: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 44071 1727204589.08291: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing <<< 44071 1727204589.08294: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection <<< 44071 1727204589.08301: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing <<< 44071 1727204589.08307: stdout chunk (state=3): >>># destroy array # destroy multiprocessing.dummy.connection <<< 44071 1727204589.08368: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna <<< 44071 1727204589.08382: stdout chunk (state=3): >>># destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 44071 1727204589.08397: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime <<< 44071 1727204589.08408: stdout chunk (state=3): >>># cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 44071 1727204589.08420: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 44071 1727204589.08446: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 44071 1727204589.08472: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings <<< 44071 1727204589.08480: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 44071 1727204589.08509: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser<<< 44071 1727204589.08516: stdout chunk (state=3): >>> # cleanup[3] wiping _sre # cleanup[3] wiping functools<<< 44071 1727204589.08539: stdout chunk (state=3): >>> <<< 44071 1727204589.08544: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator <<< 44071 1727204589.08547: stdout chunk (state=3): >>># cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 44071 1727204589.08585: stdout chunk (state=3): >>># destroy posixpath # 
cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io <<< 44071 1727204589.08601: stdout chunk (state=3): >>># destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 44071 1727204589.08606: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib <<< 44071 1727204589.08631: stdout chunk (state=3): >>># cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 44071 1727204589.08638: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 44071 1727204589.08812: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 44071 1727204589.08835: stdout chunk (state=3): >>># destroy _collections <<< 44071 1727204589.08861: stdout chunk (state=3): >>># destroy platform <<< 44071 1727204589.08874: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 44071 1727204589.08882: stdout chunk (state=3): >>># destroy tokenize <<< 44071 1727204589.08910: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 44071 1727204589.08949: stdout chunk (state=3): >>># destroy _typing <<< 44071 1727204589.08952: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser <<< 44071 1727204589.08963: stdout chunk (state=3): >>># destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 44071 1727204589.08978: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 44071 1727204589.09007: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 44071 1727204589.09108: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig<<< 44071 1727204589.09112: stdout chunk (state=3): >>> # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 44071 1727204589.09129: stdout chunk (state=3): >>># destroy time <<< 44071 1727204589.09148: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 44071 1727204589.09183: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re <<< 44071 1727204589.09209: stdout chunk (state=3): >>># destroy itertools <<< 44071 1727204589.09213: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 44071 1727204589.09238: stdout chunk (state=3): >>># clear sys.audit hooks <<< 44071 1727204589.09678: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204589.09741: stderr chunk (state=3): >>><<< 44071 1727204589.09745: stdout chunk (state=3): >>><<< 44071 1727204589.09854: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bd118530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bd0e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bd11aab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcf0d190> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcf0e090> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
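The "import ..." and "# code object from ..." records filling this result are CPython's verbose import tracing; the facts payload above lists PYTHONVERBOSE set to "1" under ansible_env, which is what switches that tracing on for the remote interpreter. A minimal local reproduction, assuming any python3 interpreter is available (the "import json" target is arbitrary):

import os
import subprocess
import sys

# Run a trivial import with verbose import tracing enabled, the same
# setting (PYTHONVERBOSE=1) visible under ansible_env in the facts above.
env = dict(os.environ, PYTHONVERBOSE="1")
subprocess.run([sys.executable, "-c", "import json"], env=env, check=True)

The equivalent command-line form is python3 -v -c 'import json'.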
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcf4be60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcf4bf20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcf83830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcf83ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcf63b30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcf61250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcf49010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcfa7830> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcfa6450> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcf62120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcfa4bf0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcfd88c0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcf48290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcfd8d70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcfd8c20> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcfd8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcf46db0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcfd96a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcfd9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcfda5a0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcff47d0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcff5f10> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7ff0bcff6db0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcff73e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcff6300> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcff7e60> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcff7590> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcfda600> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcd2bdd0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcd58830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcd58590> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcd587d0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcd589e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcd29f70> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcd5a0f0> 
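The recurring pattern "<module>.cpython-312.pyc matches <module>.py # code object from ..." in this trace is the import system checking its cached bytecode under __pycache__ before falling back to the source file. A small sketch of the path mapping it uses (illustrative; the "cpython-312" tag depends on the interpreter that runs it):

    # Minimal sketch: show the __pycache__ bytecode path the import system would
    # look for when importing a given source file.
    import importlib.util

    src = "/usr/lib64/python3.12/base64.py"
    print(importlib.util.cache_from_source(src))
    # On a CPython 3.12 interpreter this prints
    # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc,
    # matching the "matches" lines in the trace above.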
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcd58d70> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcfdacf0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcd7e480> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcd9a5d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcdcf380> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcdf9b20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcdcf4a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcd9b260> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcbd4500> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcd99610> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcd5b020> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7ff0bcd99730> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_zgl39j0q/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcc3e180> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcc15070> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcc14200> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcc17590> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcc71be0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcc71970> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcc71280> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcc71ca0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcc3ee10> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcc72960> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcc72ba0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcc730b0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcad4e90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcad6ab0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcad73e0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcad85c0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcadb020> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcadb140> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcad92e0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcadef60> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcadda30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcadd790> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcadfd70> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcad97f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcb23020> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcb23290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcb28d40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcb28b00> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcb2b230> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcb293a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcb329c0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcb2b350> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcb33830> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcb336b0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcb33b30> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcb23440> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcb37380> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcb38770> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcb35b20> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bcb36ed0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcb357f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bc9c07d0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bc9c1550> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcb3bdd0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
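The "ansible.module_utils.*" imports in this trace resolve from the zipped module payload that zipimport reported a few lines earlier ("zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_zgl39j0q/ansible_ansible.legacy.setup_payload.zip'"). A minimal sketch of listing that bundle with the standard zipfile module; the path is the temporary one from this particular run and will not exist afterwards, so treat it as a placeholder:

    # Minimal sketch (placeholder path taken from the zipimport line above; the
    # payload is removed once the task finishes): list the bundled module_utils
    # files that the remote interpreter imports during this trace.
    import zipfile

    payload = "/tmp/ansible_ansible.legacy.setup_payload_zgl39j0q/ansible_ansible.legacy.setup_payload.zip"
    with zipfile.ZipFile(payload) as zf:
        for name in zf.namelist():
            if name.startswith("ansible/module_utils/"):
                print(name)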
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bc9c1250> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bc9c3e60> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bc9ca0c0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bc9ca990> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bcdcf410> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bc9c9760> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bc9caae0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bca62d50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bc9d4b60> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bc9d2c90> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bc9d29f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bca65af0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbf50290> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bbf505c0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bca452b0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bca441a0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bca641d0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bca64ce0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bbf535f0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbf52ea0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bbf53080> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbf522d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbf53680> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bbfbe1b0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbfbc1d0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bca65280> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbfbe450> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbfbf0b0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bbfea5a0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbfd6c60> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bbe05ee0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbfea390> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff0bbe2eae0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbe2d250> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbe2cb60> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbe74ef0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbe74ec0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbe76480> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff0bbe75f40> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3050, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 666, "free": 3050}, "nocache": {"free": 3497, "used": 219}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", 
"ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 935, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251310804992, "block_size": 4096, "block_total": 64479564, "block_available": 61355177, "block_used": 3124387, "inode_total": 16384000, "inode_available": 16301246, "inode_used": 82754, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off 
[fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", 
"tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_is_chroot": false, "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "09", "epoch": "1727204589", "epoch_int": "1727204589", "date": "2024-09-24", "time": "15:03:09", "iso8601_micro": "2024-09-24T19:03:09.058482Z", "iso8601": "2024-09-24T19:03:09Z", "iso8601_basic": "20240924T150309058482", "iso8601_basic_short": "20240924T150309", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": 
"https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.958984375, "5m": 0.66162109375, "15m": 0.4033203125}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] 
removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy 
ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] 
removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy 
ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc 
# cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache ... # clear sys.audit hooks [WARNING]: Platform linux on host managed-node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
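Both warnings above point at environment details rather than at the role under test. The extra output after the module's JSON payload is the Python interpreter's verbose-mode shutdown chatter (note PYTHONVERBOSE=1 in ansible_env in the gathered facts), and the second warning is ansible-core's standard note that the interpreter on managed-node2 was auto-discovered. Below is a minimal sketch of how a run like this could pin the interpreter and avoid the discovery warning, assuming a YAML inventory of the usual layout; the host name, address, and interpreter path are taken from the log above, and the snippet is illustrative, not the inventory actually used for this run:

all:
  hosts:
    managed-node2:
      ansible_host: 10.31.47.73
      # Pin the interpreter explicitly so a future Python install on the node
      # cannot change which interpreter ansible-core selects for this host.
      ansible_python_interpreter: /usr/bin/python3.12

The same effect is available controller-wide via interpreter_python under [defaults] in ansible.cfg (or the ANSIBLE_PYTHON_INTERPRETER environment variable); setting it to auto_silent keeps auto-discovery but suppresses the warning. Clearing PYTHONVERBOSE in the managed node's environment would likewise remove the "junk after the JSON data" chatter; neither change was made in the run captured here.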
44071 1727204589.10758: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204587.754917-44077-186011612748606/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204589.10762: _low_level_execute_command(): starting 44071 1727204589.10768: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204587.754917-44077-186011612748606/ > /dev/null 2>&1 && sleep 0' 44071 1727204589.10994: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204589.10998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204589.11001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204589.11049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204589.11053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204589.11064: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204589.11138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204589.13087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204589.13150: stderr chunk (state=3): >>><<< 44071 1727204589.13154: stdout chunk (state=3): >>><<< 44071 1727204589.13168: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204589.13177: handler run complete 44071 1727204589.13275: variable 'ansible_facts' from source: unknown 44071 1727204589.13352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204589.13587: variable 'ansible_facts' from source: unknown 44071 1727204589.13649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204589.13742: attempt loop complete, returning result 44071 1727204589.13746: _execute() done 44071 1727204589.13749: dumping result to json 44071 1727204589.13770: done dumping result, returning 44071 1727204589.13782: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [127b8e07-fff9-c964-7471-00000000001b] 44071 1727204589.13785: sending task result for task 127b8e07-fff9-c964-7471-00000000001b 44071 1727204589.14076: done sending task result for task 127b8e07-fff9-c964-7471-00000000001b 44071 1727204589.14079: WORKER PROCESS EXITING ok: [managed-node2] 44071 1727204589.14363: no more pending results, returning what we have 44071 1727204589.14366: results queue empty 44071 1727204589.14367: checking for any_errors_fatal 44071 1727204589.14368: done checking for any_errors_fatal 44071 1727204589.14369: checking for max_fail_percentage 44071 1727204589.14370: done checking for max_fail_percentage 44071 1727204589.14370: checking to see if all hosts have failed and the running result is not ok 44071 1727204589.14371: done checking to see if all hosts have failed 44071 1727204589.14372: getting the remaining hosts for this loop 44071 1727204589.14373: done getting the remaining hosts for this loop 44071 1727204589.14376: getting the next task for host managed-node2 44071 1727204589.14380: done getting next task for host managed-node2 44071 1727204589.14382: ^ task is: TASK: meta (flush_handlers) 44071 1727204589.14383: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204589.14386: getting variables 44071 1727204589.14387: in VariableManager get_vars() 44071 1727204589.14405: Calling all_inventory to load vars for managed-node2 44071 1727204589.14408: Calling groups_inventory to load vars for managed-node2 44071 1727204589.14411: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204589.14420: Calling all_plugins_play to load vars for managed-node2 44071 1727204589.14422: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204589.14424: Calling groups_plugins_play to load vars for managed-node2 44071 1727204589.14559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204589.14695: done with get_vars() 44071 1727204589.14704: done getting variables 44071 1727204589.14752: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ 44071 1727204589.14797: in VariableManager get_vars() 44071 1727204589.14804: Calling all_inventory to load vars for managed-node2 44071 1727204589.14806: Calling groups_inventory to load vars for managed-node2 44071 1727204589.14807: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204589.14811: Calling all_plugins_play to load vars for managed-node2 44071 1727204589.14812: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204589.14814: Calling groups_plugins_play to load vars for managed-node2 44071 1727204589.14925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204589.15051: done with get_vars() 44071 1727204589.15063: done queuing things up, now waiting for results queue to drain 44071 1727204589.15067: results queue empty 44071 1727204589.15068: checking for any_errors_fatal 44071 1727204589.15070: done checking for any_errors_fatal 44071 1727204589.15071: checking for max_fail_percentage 44071 1727204589.15076: done checking for max_fail_percentage 44071 1727204589.15076: checking to see if all hosts have failed and the running result is not ok 44071 1727204589.15077: done checking to see if all hosts have failed 44071 1727204589.15077: getting the remaining hosts for this loop 44071 1727204589.15078: done getting the remaining hosts for this loop 44071 1727204589.15080: getting the next task for host managed-node2 44071 1727204589.15084: done getting next task for host managed-node2 44071 1727204589.15089: ^ task is: TASK: Include the task 'el_repo_setup.yml' 44071 1727204589.15090: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204589.15092: getting variables 44071 1727204589.15092: in VariableManager get_vars() 44071 1727204589.15099: Calling all_inventory to load vars for managed-node2 44071 1727204589.15100: Calling groups_inventory to load vars for managed-node2 44071 1727204589.15102: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204589.15105: Calling all_plugins_play to load vars for managed-node2 44071 1727204589.15107: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204589.15109: Calling groups_plugins_play to load vars for managed-node2 44071 1727204589.15218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204589.15343: done with get_vars() 44071 1727204589.15349: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:11 Tuesday 24 September 2024 15:03:09 -0400 (0:00:01.441) 0:00:01.470 ***** 44071 1727204589.15412: entering _queue_task() for managed-node2/include_tasks 44071 1727204589.15414: Creating lock for include_tasks 44071 1727204589.15683: worker is 1 (out of 1 available) 44071 1727204589.15697: exiting _queue_task() for managed-node2/include_tasks 44071 1727204589.15709: done queuing things up, now waiting for results queue to drain 44071 1727204589.15711: waiting for pending results... 44071 1727204589.15857: running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' 44071 1727204589.15926: in run() - task 127b8e07-fff9-c964-7471-000000000006 44071 1727204589.15943: variable 'ansible_search_path' from source: unknown 44071 1727204589.15974: calling self._execute() 44071 1727204589.16035: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204589.16045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204589.16052: variable 'omit' from source: magic vars 44071 1727204589.16137: _execute() done 44071 1727204589.16143: dumping result to json 44071 1727204589.16146: done dumping result, returning 44071 1727204589.16155: done running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' [127b8e07-fff9-c964-7471-000000000006] 44071 1727204589.16158: sending task result for task 127b8e07-fff9-c964-7471-000000000006 44071 1727204589.16257: done sending task result for task 127b8e07-fff9-c964-7471-000000000006 44071 1727204589.16259: WORKER PROCESS EXITING 44071 1727204589.16311: no more pending results, returning what we have 44071 1727204589.16317: in VariableManager get_vars() 44071 1727204589.16350: Calling all_inventory to load vars for managed-node2 44071 1727204589.16353: Calling groups_inventory to load vars for managed-node2 44071 1727204589.16357: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204589.16371: Calling all_plugins_play to load vars for managed-node2 44071 1727204589.16374: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204589.16378: Calling groups_plugins_play to load vars for managed-node2 44071 1727204589.16528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204589.16676: done with get_vars() 44071 1727204589.16685: variable 'ansible_search_path' from source: unknown 44071 1727204589.16696: we have included files to process 44071 1727204589.16697: generating 
all_blocks data 44071 1727204589.16698: done generating all_blocks data 44071 1727204589.16699: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 44071 1727204589.16700: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 44071 1727204589.16702: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 44071 1727204589.17187: in VariableManager get_vars() 44071 1727204589.17199: done with get_vars() 44071 1727204589.17207: done processing included file 44071 1727204589.17209: iterating over new_blocks loaded from include file 44071 1727204589.17210: in VariableManager get_vars() 44071 1727204589.17216: done with get_vars() 44071 1727204589.17217: filtering new block on tags 44071 1727204589.17230: done filtering new block on tags 44071 1727204589.17232: in VariableManager get_vars() 44071 1727204589.17240: done with get_vars() 44071 1727204589.17241: filtering new block on tags 44071 1727204589.17251: done filtering new block on tags 44071 1727204589.17252: in VariableManager get_vars() 44071 1727204589.17259: done with get_vars() 44071 1727204589.17260: filtering new block on tags 44071 1727204589.17270: done filtering new block on tags 44071 1727204589.17272: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node2 44071 1727204589.17277: extending task lists for all hosts with included blocks 44071 1727204589.17309: done extending task lists 44071 1727204589.17309: done processing included files 44071 1727204589.17310: results queue empty 44071 1727204589.17310: checking for any_errors_fatal 44071 1727204589.17311: done checking for any_errors_fatal 44071 1727204589.17312: checking for max_fail_percentage 44071 1727204589.17312: done checking for max_fail_percentage 44071 1727204589.17313: checking to see if all hosts have failed and the running result is not ok 44071 1727204589.17313: done checking to see if all hosts have failed 44071 1727204589.17314: getting the remaining hosts for this loop 44071 1727204589.17315: done getting the remaining hosts for this loop 44071 1727204589.17316: getting the next task for host managed-node2 44071 1727204589.17319: done getting next task for host managed-node2 44071 1727204589.17320: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 44071 1727204589.17322: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
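The include resolved above pulls tasks/el_repo_setup.yml from the collection's tests/network directory into the play as three new blocks and extends managed-node2's task list accordingly. A minimal sketch of the calling task at tests_states_nm.yml:11, assuming only what the log shows (the task name and the included file), might look like:

# Sketch only; the task name and target file are taken from the log above,
# everything else about the real playbook may differ.
- name: Include the task 'el_repo_setup.yml'
  include_tasks: tasks/el_repo_setup.yml
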
False 44071 1727204589.17323: getting variables 44071 1727204589.17324: in VariableManager get_vars() 44071 1727204589.17330: Calling all_inventory to load vars for managed-node2 44071 1727204589.17332: Calling groups_inventory to load vars for managed-node2 44071 1727204589.17334: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204589.17339: Calling all_plugins_play to load vars for managed-node2 44071 1727204589.17341: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204589.17343: Calling groups_plugins_play to load vars for managed-node2 44071 1727204589.17457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204589.17586: done with get_vars() 44071 1727204589.17594: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 15:03:09 -0400 (0:00:00.022) 0:00:01.492 ***** 44071 1727204589.17644: entering _queue_task() for managed-node2/setup 44071 1727204589.17975: worker is 1 (out of 1 available) 44071 1727204589.17989: exiting _queue_task() for managed-node2/setup 44071 1727204589.18004: done queuing things up, now waiting for results queue to drain 44071 1727204589.18005: waiting for pending results... 44071 1727204589.18387: running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 44071 1727204589.18393: in run() - task 127b8e07-fff9-c964-7471-00000000002c 44071 1727204589.18397: variable 'ansible_search_path' from source: unknown 44071 1727204589.18402: variable 'ansible_search_path' from source: unknown 44071 1727204589.18451: calling self._execute() 44071 1727204589.18548: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204589.18567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204589.18584: variable 'omit' from source: magic vars 44071 1727204589.19204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204589.21609: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204589.21699: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204589.21968: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204589.21988: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204589.22024: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204589.22126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204589.22167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204589.22208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 44071 1727204589.22259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204589.22285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204589.22487: variable 'ansible_facts' from source: unknown 44071 1727204589.22574: variable 'network_test_required_facts' from source: task vars 44071 1727204589.22626: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 44071 1727204589.22643: variable 'omit' from source: magic vars 44071 1727204589.22695: variable 'omit' from source: magic vars 44071 1727204589.22744: variable 'omit' from source: magic vars 44071 1727204589.22783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204589.22820: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204589.22848: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204589.22877: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204589.22965: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204589.22970: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204589.22973: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204589.22975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204589.23050: Set connection var ansible_connection to ssh 44071 1727204589.23064: Set connection var ansible_timeout to 10 44071 1727204589.23083: Set connection var ansible_pipelining to False 44071 1727204589.23094: Set connection var ansible_shell_type to sh 44071 1727204589.23104: Set connection var ansible_shell_executable to /bin/sh 44071 1727204589.23111: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204589.23131: variable 'ansible_shell_executable' from source: unknown 44071 1727204589.23134: variable 'ansible_connection' from source: unknown 44071 1727204589.23139: variable 'ansible_module_compression' from source: unknown 44071 1727204589.23142: variable 'ansible_shell_type' from source: unknown 44071 1727204589.23145: variable 'ansible_shell_executable' from source: unknown 44071 1727204589.23147: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204589.23152: variable 'ansible_pipelining' from source: unknown 44071 1727204589.23155: variable 'ansible_timeout' from source: unknown 44071 1727204589.23159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204589.23284: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204589.23292: variable 'omit' from source: magic vars 44071 1727204589.23303: starting attempt loop 44071 
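The conditional evaluated above, not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts, makes the minimal fact gathering run only when the facts already collected do not cover everything listed in network_test_required_facts; here it evaluated to True, so the setup module will be executed. A hedged reconstruction of the task at el_repo_setup.yml:3 follows (the name and the when expression are quoted from the log; the gather_subset value and the definition of network_test_required_facts are assumptions):

# Reconstruction for illustration; gather_subset and the variable contents
# are assumed, not read from the actual test file.
- name: Gather the minimum subset of ansible_facts required by the network role test
  setup:
    gather_subset:
      - min
  when: not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts
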
1727204589.23306: running the handler 44071 1727204589.23315: _low_level_execute_command(): starting 44071 1727204589.23322: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204589.23873: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204589.23878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44071 1727204589.23881: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204589.23884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204589.23934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204589.23937: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204589.23940: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204589.24027: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204589.25819: stdout chunk (state=3): >>>/root <<< 44071 1727204589.25924: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204589.25988: stderr chunk (state=3): >>><<< 44071 1727204589.25992: stdout chunk (state=3): >>><<< 44071 1727204589.26011: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204589.26024: _low_level_execute_command(): starting 44071 1727204589.26034: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204589.260112-44146-236038060253292 `" && echo ansible-tmp-1727204589.260112-44146-236038060253292="` echo /root/.ansible/tmp/ansible-tmp-1727204589.260112-44146-236038060253292 `" ) && sleep 0' 44071 1727204589.26537: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204589.26541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204589.26544: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204589.26546: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204589.26548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204589.26611: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204589.26614: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204589.26616: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204589.26684: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204589.28739: stdout chunk (state=3): >>>ansible-tmp-1727204589.260112-44146-236038060253292=/root/.ansible/tmp/ansible-tmp-1727204589.260112-44146-236038060253292 <<< 44071 1727204589.28846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204589.28913: stderr chunk (state=3): >>><<< 44071 1727204589.28916: stdout chunk (state=3): >>><<< 44071 1727204589.28936: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204589.260112-44146-236038060253292=/root/.ansible/tmp/ansible-tmp-1727204589.260112-44146-236038060253292 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
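The command just issued creates the per-task working directory under /root/.ansible/tmp with a restrictive umask (077), then echoes the resulting ansible-tmp-... path back so the controller can reference it in the steps that follow. The base path comes from the shell plugin's remote_tmp option, which defaults to ~/.ansible/tmp; if it ever needed to be relocated, a host variable such as the following (hypothetical, not used in this run) would do it:

# Hypothetical override of the remote temp directory; this run uses the
# default ~/.ansible/tmp as seen in the log.
ansible_remote_tmp: /var/tmp/.ansible/tmp
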
id: 2 debug2: Received exit status from master 0 44071 1727204589.28984: variable 'ansible_module_compression' from source: unknown 44071 1727204589.29023: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 44071 1727204589.29076: variable 'ansible_facts' from source: unknown 44071 1727204589.29211: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204589.260112-44146-236038060253292/AnsiballZ_setup.py 44071 1727204589.29330: Sending initial data 44071 1727204589.29334: Sent initial data (153 bytes) 44071 1727204589.29838: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204589.29842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204589.29846: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204589.29848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204589.29850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204589.29899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204589.29903: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204589.29909: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204589.29987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204589.31654: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204589.31719: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
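The connection variables recorded a little earlier (ansible_connection=ssh, a 10-second timeout, pipelining disabled, the sh shell plugin with /bin/sh, ZIP_DEFLATED module compression) explain the shape of this exchange: with pipelining off, Ansible has to create the remote temp directory and push AnsiballZ_setup.py over SFTP through the existing ControlMaster session instead of streaming the module over stdin. A host_vars sketch that would produce the same settings is below; the values are copied from the log, the grouping into a host_vars file is an assumption, and switching ansible_pipelining to true would remove most of these extra round trips.

# Values mirror what the log reports for managed-node2; placing them in a
# host_vars file is an assumption, not something the log shows.
ansible_connection: ssh
ansible_timeout: 10
ansible_pipelining: false
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_module_compression: ZIP_DEFLATED
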
<<< 44071 1727204589.31795: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpbr0nqw49 /root/.ansible/tmp/ansible-tmp-1727204589.260112-44146-236038060253292/AnsiballZ_setup.py <<< 44071 1727204589.31798: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204589.260112-44146-236038060253292/AnsiballZ_setup.py" <<< 44071 1727204589.31863: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpbr0nqw49" to remote "/root/.ansible/tmp/ansible-tmp-1727204589.260112-44146-236038060253292/AnsiballZ_setup.py" <<< 44071 1727204589.31871: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204589.260112-44146-236038060253292/AnsiballZ_setup.py" <<< 44071 1727204589.33096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204589.33174: stderr chunk (state=3): >>><<< 44071 1727204589.33178: stdout chunk (state=3): >>><<< 44071 1727204589.33201: done transferring module to remote 44071 1727204589.33218: _low_level_execute_command(): starting 44071 1727204589.33225: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204589.260112-44146-236038060253292/ /root/.ansible/tmp/ansible-tmp-1727204589.260112-44146-236038060253292/AnsiballZ_setup.py && sleep 0' 44071 1727204589.33717: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204589.33721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204589.33726: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204589.33728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204589.33789: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204589.33792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204589.33796: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204589.33870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204589.35795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204589.35843: stderr chunk (state=3): >>><<< 44071 1727204589.35846: stdout chunk (state=3): >>><<< 44071 1727204589.35861: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204589.35864: _low_level_execute_command(): starting 44071 1727204589.35871: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204589.260112-44146-236038060253292/AnsiballZ_setup.py && sleep 0' 44071 1727204589.36382: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204589.36386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204589.36388: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204589.36391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204589.36446: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204589.36453: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204589.36526: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204589.38876: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 44071 1727204589.38894: stdout chunk (state=3): >>>import _imp # builtin <<< 44071 1727204589.38925: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 44071 1727204589.38930: stdout chunk (state=3): >>>import '_weakref' # <<< 44071 1727204589.39003: stdout chunk (state=3): >>>import '_io' # <<< 44071 1727204589.39010: stdout chunk (state=3): >>>import 'marshal' # <<< 44071 1727204589.39041: stdout chunk (state=3): >>>import 'posix' # <<< 44071 1727204589.39081: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 44071 1727204589.39114: stdout chunk (state=3): >>>import 'time' # <<< 44071 1727204589.39120: stdout chunk (state=3): >>>import 'zipimport' # <<< 44071 
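The module run is invoked as PYTHONVERBOSE=1 /usr/bin/python3.12 .../AnsiballZ_setup.py, which is why everything that follows is dominated by the interpreter's import trace rather than by module output. Whether /usr/bin/python3.12 was auto-discovered or pinned is not visible in this excerpt; if pinning were wanted, the conventional place is a host or group variable like:

# Hypothetical pin; the log alone does not show whether interpreter
# discovery or an explicit setting selected /usr/bin/python3.12.
ansible_python_interpreter: /usr/bin/python3.12
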
1727204589.39123: stdout chunk (state=3): >>># installed zipimport hook <<< 44071 1727204589.39191: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204589.39226: stdout chunk (state=3): >>>import '_codecs' # <<< 44071 1727204589.39229: stdout chunk (state=3): >>>import 'codecs' # <<< 44071 1727204589.39256: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 44071 1727204589.39304: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd74a8530> <<< 44071 1727204589.39339: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7477b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd74aaab0> <<< 44071 1727204589.39382: stdout chunk (state=3): >>>import '_signal' # <<< 44071 1727204589.39408: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 44071 1727204589.39432: stdout chunk (state=3): >>>import 'io' # <<< 44071 1727204589.39463: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 44071 1727204589.39548: stdout chunk (state=3): >>>import '_collections_abc' # <<< 44071 1727204589.39597: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 44071 1727204589.39600: stdout chunk (state=3): >>>import 'os' # <<< 44071 1727204589.39667: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' <<< 44071 1727204589.39688: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 44071 1727204589.39724: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 44071 1727204589.39727: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7259190> <<< 44071 1727204589.39815: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204589.39819: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd725a090> <<< 44071 1727204589.39844: stdout chunk (state=3): >>>import 'site' # <<< 44071 1727204589.39880: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more 
information. <<< 44071 1727204589.40306: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 44071 1727204589.40310: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 44071 1727204589.40348: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204589.40351: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 44071 1727204589.40393: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 44071 1727204589.40409: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 44071 1727204589.40434: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 44071 1727204589.40480: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7297f20> <<< 44071 1727204589.40522: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 44071 1727204589.40526: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # <<< 44071 1727204589.40583: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd72ac0b0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 44071 1727204589.40587: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 44071 1727204589.40596: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 44071 1727204589.40731: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd72cf920> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd72cffb0> <<< 44071 1727204589.40745: stdout chunk (state=3): >>>import '_collections' # <<< 44071 1727204589.40794: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd72afbf0> <<< 44071 1727204589.40808: stdout chunk (state=3): >>>import '_functools' # <<< 44071 1727204589.40871: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd72ad310> <<< 44071 1727204589.40977: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd72950d0> <<< 44071 1727204589.40981: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 44071 1727204589.41330: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 44071 1727204589.41334: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd72f3860> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd72f2480> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd72ae1e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7296fc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd73248c0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7294350> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd7324d70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7324c20> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd7325010> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7292e70> <<< 44071 1727204589.41390: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204589.41393: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 44071 1727204589.41419: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 44071 1727204589.41442: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd73256d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd73253a0> import 'importlib.machinery' # <<< 44071 1727204589.41477: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 44071 1727204589.41512: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd73265d0> import 'importlib.util' # <<< 44071 1727204589.41535: stdout chunk (state=3): >>>import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 44071 1727204589.41584: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 44071 1727204589.41612: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7340800> <<< 44071 1727204589.41645: stdout chunk (state=3): >>>import 'errno' # <<< 44071 1727204589.41685: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd7341f40> <<< 44071 1727204589.41727: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7342de0> <<< 44071 1727204589.41864: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd7343440> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7342330> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 44071 1727204589.41870: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204589.41887: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd7343e30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7343560> <<< 44071 1727204589.41949: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7326630> <<< 44071 1727204589.41979: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 44071 1727204589.41982: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 44071 
1727204589.42074: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 44071 1727204589.42080: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd7077d10> <<< 44071 1727204589.42085: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 44071 1727204589.42257: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd70a07d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd70a0530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204589.42261: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd70a0800> <<< 44071 1727204589.42264: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd70a09e0> <<< 44071 1727204589.42280: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7075eb0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 44071 1727204589.42307: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 44071 1727204589.42330: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 44071 1727204589.42355: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd70a2000> <<< 44071 1727204589.42417: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd70a0c80> <<< 44071 1727204589.42430: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7326d20> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 44071 1727204589.42499: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 44071 1727204589.42541: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 44071 1727204589.42583: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd70ce3c0> <<< 44071 1727204589.42625: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 44071 1727204589.42694: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204589.42707: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 44071 1727204589.42782: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd70e6570> <<< 44071 1727204589.42792: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 44071 1727204589.42810: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 44071 1727204589.42848: stdout chunk (state=3): >>>import 'ntpath' # <<< 44071 1727204589.42902: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd711f2f0> <<< 44071 1727204589.42951: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 44071 1727204589.42954: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 44071 1727204589.43010: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 44071 1727204589.43043: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 44071 1727204589.43133: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7145a90> <<< 44071 1727204589.43257: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd711f410> <<< 44071 1727204589.43276: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd70e7200> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6f20470> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd70e55b0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd70a2f60> <<< 44071 1727204589.43440: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 44071 1727204589.43484: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fcdd6f20740> <<< 44071 1727204589.43688: stdout chunk (state=3): >>># zipimport: found 103 
names in '/tmp/ansible_setup_payload_3ah2pa6m/ansible_setup_payload.zip' # zipimport: zlib available <<< 44071 1727204589.43818: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 44071 1727204589.43903: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 44071 1727204589.43908: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 44071 1727204589.44079: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 44071 1727204589.44101: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6f8e1e0> import '_typing' # <<< 44071 1727204589.44205: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6f650d0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6f64290> # zipimport: zlib available <<< 44071 1727204589.44238: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 44071 1727204589.44256: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.44292: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 44071 1727204589.46010: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.47152: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6f67800> <<< 44071 1727204589.47220: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204589.47238: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 44071 1727204589.47281: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6fbdbe0> <<< 44071 1727204589.47452: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6fbd9a0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6fbd2b0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6fbda30> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6f8ee70> import 'atexit' # <<< 44071 1727204589.47459: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6fbe8d0> <<< 44071 1727204589.47489: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6fbeb10> <<< 44071 1727204589.47507: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 44071 1727204589.47549: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 44071 1727204589.47577: stdout chunk (state=3): >>>import '_locale' # <<< 44071 1727204589.47646: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6fbeff0> <<< 44071 1727204589.47672: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 44071 1727204589.47700: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e24dd0> <<< 44071 1727204589.47730: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6e269f0> <<< 44071 1727204589.47899: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e27380> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e28560> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 44071 1727204589.47930: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 44071 1727204589.47949: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 44071 1727204589.48019: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e2aff0> <<< 44071 1727204589.48141: stdout chunk (state=3): 
>>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6e2b2f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e292b0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 44071 1727204589.48164: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 44071 1727204589.48189: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 44071 1727204589.48241: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 44071 1727204589.48244: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e2ef00> import '_tokenize' # <<< 44071 1727204589.48377: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e2d9d0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e2d730> <<< 44071 1727204589.48385: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 44071 1727204589.48419: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e2fc20> <<< 44071 1727204589.48517: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e297c0> <<< 44071 1727204589.48533: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6e72fc0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e73200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 44071 1727204589.48564: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 44071 1727204589.48620: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' 
<<< 44071 1727204589.48693: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6e78d40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e78b00> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 44071 1727204589.48811: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6e7b290> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e79400> <<< 44071 1727204589.48906: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 44071 1727204589.48921: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 44071 1727204589.48953: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e82a50> <<< 44071 1727204589.49136: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e7b3e0> <<< 44071 1727204589.49191: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6e83d70> <<< 44071 1727204589.49254: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6e83aa0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6e83bc0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e73380> <<< 44071 1727204589.49291: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 44071 
1727204589.49312: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 44071 1727204589.49366: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 44071 1727204589.49370: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204589.49390: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6e87470> <<< 44071 1727204589.49562: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204589.49613: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6e886b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e85be0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6e86f90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e85880> <<< 44071 1727204589.49639: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.49653: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 44071 1727204589.49753: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.49848: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.49882: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 44071 1727204589.49912: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 44071 1727204589.49928: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.50042: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.50169: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.50806: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.51486: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204589.51751: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6d108f0> # 
/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6d11730> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e8be60> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 44071 1727204589.51755: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.51757: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 44071 1727204589.51922: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.52151: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6d11370> # zipimport: zlib available <<< 44071 1727204589.52637: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.53142: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.53210: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.53294: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 44071 1727204589.53344: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.53378: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 44071 1727204589.53411: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.53453: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.53556: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available <<< 44071 1727204589.53589: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 44071 1727204589.53620: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.53647: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.53682: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 44071 1727204589.53714: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.53954: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.54220: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 44071 1727204589.54292: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 44071 1727204589.54371: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6d13920> <<< 44071 1727204589.54395: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.54462: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.54555: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 44071 1727204589.54587: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 44071 1727204589.54604: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 44071 1727204589.54688: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204589.54818: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6d1a030> <<< 44071 1727204589.54891: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6d1a8a0> <<< 44071 1727204589.54915: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e78320> # zipimport: zlib available <<< 44071 1727204589.54953: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.55013: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 44071 1727204589.55060: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.55276: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.55279: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.55298: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204589.55385: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204589.55399: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6d197c0> <<< 44071 1727204589.55431: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6d1a6f0> <<< 44071 1727204589.55469: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 44071 1727204589.55481: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 44071 1727204589.55760: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 44071 1727204589.55778: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 44071 1727204589.55840: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 44071 1727204589.55861: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 44071 1727204589.55883: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 44071 1727204589.55939: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6dae9c0> <<< 44071 1727204589.55989: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6d245f0> <<< 44071 1727204589.56080: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6d227e0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6d19610> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 44071 1727204589.56098: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.56116: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.56147: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 44071 1727204589.56211: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 44071 1727204589.56240: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 44071 1727204589.56259: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.56335: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.56399: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.56436: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.56439: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.56496: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.56531: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.56574: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.56613: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 44071 1727204589.56628: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.56711: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.56788: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.56813: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.56868: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 44071 1727204589.56871: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.57068: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.57262: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.57306: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.57360: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204589.57393: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 44071 
1727204589.57418: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 44071 1727204589.57433: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 44071 1727204589.57451: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 44071 1727204589.57475: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6db1850> <<< 44071 1727204589.57503: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 44071 1727204589.57535: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 44071 1727204589.57538: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 44071 1727204589.57581: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 44071 1727204589.57623: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 44071 1727204589.57643: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd630bf50> <<< 44071 1727204589.57677: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204589.57689: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6310290> <<< 44071 1727204589.57745: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6d8d010> <<< 44071 1727204589.57770: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6d8e3c0> <<< 44071 1727204589.57831: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6db3e60> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6db3740> <<< 44071 1727204589.57834: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 44071 1727204589.57924: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 44071 1727204589.57947: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 44071 1727204589.57952: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 44071 1727204589.57979: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 44071 1727204589.58014: 
stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204589.58038: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6313380> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6312c30> <<< 44071 1727204589.58057: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6312de0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6312060> <<< 44071 1727204589.58089: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 44071 1727204589.58225: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 44071 1727204589.58228: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6313440> <<< 44071 1727204589.58240: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 44071 1727204589.58273: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 44071 1727204589.58310: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd637df70> <<< 44071 1727204589.58340: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6313f50> <<< 44071 1727204589.58382: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6db3a40> import 'ansible.module_utils.facts.timeout' # <<< 44071 1727204589.58416: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 44071 1727204589.58439: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 44071 1727204589.58449: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 44071 1727204589.58532: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.58590: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 44071 1727204589.58595: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.58656: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.58708: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 44071 1727204589.58736: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.58759: stdout chunk (state=3): >>># zipimport: zlib 
available import 'ansible.module_utils.facts.system' # <<< 44071 1727204589.58780: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.58792: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.58823: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 44071 1727204589.58833: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.58888: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.58951: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 44071 1727204589.58954: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.59000: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.59044: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 44071 1727204589.59055: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.59118: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.59190: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.59248: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.59322: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 44071 1727204589.59333: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 44071 1727204589.59902: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.60391: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 44071 1727204589.60403: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.60435: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.60503: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.60535: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.60578: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 44071 1727204589.60594: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.60618: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.60667: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 44071 1727204589.60727: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.60796: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 44071 1727204589.60807: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.60851: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.60900: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 44071 1727204589.60916: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.60951: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 44071 1727204589.60960: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.61048: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.61147: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 44071 1727204589.61172: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' 
import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd637e1b0> <<< 44071 1727204589.61193: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 44071 1727204589.61225: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 44071 1727204589.61363: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd637ed80> import 'ansible.module_utils.facts.system.local' # <<< 44071 1727204589.61377: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.61449: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.61520: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 44071 1727204589.61543: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.61633: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.61735: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 44071 1727204589.61744: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.61811: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.61902: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 44071 1727204589.61912: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.61947: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.62006: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 44071 1727204589.62047: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 44071 1727204589.62420: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204589.62424: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd63ae300> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6396180> import 'ansible.module_utils.facts.system.python' # <<< 44071 1727204589.62440: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.62491: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.62550: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 44071 1727204589.62564: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.62650: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.62744: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.62869: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.63025: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 44071 1727204589.63076: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 44071 1727204589.63092: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.63177: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 44071 1727204589.63190: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 44071 1727204589.63241: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 44071 1727204589.63275: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204589.63308: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6161e80> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd63ad9d0> import 'ansible.module_utils.facts.system.user' # <<< 44071 1727204589.63366: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.63371: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 44071 1727204589.63392: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.63499: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 44071 1727204589.63643: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.63816: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 44071 1727204589.63932: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.64049: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.64095: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.64146: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 44071 1727204589.64161: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 44071 1727204589.64183: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.64246: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.64361: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.64528: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 44071 1727204589.64541: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.64709: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.64823: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 44071 1727204589.64827: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.64852: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.64898: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.65535: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.66129: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 44071 1727204589.66135: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.66238: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.66350: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 44071 1727204589.66380: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 
1727204589.66471: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.66583: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 44071 1727204589.66601: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.66756: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.66944: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 44071 1727204589.66963: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 44071 1727204589.66984: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.67028: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.67077: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 44071 1727204589.67090: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.67197: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.67306: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.67532: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.67822: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available <<< 44071 1727204589.67902: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 44071 1727204589.67906: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.68072: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 44071 1727204589.68122: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available <<< 44071 1727204589.68157: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 44071 1727204589.68160: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.68227: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.68300: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 44071 1727204589.68312: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.68359: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.68429: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 44071 1727204589.68445: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.68731: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.69028: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 44071 1727204589.69059: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.69103: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.69176: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 44071 1727204589.69179: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.69215: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.69259: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 44071 1727204589.69262: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 
1727204589.69296: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.69333: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 44071 1727204589.69351: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.69386: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.69438: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 44071 1727204589.69441: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.69620: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.69624: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 44071 1727204589.69627: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.69808: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 44071 1727204589.69812: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 44071 1727204589.69814: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.69816: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.69855: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.69911: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.70099: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 44071 1727204589.70146: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.70210: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 44071 1727204589.70429: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.70646: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 44071 1727204589.70671: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.70705: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.70758: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 44071 1727204589.70781: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.70822: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.70871: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 44071 1727204589.70893: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.70974: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.71070: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 44071 1727204589.71094: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.71181: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.71357: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 44071 1727204589.71535: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204589.71564: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 44071 1727204589.71586: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 44071 1727204589.71597: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 44071 1727204589.71628: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd618a960> <<< 44071 1727204589.71654: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd618b440> <<< 44071 1727204589.71701: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6184e30> <<< 44071 1727204589.72875: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_fips": false, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": 
"UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_service_mgr": "systemd", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "09", "epoch": "1727204589", "epoch_int": "1727204589", "date": "2024-09-24", "time": "15:03:09", "iso8601_micro": "2024-09-24T19:03:09.726260Z", "iso8601": "2024-09-24T19:03:09Z", "iso8601_basic": "20240924T150309726260", "iso8601_basic_short": "20240924T150309", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 44071 1727204589.73519: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] 
removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools <<< 44071 1727204589.73636: stdout chunk (state=3): >>># cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing <<< 44071 1727204589.73657: stdout chunk (state=3): >>># cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors 
# cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing 
distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter <<< 44071 1727204589.73714: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing 
ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd <<< 44071 1727204589.73798: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg <<< 44071 1727204589.73814: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy 
ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 44071 1727204589.74064: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 44071 1727204589.74324: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 44071 1727204589.74327: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 44071 1727204589.74371: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy 
ansible.module_utils.facts.ansible_collector <<< 44071 1727204589.74410: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 44071 1727204589.74439: stdout chunk (state=3): >>># destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 44071 1727204589.74475: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess <<< 44071 1727204589.74491: stdout chunk (state=3): >>># destroy base64 # destroy _ssl <<< 44071 1727204589.74525: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json <<< 44071 1727204589.74568: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 44071 1727204589.74572: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 44071 1727204589.74611: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep<<< 44071 1727204589.74662: stdout chunk (state=3): >>> # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 44071 1727204589.74699: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 44071 1727204589.74730: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator <<< 44071 1727204589.74875: stdout chunk (state=3): >>># cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] 
wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external <<< 44071 1727204589.74878: stdout chunk (state=3): >>># cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 44071 1727204589.74942: stdout chunk (state=3): >>># destroy sys.monitoring <<< 44071 1727204589.74985: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 44071 1727204589.75002: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 44071 1727204589.75061: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 44071 1727204589.75087: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 44071 1727204589.75227: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 44071 1727204589.75232: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8<<< 44071 1727204589.75235: stdout chunk (state=3): >>> # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 44071 1727204589.75255: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 44071 1727204589.75305: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 <<< 44071 1727204589.75381: stdout chunk (state=3): >>># destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 44071 1727204589.75400: stdout chunk (state=3): >>># clear sys.audit hooks <<< 44071 1727204589.75870: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204589.76223: stderr chunk (state=3): >>><<< 44071 1727204589.76226: stdout chunk (state=3): >>><<< 44071 1727204589.76287: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd74a8530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7477b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd74aaab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7259190> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd725a090> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7297f20> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd72ac0b0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd72cf920> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd72cffb0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd72afbf0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd72ad310> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd72950d0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd72f3860> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd72f2480> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd72ae1e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7296fc0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd73248c0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7294350> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd7324d70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7324c20> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd7325010> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7292e70> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd73256d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd73253a0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd73265d0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7340800> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd7341f40> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fcdd7342de0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd7343440> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7342330> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd7343e30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7343560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7326630> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd7077d10> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd70a07d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd70a0530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd70a0800> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd70a09e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7075eb0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd70a2000> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd70a0c80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7326d20> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd70ce3c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd70e6570> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd711f2f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd7145a90> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd711f410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd70e7200> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6f20470> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd70e55b0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd70a2f60> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fcdd6f20740> # zipimport: found 103 names in '/tmp/ansible_setup_payload_3ah2pa6m/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6f8e1e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6f650d0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6f64290> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6f67800> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6fbdbe0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6fbd9a0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6fbd2b0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6fbda30> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6f8ee70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6fbe8d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6fbeb10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6fbeff0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e24dd0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6e269f0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e27380> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e28560> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e2aff0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6e2b2f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e292b0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e2ef00> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e2d9d0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e2d730> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e2fc20> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e297c0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6e72fc0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e73200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6e78d40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e78b00> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6e7b290> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e79400> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e82a50> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e7b3e0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6e83d70> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6e83aa0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6e83bc0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e73380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6e87470> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6e886b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e85be0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6e86f90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e85880> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6d108f0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6d11730> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e8be60> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6d11370> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6d13920> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6d1a030> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6d1a8a0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6e78320> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6d197c0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6d1a6f0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6dae9c0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6d245f0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6d227e0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6d19610> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6db1850> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcdd630bf50> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6310290> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6d8d010> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6d8e3c0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6db3e60> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6db3740> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6313380> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6312c30> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6312de0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6312060> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6313440> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd637df70> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6313f50> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6db3a40> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd637e1b0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd637ed80> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd63ae300> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6396180> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd6161e80> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd63ad9d0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdd618a960> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd618b440> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdd6184e30> {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_fips": false, 
"ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_service_mgr": "systemd", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "09", "epoch": "1727204589", "epoch_int": "1727204589", "date": "2024-09-24", "time": "15:03:09", "iso8601_micro": "2024-09-24T19:03:09.726260Z", "iso8601": 
"2024-09-24T19:03:09Z", "iso8601_basic": "20240924T150309726260", "iso8601_basic_short": "20240924T150309", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # 
cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # 
destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing 
ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing 
ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep 
# cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping 
operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
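The "[WARNING]: Module invocation had junk after the JSON data" that follows is a side effect of PYTHONVERBOSE=1 being set in the remote environment (visible in ansible_env above): the interpreter prints the import and cleanup chatter seen throughout this output around the setup module's JSON result, and the controller has to pick the facts dict back out of that noise before the task can still report ok. A minimal sketch of that recovery idea, assuming the payload's string values contain no curly braces; this is an illustration only, not Ansible's actual parsing code, and the helper name is invented:

import json

def extract_json_payload(raw: str) -> dict:
    # Naive illustration: scan for the first balanced {...} block in the
    # captured stdout and decode it, ignoring the verbose-import noise
    # before and after it. Assumes no curly braces inside string values.
    start = raw.index("{")
    depth = 0
    for i in range(start, len(raw)):
        if raw[i] == "{":
            depth += 1
        elif raw[i] == "}":
            depth -= 1
            if depth == 0:
                return json.loads(raw[start:i + 1])
    raise ValueError("no complete JSON object found in module output")

# Hypothetical usage: 'captured_stdout' stands for the module output shown
# above (import chatter, then the facts JSON, then cleanup chatter).
# facts = extract_json_payload(captured_stdout)["ansible_facts"]
# print(facts["ansible_distribution"], facts["ansible_distribution_version"])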
Shared connection to 10.31.47.73 closed. [WARNING]: Module invocation had junk after the JSON data:
44071 1727204589.78419: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204589.260112-44146-236038060253292/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204589.78422: _low_level_execute_command(): starting 44071 1727204589.78425: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204589.260112-44146-236038060253292/ > /dev/null 2>&1 && sleep 0' 44071 1727204589.78822: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204589.78831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204589.78837: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing
configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204589.78845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204589.78847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204589.78884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204589.78908: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204589.79269: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204589.79405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204589.81414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204589.81694: stderr chunk (state=3): >>><<< 44071 1727204589.81698: stdout chunk (state=3): >>><<< 44071 1727204589.81701: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204589.81704: handler run complete 44071 1727204589.81706: variable 'ansible_facts' from source: unknown 44071 1727204589.81910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204589.82158: variable 'ansible_facts' from source: unknown 44071 1727204589.82318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204589.82576: attempt loop complete, returning result 44071 1727204589.82580: _execute() done 44071 1727204589.82582: dumping result to json 44071 1727204589.82584: done dumping result, returning 44071 1727204589.82587: done running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [127b8e07-fff9-c964-7471-00000000002c] 44071 1727204589.82589: sending task result for task 127b8e07-fff9-c964-7471-00000000002c ok: [managed-node2] 44071 1727204589.83023: no more pending results, returning what we have 44071 1727204589.83026: results queue empty 44071 1727204589.83027: checking for any_errors_fatal 44071 1727204589.83029: done checking for 
any_errors_fatal 44071 1727204589.83030: checking for max_fail_percentage 44071 1727204589.83031: done checking for max_fail_percentage 44071 1727204589.83032: checking to see if all hosts have failed and the running result is not ok 44071 1727204589.83033: done checking to see if all hosts have failed 44071 1727204589.83034: getting the remaining hosts for this loop 44071 1727204589.83035: done getting the remaining hosts for this loop 44071 1727204589.83041: getting the next task for host managed-node2 44071 1727204589.83050: done getting next task for host managed-node2 44071 1727204589.83052: ^ task is: TASK: Check if system is ostree 44071 1727204589.83055: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204589.83059: getting variables 44071 1727204589.83061: in VariableManager get_vars() 44071 1727204589.83401: Calling all_inventory to load vars for managed-node2 44071 1727204589.83405: Calling groups_inventory to load vars for managed-node2 44071 1727204589.83409: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204589.83422: Calling all_plugins_play to load vars for managed-node2 44071 1727204589.83425: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204589.83429: Calling groups_plugins_play to load vars for managed-node2 44071 1727204589.83981: done sending task result for task 127b8e07-fff9-c964-7471-00000000002c 44071 1727204589.83986: WORKER PROCESS EXITING 44071 1727204589.84021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204589.84588: done with get_vars() 44071 1727204589.84605: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 15:03:09 -0400 (0:00:00.671) 0:00:02.164 ***** 44071 1727204589.84839: entering _queue_task() for managed-node2/stat 44071 1727204589.85594: worker is 1 (out of 1 available) 44071 1727204589.85606: exiting _queue_task() for managed-node2/stat 44071 1727204589.85620: done queuing things up, now waiting for results queue to drain 44071 1727204589.85622: waiting for pending results... 
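For reference, the two tasks being executed in this stretch of the trace can be reconstructed roughly as the playbook snippet below. The task names, the gather_subset value, the stat action, and the `not __network_is_ostree is defined` conditional are all taken from the trace itself (the task path points at tasks/el_repo_setup.yml); the stat path and the register variable name are assumptions added for illustration, since the module arguments for the ostree check are not visible in this excerpt.

    - name: Gather the minimum subset of ansible_facts required by the network role test
      setup:
        gather_subset: min

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted          # assumed marker file; the actual path is not shown in this excerpt
      register: __ostree_booted_stat      # hypothetical register name for illustration
      when: not __network_is_ostree is defined

The conditional in the sketch corresponds to the "Evaluated conditional (not __network_is_ostree is defined): True" line in the trace below, which is why the stat task is queued and a second AnsiballZ payload is built and copied to the managed node.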
44071 1727204589.86126: running TaskExecutor() for managed-node2/TASK: Check if system is ostree 44071 1727204589.86303: in run() - task 127b8e07-fff9-c964-7471-00000000002e 44071 1727204589.86440: variable 'ansible_search_path' from source: unknown 44071 1727204589.86470: variable 'ansible_search_path' from source: unknown 44071 1727204589.86572: calling self._execute() 44071 1727204589.86747: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204589.86762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204589.86778: variable 'omit' from source: magic vars 44071 1727204589.87917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204589.88547: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204589.88820: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204589.88825: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204589.88913: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204589.89134: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204589.89234: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204589.89362: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204589.89382: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204589.89795: Evaluated conditional (not __network_is_ostree is defined): True 44071 1727204589.89798: variable 'omit' from source: magic vars 44071 1727204589.89801: variable 'omit' from source: magic vars 44071 1727204589.89946: variable 'omit' from source: magic vars 44071 1727204589.90046: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204589.90171: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204589.90206: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204589.90447: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204589.90451: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204589.90453: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204589.90456: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204589.90459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204589.90705: Set connection var ansible_connection to ssh 44071 1727204589.90719: Set connection var ansible_timeout to 10 44071 1727204589.90730: Set connection var 
ansible_pipelining to False 44071 1727204589.90783: Set connection var ansible_shell_type to sh 44071 1727204589.90795: Set connection var ansible_shell_executable to /bin/sh 44071 1727204589.90814: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204589.90907: variable 'ansible_shell_executable' from source: unknown 44071 1727204589.91024: variable 'ansible_connection' from source: unknown 44071 1727204589.91028: variable 'ansible_module_compression' from source: unknown 44071 1727204589.91031: variable 'ansible_shell_type' from source: unknown 44071 1727204589.91033: variable 'ansible_shell_executable' from source: unknown 44071 1727204589.91035: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204589.91038: variable 'ansible_pipelining' from source: unknown 44071 1727204589.91040: variable 'ansible_timeout' from source: unknown 44071 1727204589.91043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204589.91841: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204589.91904: variable 'omit' from source: magic vars 44071 1727204589.91916: starting attempt loop 44071 1727204589.92034: running the handler 44071 1727204589.92038: _low_level_execute_command(): starting 44071 1727204589.92040: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204589.94114: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204589.94212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204589.94326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204589.96129: stdout chunk (state=3): >>>/root <<< 44071 1727204589.96328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204589.96332: stdout chunk (state=3): >>><<< 44071 1727204589.96334: stderr chunk (state=3): >>><<< 44071 1727204589.96655: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204589.96669: _low_level_execute_command(): starting 44071 1727204589.96674: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204589.9654627-44248-220861030854526 `" && echo ansible-tmp-1727204589.9654627-44248-220861030854526="` echo /root/.ansible/tmp/ansible-tmp-1727204589.9654627-44248-220861030854526 `" ) && sleep 0' 44071 1727204589.97936: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204589.97941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204589.98114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204589.98130: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204589.98358: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204590.00268: stdout chunk (state=3): >>>ansible-tmp-1727204589.9654627-44248-220861030854526=/root/.ansible/tmp/ansible-tmp-1727204589.9654627-44248-220861030854526 <<< 44071 1727204590.00594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204590.00674: stderr chunk (state=3): >>><<< 44071 1727204590.00935: stdout chunk (state=3): >>><<< 44071 1727204590.00940: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204589.9654627-44248-220861030854526=/root/.ansible/tmp/ansible-tmp-1727204589.9654627-44248-220861030854526 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204590.00942: variable 'ansible_module_compression' from source: unknown 44071 1727204590.00996: ANSIBALLZ: Using lock for stat 44071 1727204590.01051: ANSIBALLZ: Acquiring lock 44071 1727204590.01059: ANSIBALLZ: Lock acquired: 140077513493680 44071 1727204590.01071: ANSIBALLZ: Creating module 44071 1727204590.35812: ANSIBALLZ: Writing module into payload 44071 1727204590.35863: ANSIBALLZ: Writing module 44071 1727204590.35895: ANSIBALLZ: Renaming module 44071 1727204590.35905: ANSIBALLZ: Done creating module 44071 1727204590.35936: variable 'ansible_facts' from source: unknown 44071 1727204590.36027: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204589.9654627-44248-220861030854526/AnsiballZ_stat.py 44071 1727204590.36251: Sending initial data 44071 1727204590.36260: Sent initial data (153 bytes) 44071 1727204590.37026: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204590.37121: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204590.37173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204590.37193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204590.37220: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204590.37360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204590.39312: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: 
Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204590.39371: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204590.39447: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp8srof1n2 /root/.ansible/tmp/ansible-tmp-1727204589.9654627-44248-220861030854526/AnsiballZ_stat.py <<< 44071 1727204590.39451: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204589.9654627-44248-220861030854526/AnsiballZ_stat.py" <<< 44071 1727204590.39715: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp8srof1n2" to remote "/root/.ansible/tmp/ansible-tmp-1727204589.9654627-44248-220861030854526/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204589.9654627-44248-220861030854526/AnsiballZ_stat.py" <<< 44071 1727204590.40987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204590.41136: stderr chunk (state=3): >>><<< 44071 1727204590.41338: stdout chunk (state=3): >>><<< 44071 1727204590.41341: done transferring module to remote 44071 1727204590.41343: _low_level_execute_command(): starting 44071 1727204590.41345: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204589.9654627-44248-220861030854526/ /root/.ansible/tmp/ansible-tmp-1727204589.9654627-44248-220861030854526/AnsiballZ_stat.py && sleep 0' 44071 1727204590.42875: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204590.42880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204590.42883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204590.43139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204590.43153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204590.43300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204590.43397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204590.45379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 44071 1727204590.45472: stderr chunk (state=3): >>><<< 44071 1727204590.45484: stdout chunk (state=3): >>><<< 44071 1727204590.45552: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204590.45562: _low_level_execute_command(): starting 44071 1727204590.45672: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204589.9654627-44248-220861030854526/AnsiballZ_stat.py && sleep 0' 44071 1727204590.46752: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204590.46757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204590.46760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204590.47119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204590.47471: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204590.49827: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # <<< 44071 1727204590.49857: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 44071 1727204590.49902: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204590.49925: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # <<< 44071 1727204590.49969: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 44071 1727204590.49998: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b75a4530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7573b30> <<< 44071 1727204590.50083: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 44071 1727204590.50110: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b75a6ab0> import '_signal' # import '_abc' # import 'abc' # <<< 44071 1727204590.50124: stdout chunk (state=3): >>>import 'io' # <<< 44071 1727204590.50192: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 44071 1727204590.50245: stdout chunk (state=3): >>>import '_collections_abc' # <<< 44071 1727204590.50282: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 44071 1727204590.50314: stdout chunk (state=3): >>>import 'os' # <<< 44071 1727204590.50378: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' <<< 44071 1727204590.50450: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b73991c0> <<< 44071 1727204590.50622: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204590.50650: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b739a0c0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 44071 1727204590.50815: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 44071 1727204590.50916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 44071 1727204590.50993: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 44071 1727204590.51033: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b73d7fb0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 44071 1727204590.51094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b73ec140> <<< 44071 1727204590.51301: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 44071 1727204590.51309: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 44071 1727204590.51314: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 44071 1727204590.51316: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204590.51319: stdout chunk (state=3): >>>import 'itertools' # <<< 44071 1727204590.51321: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 44071 1727204590.51323: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b740f950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 44071 1727204590.51325: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b740ffe0> <<< 44071 1727204590.51429: stdout chunk (state=3): >>>import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b73efc20> import '_functools' # <<< 44071 1727204590.51432: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b73ed3a0> <<< 44071 1727204590.51460: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b73d5160> <<< 44071 1727204590.51506: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 44071 1727204590.51510: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 44071 1727204590.51628: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 44071 1727204590.51644: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b74338f0> <<< 44071 1727204590.51671: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7432510> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 44071 1727204590.51684: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b73ee240> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7430d70> <<< 44071 1727204590.51776: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 44071 1727204590.51780: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7460980> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b73d43e0> <<< 44071 1727204590.51861: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b7460e30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7460ce0> <<< 44071 1727204590.51959: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b74610d0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b73d2f00> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204590.51992: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b74617c0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7461490> import 'importlib.machinery' # <<< 44071 1727204590.52099: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 44071 1727204590.52169: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b74626c0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b747c8c0> <<< 44071 1727204590.52173: stdout chunk (state=3): >>>import 'errno' # <<< 44071 1727204590.52198: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b747dfd0> <<< 44071 1727204590.52285: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 44071 1727204590.52289: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b747ee10> <<< 44071 1727204590.52440: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b747f440> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b747e360> <<< 44071 1727204590.52447: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 44071 1727204590.52450: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b747fe00> <<< 44071 1727204590.52644: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b747f530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b74626f0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded 
from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b7243cb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204590.52648: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b726c800> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b726c560> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b726c830> <<< 44071 1727204590.52692: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b726ca10> <<< 44071 1727204590.52710: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7241e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 44071 1727204590.52815: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 44071 1727204590.52914: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b726e0f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b726cd70> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7462de0> <<< 44071 1727204590.52953: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 44071 1727204590.52991: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204590.53008: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 44071 1727204590.53241: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b729a3f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches 
/usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b72b2570> <<< 44071 1727204590.53255: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 44071 1727204590.53284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 44071 1727204590.53347: stdout chunk (state=3): >>>import 'ntpath' # <<< 44071 1727204590.53373: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b72eb350> <<< 44071 1727204590.53421: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 44071 1727204590.53446: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 44071 1727204590.53493: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 44071 1727204590.53590: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7311af0> <<< 44071 1727204590.53660: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b72eb470> <<< 44071 1727204590.53701: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b72b3200> <<< 44071 1727204590.53746: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b70f0500> <<< 44071 1727204590.53761: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b72b15b0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b726ef90> <<< 44071 1727204590.53863: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 44071 1727204590.53996: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fe6b70f07a0> <<< 44071 1727204590.54010: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_m588eiew/ansible_stat_payload.zip' # zipimport: zlib available <<< 44071 1727204590.54115: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.54144: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 44071 1727204590.54190: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 44071 1727204590.54262: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 44071 1727204590.54298: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b714a300> <<< 44071 1727204590.54324: stdout chunk (state=3): >>>import '_typing' # <<< 44071 1727204590.54511: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b71211f0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7120350> # zipimport: zlib available <<< 44071 1727204590.54615: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 44071 1727204590.56175: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.57478: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 44071 1727204590.57554: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7123710> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b7175dc0> <<< 44071 1727204590.57576: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7175b50> <<< 44071 1727204590.57647: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7175460> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 44071 1727204590.57707: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7175940> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b714af90> <<< 44071 1727204590.57748: stdout chunk (state=3): >>>import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b7176a20> <<< 44071 1727204590.57853: stdout chunk (state=3): >>># extension module 'fcntl' loaded 
from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b7176c00> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 44071 1727204590.57857: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 44071 1727204590.57916: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7177050> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 44071 1727204590.57948: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6fd8e60> <<< 44071 1727204590.58125: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b6fdaa80> <<< 44071 1727204590.58128: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 44071 1727204590.58131: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6fdb440> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6fdc320> <<< 44071 1727204590.58156: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 44071 1727204590.58195: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 44071 1727204590.58198: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 44071 1727204590.58256: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6fdf050> <<< 44071 1727204590.58337: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b6fdf1d0> <<< 44071 1727204590.58359: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6fdd340> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 44071 1727204590.58500: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 44071 1727204590.58503: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6fe2f60> import '_tokenize' # <<< 44071 1727204590.58603: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6fe1a30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6fe1790> <<< 44071 1727204590.58607: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 44071 1727204590.58818: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6fe3e90> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6fdd7c0> <<< 44071 1727204590.58822: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b702b170> <<< 44071 1727204590.58824: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b702b260> <<< 44071 1727204590.58831: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 44071 1727204590.58835: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 44071 1727204590.58863: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b702ce90> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b702cc50> <<< 44071 1727204590.58882: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 44071 1727204590.58991: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 44071 1727204590.59033: stdout chunk (state=3): >>># 
extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so'<<< 44071 1727204590.59054: stdout chunk (state=3): >>> import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b702f380> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b702d520> <<< 44071 1727204590.59069: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 44071 1727204590.59121: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204590.59142: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 44071 1727204590.59152: stdout chunk (state=3): >>>import '_string' # <<< 44071 1727204590.59198: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7036b40> <<< 44071 1727204590.59333: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b702f4d0> <<< 44071 1727204590.59407: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b7037860> <<< 44071 1727204590.59444: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b7037bc0> <<< 44071 1727204590.59510: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b7037c50> <<< 44071 1727204590.59514: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b702b590> <<< 44071 1727204590.59577: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 44071 1727204590.59585: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 44071 1727204590.59624: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 44071 1727204590.59628: stdout chunk (state=3): >>># extension module '_socket' loaded from 
'/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204590.59647: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b703b470> <<< 44071 1727204590.59818: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204590.59834: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b703c650> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7039c10> <<< 44071 1727204590.59870: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204590.59901: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b703afc0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7039820> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 44071 1727204590.59921: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.60026: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.60154: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.60158: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 44071 1727204590.60185: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 44071 1727204590.60328: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.60684: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.61063: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.61685: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 44071 1727204590.61713: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 44071 1727204590.61735: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204590.61793: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b70c4830> <<< 44071 1727204590.61890: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 44071 
1727204590.61910: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b70c5640> <<< 44071 1727204590.61929: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b703f0e0> <<< 44071 1727204590.61967: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 44071 1727204590.61993: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.62005: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.62031: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 44071 1727204590.62198: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.62375: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 44071 1727204590.62396: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b70c5400> # zipimport: zlib available <<< 44071 1727204590.62925: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.63430: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.63506: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.63589: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 44071 1727204590.63606: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.63639: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.63676: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 44071 1727204590.63702: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.63767: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.63858: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 44071 1727204590.63888: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.63907: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 44071 1727204590.63925: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.63956: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.64007: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 44071 1727204590.64010: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.64269: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.64587: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 44071 1727204590.64620: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 44071 1727204590.64880: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b70c6450> # zipimport: zlib available <<< 44071 1727204590.64987: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.65000: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc 
matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204590.65109: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b6ed2000> <<< 44071 1727204590.65160: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 44071 1727204590.65185: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b6ed28a0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b70c72f0> <<< 44071 1727204590.65203: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.65245: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.65292: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 44071 1727204590.65321: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.65348: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.65398: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.65461: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.65674: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 44071 1727204590.65718: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b6ed1700> <<< 44071 1727204590.65892: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6ed29c0> <<< 44071 1727204590.65931: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 44071 1727204590.66207: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc 
matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 44071 1727204590.66212: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6f62c90> <<< 44071 1727204590.66260: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6edc920> <<< 44071 1727204590.66354: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6edaab0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6eda900> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 44071 1727204590.66381: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.66398: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.66435: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 44071 1727204590.66489: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 44071 1727204590.66512: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.66537: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 44071 1727204590.66697: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.66992: stdout chunk (state=3): >>># zipimport: zlib available <<< 44071 1727204590.67057: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 44071 1727204590.67473: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib <<< 44071 1727204590.67481: stdout chunk (state=3): >>># cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat <<< 44071 1727204590.67494: stdout chunk (state=3): >>># cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] 
removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre <<< 44071 1727204590.67508: stdout chunk (state=3): >>># cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json <<< 44071 1727204590.67520: stdout chunk (state=3): >>># cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl <<< 44071 1727204590.67527: stdout chunk (state=3): >>># cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid <<< 44071 1727204590.67549: stdout chunk (state=3): >>># cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # 
cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat <<< 44071 1727204590.67599: stdout chunk (state=3): >>># destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec <<< 44071 1727204590.67614: stdout chunk (state=3): >>># destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 44071 1727204590.67878: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 44071 
1727204590.67910: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 44071 1727204590.67939: stdout chunk (state=3): >>># destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 44071 1727204590.68010: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal <<< 44071 1727204590.68017: stdout chunk (state=3): >>># destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 44071 1727204590.68080: stdout chunk (state=3): >>># destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 44071 1727204590.68087: stdout chunk (state=3): >>># destroy selectors # destroy errno <<< 44071 1727204590.68092: stdout chunk (state=3): >>># destroy array <<< 44071 1727204590.68131: stdout chunk (state=3): >>># destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 44071 1727204590.68173: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 44071 1727204590.68235: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback <<< 44071 1727204590.68288: stdout chunk (state=3): >>># destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 44071 1727204590.68294: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants <<< 44071 1727204590.68332: stdout chunk (state=3): >>># destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 44071 1727204590.68385: stdout chunk (state=3): >>># destroy posixpath # 
cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib <<< 44071 1727204590.68399: stdout chunk (state=3): >>># cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 44071 1727204590.68540: stdout chunk (state=3): >>># destroy sys.monitoring <<< 44071 1727204590.68573: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 44071 1727204590.68620: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib <<< 44071 1727204590.68623: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib <<< 44071 1727204590.68662: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator <<< 44071 1727204590.68690: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 44071 1727204590.68812: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs <<< 44071 1727204590.68815: stdout chunk (state=3): >>># destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 44071 1727204590.68856: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref <<< 44071 1727204590.68859: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _string # destroy re <<< 44071 1727204590.68907: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 44071 1727204590.68910: stdout chunk (state=3): >>># clear sys.audit hooks <<< 44071 1727204590.69412: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204590.69416: stdout chunk (state=3): >>><<< 44071 1727204590.69418: stderr chunk (state=3): >>><<< 44071 1727204590.69591: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b75a4530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7573b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b75a6ab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b73991c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b739a0c0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b73d7fb0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b73ec140> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b740f950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b740ffe0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b73efc20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b73ed3a0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b73d5160> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b74338f0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7432510> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b73ee240> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7430d70> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7460980> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b73d43e0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b7460e30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7460ce0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b74610d0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b73d2f00> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b74617c0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7461490> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b74626c0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b747c8c0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b747dfd0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fe6b747ee10> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b747f440> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b747e360> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b747fe00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b747f530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b74626f0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b7243cb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b726c800> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b726c560> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b726c830> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b726ca10> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7241e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b726e0f0> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b726cd70> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7462de0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b729a3f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b72b2570> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b72eb350> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7311af0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b72eb470> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b72b3200> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b70f0500> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b72b15b0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b726ef90> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fe6b70f07a0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_m588eiew/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b714a300> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b71211f0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7120350> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7123710> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b7175dc0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7175b50> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7175460> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7175940> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b714af90> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b7176a20> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b7176c00> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7177050> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6fd8e60> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b6fdaa80> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6fdb440> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6fdc320> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6fdf050> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b6fdf1d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6fdd340> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6fe2f60> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6fe1a30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6fe1790> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6fe3e90> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6fdd7c0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b702b170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b702b260> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b702ce90> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b702cc50> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b702f380> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b702d520> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7036b40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b702f4d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b7037860> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b7037bc0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b7037c50> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b702b590> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b703b470> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b703c650> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7039c10> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b703afc0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b7039820> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b70c4830> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b70c5640> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b703f0e0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b70c5400> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b70c6450> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b6ed2000> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b6ed28a0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b70c72f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe6b6ed1700> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6ed29c0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6f62c90> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6edc920> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6edaab0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe6b6eda900> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy 
reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # 
cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy 
ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # 
destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
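The block above is the raw output of the wrapped stat module: its JSON result ({"changed": false, "stat": {"exists": false}}) followed by what looks like the Python interpreter's verbose import/shutdown trace, which is what triggers the "junk after the JSON data" warning reproduced next; the task result itself is unaffected. Judging from the module_args echoed in that output (path: /run/ostree-booted) and the registered variable name that appears later in the trace (__ostree_booted_stat), the task being executed was presumably something like the sketch below. The real task file is not included in this log, so treat this as an illustration only.

# Hypothetical reconstruction of the "Check if system is ostree" task (not the actual source)
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted        # path taken from the module_args echoed above
  register: __ostree_booted_stat    # variable name taken from the trace further below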
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # 
cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy 
re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 44071 1727204590.70302: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204589.9654627-44248-220861030854526/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204590.70314: _low_level_execute_command(): starting 44071 1727204590.70317: 
_low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204589.9654627-44248-220861030854526/ > /dev/null 2>&1 && sleep 0' 44071 1727204590.70971: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204590.71091: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204590.71095: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204590.71098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204590.71189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204590.73178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204590.73256: stderr chunk (state=3): >>><<< 44071 1727204590.73270: stdout chunk (state=3): >>><<< 44071 1727204590.73484: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204590.73488: handler run complete 44071 1727204590.73490: attempt loop complete, returning result 44071 1727204590.73493: _execute() done 44071 1727204590.73495: dumping result to json 44071 1727204590.73497: done dumping result, returning 44071 1727204590.73499: done running TaskExecutor() for managed-node2/TASK: Check if system is ostree [127b8e07-fff9-c964-7471-00000000002e] 44071 1727204590.73502: sending task result for task 127b8e07-fff9-c964-7471-00000000002e 44071 1727204590.73575: done sending task result for 
task 127b8e07-fff9-c964-7471-00000000002e 44071 1727204590.73578: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 44071 1727204590.73657: no more pending results, returning what we have 44071 1727204590.73661: results queue empty 44071 1727204590.73662: checking for any_errors_fatal 44071 1727204590.73672: done checking for any_errors_fatal 44071 1727204590.73673: checking for max_fail_percentage 44071 1727204590.73675: done checking for max_fail_percentage 44071 1727204590.73676: checking to see if all hosts have failed and the running result is not ok 44071 1727204590.73677: done checking to see if all hosts have failed 44071 1727204590.73677: getting the remaining hosts for this loop 44071 1727204590.73679: done getting the remaining hosts for this loop 44071 1727204590.73685: getting the next task for host managed-node2 44071 1727204590.73694: done getting next task for host managed-node2 44071 1727204590.73697: ^ task is: TASK: Set flag to indicate system is ostree 44071 1727204590.73699: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204590.73703: getting variables 44071 1727204590.73705: in VariableManager get_vars() 44071 1727204590.73737: Calling all_inventory to load vars for managed-node2 44071 1727204590.73740: Calling groups_inventory to load vars for managed-node2 44071 1727204590.73744: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204590.73756: Calling all_plugins_play to load vars for managed-node2 44071 1727204590.73760: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204590.73763: Calling groups_plugins_play to load vars for managed-node2 44071 1727204590.74368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204590.74762: done with get_vars() 44071 1727204590.74786: done getting variables 44071 1727204590.74927: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 15:03:10 -0400 (0:00:00.901) 0:00:03.066 ***** 44071 1727204590.74959: entering _queue_task() for managed-node2/set_fact 44071 1727204590.74966: Creating lock for set_fact 44071 1727204590.75335: worker is 1 (out of 1 available) 44071 1727204590.75349: exiting _queue_task() for managed-node2/set_fact 44071 1727204590.75363: done queuing things up, now waiting for results queue to drain 44071 1727204590.75367: waiting for pending results... 
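The ok result above is what presumably gets registered as __ostree_booted_stat and consumed by the next task. A minimal, purely illustrative way to inspect it would be a debug task like the one below; only the variable name comes from this log, the task itself is not part of the traced playbook.

- name: Inspect the registered stat result    # illustrative only, not in the traced playbook
  ansible.builtin.debug:
    var: __ostree_booted_stat.stat.exists     # false for this host, per the result above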
44071 1727204590.75742: running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree 44071 1727204590.75753: in run() - task 127b8e07-fff9-c964-7471-00000000002f 44071 1727204590.75769: variable 'ansible_search_path' from source: unknown 44071 1727204590.75772: variable 'ansible_search_path' from source: unknown 44071 1727204590.75809: calling self._execute() 44071 1727204590.75888: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204590.75895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204590.75905: variable 'omit' from source: magic vars 44071 1727204590.76497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204590.76758: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204590.76806: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204590.76844: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204590.76879: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204590.76971: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204590.76998: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204590.77025: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204590.77055: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204590.77188: Evaluated conditional (not __network_is_ostree is defined): True 44071 1727204590.77255: variable 'omit' from source: magic vars 44071 1727204590.77258: variable 'omit' from source: magic vars 44071 1727204590.77373: variable '__ostree_booted_stat' from source: set_fact 44071 1727204590.77424: variable 'omit' from source: magic vars 44071 1727204590.77454: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204590.77486: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204590.77506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204590.77524: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204590.77536: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204590.77572: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204590.77577: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204590.77580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204590.77687: Set connection var ansible_connection to ssh 44071 
1727204590.77695: Set connection var ansible_timeout to 10 44071 1727204590.77698: Set connection var ansible_pipelining to False 44071 1727204590.77707: Set connection var ansible_shell_type to sh 44071 1727204590.77710: Set connection var ansible_shell_executable to /bin/sh 44071 1727204590.77719: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204590.77747: variable 'ansible_shell_executable' from source: unknown 44071 1727204590.77751: variable 'ansible_connection' from source: unknown 44071 1727204590.77754: variable 'ansible_module_compression' from source: unknown 44071 1727204590.77756: variable 'ansible_shell_type' from source: unknown 44071 1727204590.77759: variable 'ansible_shell_executable' from source: unknown 44071 1727204590.77761: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204590.77764: variable 'ansible_pipelining' from source: unknown 44071 1727204590.77769: variable 'ansible_timeout' from source: unknown 44071 1727204590.77774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204590.77893: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204590.77900: variable 'omit' from source: magic vars 44071 1727204590.77903: starting attempt loop 44071 1727204590.77905: running the handler 44071 1727204590.77923: handler run complete 44071 1727204590.77929: attempt loop complete, returning result 44071 1727204590.77932: _execute() done 44071 1727204590.77934: dumping result to json 44071 1727204590.77985: done dumping result, returning 44071 1727204590.77989: done running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree [127b8e07-fff9-c964-7471-00000000002f] 44071 1727204590.77991: sending task result for task 127b8e07-fff9-c964-7471-00000000002f 44071 1727204590.78311: done sending task result for task 127b8e07-fff9-c964-7471-00000000002f 44071 1727204590.78315: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 44071 1727204590.78415: no more pending results, returning what we have 44071 1727204590.78419: results queue empty 44071 1727204590.78420: checking for any_errors_fatal 44071 1727204590.78427: done checking for any_errors_fatal 44071 1727204590.78428: checking for max_fail_percentage 44071 1727204590.78429: done checking for max_fail_percentage 44071 1727204590.78430: checking to see if all hosts have failed and the running result is not ok 44071 1727204590.78431: done checking to see if all hosts have failed 44071 1727204590.78432: getting the remaining hosts for this loop 44071 1727204590.78433: done getting the remaining hosts for this loop 44071 1727204590.78437: getting the next task for host managed-node2 44071 1727204590.78444: done getting next task for host managed-node2 44071 1727204590.78446: ^ task is: TASK: Fix CentOS6 Base repo 44071 1727204590.78449: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204590.78452: getting variables 44071 1727204590.78453: in VariableManager get_vars() 44071 1727204590.78538: Calling all_inventory to load vars for managed-node2 44071 1727204590.78542: Calling groups_inventory to load vars for managed-node2 44071 1727204590.78546: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204590.78556: Calling all_plugins_play to load vars for managed-node2 44071 1727204590.78559: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204590.78570: Calling groups_plugins_play to load vars for managed-node2 44071 1727204590.78809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204590.79050: done with get_vars() 44071 1727204590.79062: done getting variables 44071 1727204590.79198: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 15:03:10 -0400 (0:00:00.042) 0:00:03.108 ***** 44071 1727204590.79229: entering _queue_task() for managed-node2/copy 44071 1727204590.79559: worker is 1 (out of 1 available) 44071 1727204590.79772: exiting _queue_task() for managed-node2/copy 44071 1727204590.79784: done queuing things up, now waiting for results queue to drain 44071 1727204590.79785: waiting for pending results... 
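The trace above shows the "Set flag to indicate system is ostree" task (el_repo_setup.yml:22): its condition not __network_is_ostree is defined evaluated True, it read __ostree_booted_stat, and it set __network_is_ostree to false. A sketch consistent with that behaviour is shown below; the exact expression is an assumption, since the task file itself is not part of this log.

# Hypothetical reconstruction of el_repo_setup.yml:22
- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined    # the conditional reported as True above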
44071 1727204590.79913: running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo 44071 1727204590.79971: in run() - task 127b8e07-fff9-c964-7471-000000000031 44071 1727204590.79992: variable 'ansible_search_path' from source: unknown 44071 1727204590.79999: variable 'ansible_search_path' from source: unknown 44071 1727204590.80050: calling self._execute() 44071 1727204590.80139: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204590.80152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204590.80168: variable 'omit' from source: magic vars 44071 1727204590.80690: variable 'ansible_distribution' from source: facts 44071 1727204590.80770: Evaluated conditional (ansible_distribution == 'CentOS'): False 44071 1727204590.80774: when evaluation is False, skipping this task 44071 1727204590.80778: _execute() done 44071 1727204590.80780: dumping result to json 44071 1727204590.80782: done dumping result, returning 44071 1727204590.80784: done running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo [127b8e07-fff9-c964-7471-000000000031] 44071 1727204590.80786: sending task result for task 127b8e07-fff9-c964-7471-000000000031 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 44071 1727204590.80949: no more pending results, returning what we have 44071 1727204590.80953: results queue empty 44071 1727204590.80954: checking for any_errors_fatal 44071 1727204590.80960: done checking for any_errors_fatal 44071 1727204590.80961: checking for max_fail_percentage 44071 1727204590.80962: done checking for max_fail_percentage 44071 1727204590.80963: checking to see if all hosts have failed and the running result is not ok 44071 1727204590.80964: done checking to see if all hosts have failed 44071 1727204590.80964: getting the remaining hosts for this loop 44071 1727204590.80968: done getting the remaining hosts for this loop 44071 1727204590.80973: getting the next task for host managed-node2 44071 1727204590.80980: done getting next task for host managed-node2 44071 1727204590.80983: ^ task is: TASK: Include the task 'enable_epel.yml' 44071 1727204590.80987: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204590.80991: getting variables 44071 1727204590.80992: in VariableManager get_vars() 44071 1727204590.81024: Calling all_inventory to load vars for managed-node2 44071 1727204590.81028: Calling groups_inventory to load vars for managed-node2 44071 1727204590.81032: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204590.81048: Calling all_plugins_play to load vars for managed-node2 44071 1727204590.81052: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204590.81055: Calling groups_plugins_play to load vars for managed-node2 44071 1727204590.81514: done sending task result for task 127b8e07-fff9-c964-7471-000000000031 44071 1727204590.81518: WORKER PROCESS EXITING 44071 1727204590.81546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204590.81820: done with get_vars() 44071 1727204590.81832: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 15:03:10 -0400 (0:00:00.027) 0:00:03.135 ***** 44071 1727204590.81939: entering _queue_task() for managed-node2/include_tasks 44071 1727204590.82258: worker is 1 (out of 1 available) 44071 1727204590.82276: exiting _queue_task() for managed-node2/include_tasks 44071 1727204590.82291: done queuing things up, now waiting for results queue to drain 44071 1727204590.82293: waiting for pending results... 44071 1727204590.82688: running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' 44071 1727204590.82695: in run() - task 127b8e07-fff9-c964-7471-000000000032 44071 1727204590.82698: variable 'ansible_search_path' from source: unknown 44071 1727204590.82705: variable 'ansible_search_path' from source: unknown 44071 1727204590.82753: calling self._execute() 44071 1727204590.82852: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204590.82868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204590.82883: variable 'omit' from source: magic vars 44071 1727204590.83442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204590.86161: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204590.86256: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204590.86306: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204590.86372: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204590.86386: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204590.86485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204590.86594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204590.86597: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204590.86609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204590.86629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204590.86764: variable '__network_is_ostree' from source: set_fact 44071 1727204590.86792: Evaluated conditional (not __network_is_ostree | d(false)): True 44071 1727204590.86806: _execute() done 44071 1727204590.86816: dumping result to json 44071 1727204590.86823: done dumping result, returning 44071 1727204590.86870: done running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' [127b8e07-fff9-c964-7471-000000000032] 44071 1727204590.86873: sending task result for task 127b8e07-fff9-c964-7471-000000000032 44071 1727204590.87111: done sending task result for task 127b8e07-fff9-c964-7471-000000000032 44071 1727204590.87114: WORKER PROCESS EXITING 44071 1727204590.87145: no more pending results, returning what we have 44071 1727204590.87151: in VariableManager get_vars() 44071 1727204590.87188: Calling all_inventory to load vars for managed-node2 44071 1727204590.87191: Calling groups_inventory to load vars for managed-node2 44071 1727204590.87195: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204590.87207: Calling all_plugins_play to load vars for managed-node2 44071 1727204590.87210: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204590.87214: Calling groups_plugins_play to load vars for managed-node2 44071 1727204590.87607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204590.87853: done with get_vars() 44071 1727204590.87866: variable 'ansible_search_path' from source: unknown 44071 1727204590.87868: variable 'ansible_search_path' from source: unknown 44071 1727204590.87911: we have included files to process 44071 1727204590.87912: generating all_blocks data 44071 1727204590.87914: done generating all_blocks data 44071 1727204590.87920: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 44071 1727204590.87922: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 44071 1727204590.87924: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 44071 1727204590.88906: done processing included file 44071 1727204590.88909: iterating over new_blocks loaded from include file 44071 1727204590.88910: in VariableManager get_vars() 44071 1727204590.88924: done with get_vars() 44071 1727204590.88926: filtering new block on tags 44071 1727204590.88953: done filtering new block on tags 44071 1727204590.88956: in VariableManager get_vars() 44071 1727204590.88970: done with get_vars() 44071 1727204590.88972: filtering new block on tags 44071 1727204590.88985: done filtering new block on tags 44071 1727204590.88987: done iterating over new_blocks loaded from include file included: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node2 44071 1727204590.88993: extending task lists for all hosts with included blocks 44071 1727204590.89110: done extending task lists 44071 1727204590.89111: done processing included files 44071 1727204590.89112: results queue empty 44071 1727204590.89113: checking for any_errors_fatal 44071 1727204590.89116: done checking for any_errors_fatal 44071 1727204590.89117: checking for max_fail_percentage 44071 1727204590.89118: done checking for max_fail_percentage 44071 1727204590.89119: checking to see if all hosts have failed and the running result is not ok 44071 1727204590.89120: done checking to see if all hosts have failed 44071 1727204590.89121: getting the remaining hosts for this loop 44071 1727204590.89122: done getting the remaining hosts for this loop 44071 1727204590.89124: getting the next task for host managed-node2 44071 1727204590.89129: done getting next task for host managed-node2 44071 1727204590.89131: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 44071 1727204590.89136: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204590.89139: getting variables 44071 1727204590.89140: in VariableManager get_vars() 44071 1727204590.89149: Calling all_inventory to load vars for managed-node2 44071 1727204590.89151: Calling groups_inventory to load vars for managed-node2 44071 1727204590.89154: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204590.89160: Calling all_plugins_play to load vars for managed-node2 44071 1727204590.89186: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204590.89190: Calling groups_plugins_play to load vars for managed-node2 44071 1727204590.89399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204590.89632: done with get_vars() 44071 1727204590.89645: done getting variables 44071 1727204590.89717: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 44071 1727204590.89921: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 40] ********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 15:03:10 -0400 (0:00:00.080) 0:00:03.216 ***** 44071 1727204590.89979: entering _queue_task() for managed-node2/command 44071 1727204590.89981: Creating lock for command 44071 1727204590.90353: worker is 1 (out of 1 available) 44071 1727204590.90474: exiting _queue_task() for managed-node2/command 44071 1727204590.90487: done queuing things up, now waiting for results queue to drain 44071 1727204590.90489: waiting for pending results... 
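The include above is gated on the fact set earlier: the log reports "Evaluated conditional (not __network_is_ostree | d(false)): True", then loads enable_epel.yml and extends the task list for the host. A sketch of the including task, using only the file name and condition visible in the log (any additional conditions the real task may carry are not shown here):

```yaml
- name: Include the task 'enable_epel.yml'
  ansible.builtin.include_tasks: enable_epel.yml
  # Condition copied from the conditional evaluation logged above.
  when: not __network_is_ostree | d(false)
```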
44071 1727204590.90686: running TaskExecutor() for managed-node2/TASK: Create EPEL 40 44071 1727204590.90823: in run() - task 127b8e07-fff9-c964-7471-00000000004c 44071 1727204590.90849: variable 'ansible_search_path' from source: unknown 44071 1727204590.90857: variable 'ansible_search_path' from source: unknown 44071 1727204590.90913: calling self._execute() 44071 1727204590.91007: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204590.91171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204590.91175: variable 'omit' from source: magic vars 44071 1727204590.91475: variable 'ansible_distribution' from source: facts 44071 1727204590.91495: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 44071 1727204590.91508: when evaluation is False, skipping this task 44071 1727204590.91516: _execute() done 44071 1727204590.91524: dumping result to json 44071 1727204590.91532: done dumping result, returning 44071 1727204590.91546: done running TaskExecutor() for managed-node2/TASK: Create EPEL 40 [127b8e07-fff9-c964-7471-00000000004c] 44071 1727204590.91555: sending task result for task 127b8e07-fff9-c964-7471-00000000004c 44071 1727204590.91874: done sending task result for task 127b8e07-fff9-c964-7471-00000000004c 44071 1727204590.91877: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 44071 1727204590.91921: no more pending results, returning what we have 44071 1727204590.91924: results queue empty 44071 1727204590.91925: checking for any_errors_fatal 44071 1727204590.91926: done checking for any_errors_fatal 44071 1727204590.91926: checking for max_fail_percentage 44071 1727204590.91928: done checking for max_fail_percentage 44071 1727204590.91928: checking to see if all hosts have failed and the running result is not ok 44071 1727204590.91929: done checking to see if all hosts have failed 44071 1727204590.91930: getting the remaining hosts for this loop 44071 1727204590.91931: done getting the remaining hosts for this loop 44071 1727204590.91937: getting the next task for host managed-node2 44071 1727204590.91943: done getting next task for host managed-node2 44071 1727204590.91945: ^ task is: TASK: Install yum-utils package 44071 1727204590.91949: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204590.91951: getting variables 44071 1727204590.91953: in VariableManager get_vars() 44071 1727204590.91982: Calling all_inventory to load vars for managed-node2 44071 1727204590.91985: Calling groups_inventory to load vars for managed-node2 44071 1727204590.91989: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204590.92001: Calling all_plugins_play to load vars for managed-node2 44071 1727204590.92004: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204590.92007: Calling groups_plugins_play to load vars for managed-node2 44071 1727204590.92309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204590.92729: done with get_vars() 44071 1727204590.92744: done getting variables 44071 1727204590.92857: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 15:03:10 -0400 (0:00:00.031) 0:00:03.247 ***** 44071 1727204590.93094: entering _queue_task() for managed-node2/package 44071 1727204590.93097: Creating lock for package 44071 1727204590.93753: worker is 1 (out of 1 available) 44071 1727204590.93840: exiting _queue_task() for managed-node2/package 44071 1727204590.93855: done queuing things up, now waiting for results queue to drain 44071 1727204590.93857: waiting for pending results... 
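'Create EPEL {{ ansible_distribution_major_version }}' illustrates that the task name is templated (here to 'Create EPEL 40') before the when clause is evaluated and the task is skipped. A sketch of that shape, with a placeholder command body since the real command never runs on this host:

```yaml
- name: Create EPEL {{ ansible_distribution_major_version }}
  ansible.builtin.command:
    # Placeholder only: the task is skipped in this run, so the actual
    # command line is not visible anywhere in the log.
    cmd: /bin/true
  when: ansible_distribution in ['RedHat', 'CentOS']
```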
44071 1727204590.94020: running TaskExecutor() for managed-node2/TASK: Install yum-utils package 44071 1727204590.94167: in run() - task 127b8e07-fff9-c964-7471-00000000004d 44071 1727204590.94191: variable 'ansible_search_path' from source: unknown 44071 1727204590.94199: variable 'ansible_search_path' from source: unknown 44071 1727204590.94245: calling self._execute() 44071 1727204590.94335: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204590.94348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204590.94362: variable 'omit' from source: magic vars 44071 1727204590.94797: variable 'ansible_distribution' from source: facts 44071 1727204590.94819: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 44071 1727204590.94827: when evaluation is False, skipping this task 44071 1727204590.94836: _execute() done 44071 1727204590.94971: dumping result to json 44071 1727204590.94975: done dumping result, returning 44071 1727204590.94977: done running TaskExecutor() for managed-node2/TASK: Install yum-utils package [127b8e07-fff9-c964-7471-00000000004d] 44071 1727204590.94979: sending task result for task 127b8e07-fff9-c964-7471-00000000004d 44071 1727204590.95052: done sending task result for task 127b8e07-fff9-c964-7471-00000000004d 44071 1727204590.95055: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 44071 1727204590.95101: no more pending results, returning what we have 44071 1727204590.95103: results queue empty 44071 1727204590.95104: checking for any_errors_fatal 44071 1727204590.95110: done checking for any_errors_fatal 44071 1727204590.95111: checking for max_fail_percentage 44071 1727204590.95112: done checking for max_fail_percentage 44071 1727204590.95113: checking to see if all hosts have failed and the running result is not ok 44071 1727204590.95114: done checking to see if all hosts have failed 44071 1727204590.95114: getting the remaining hosts for this loop 44071 1727204590.95116: done getting the remaining hosts for this loop 44071 1727204590.95119: getting the next task for host managed-node2 44071 1727204590.95125: done getting next task for host managed-node2 44071 1727204590.95127: ^ task is: TASK: Enable EPEL 7 44071 1727204590.95131: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204590.95134: getting variables 44071 1727204590.95135: in VariableManager get_vars() 44071 1727204590.95162: Calling all_inventory to load vars for managed-node2 44071 1727204590.95181: Calling groups_inventory to load vars for managed-node2 44071 1727204590.95185: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204590.95196: Calling all_plugins_play to load vars for managed-node2 44071 1727204590.95199: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204590.95202: Calling groups_plugins_play to load vars for managed-node2 44071 1727204590.95408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204590.95655: done with get_vars() 44071 1727204590.95669: done getting variables 44071 1727204590.95731: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 15:03:10 -0400 (0:00:00.026) 0:00:03.274 ***** 44071 1727204590.95768: entering _queue_task() for managed-node2/command 44071 1727204590.96430: worker is 1 (out of 1 available) 44071 1727204590.96445: exiting _queue_task() for managed-node2/command 44071 1727204590.96457: done queuing things up, now waiting for results queue to drain 44071 1727204590.96459: waiting for pending results... 44071 1727204590.97086: running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 44071 1727204590.97156: in run() - task 127b8e07-fff9-c964-7471-00000000004e 44071 1727204590.97169: variable 'ansible_search_path' from source: unknown 44071 1727204590.97371: variable 'ansible_search_path' from source: unknown 44071 1727204590.97374: calling self._execute() 44071 1727204590.97378: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204590.97381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204590.97383: variable 'omit' from source: magic vars 44071 1727204590.97757: variable 'ansible_distribution' from source: facts 44071 1727204590.97773: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 44071 1727204590.97776: when evaluation is False, skipping this task 44071 1727204590.97779: _execute() done 44071 1727204590.97782: dumping result to json 44071 1727204590.97785: done dumping result, returning 44071 1727204590.97794: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 [127b8e07-fff9-c964-7471-00000000004e] 44071 1727204590.97796: sending task result for task 127b8e07-fff9-c964-7471-00000000004e skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 44071 1727204590.97953: no more pending results, returning what we have 44071 1727204590.97957: results queue empty 44071 1727204590.97958: checking for any_errors_fatal 44071 1727204590.97964: done checking for any_errors_fatal 44071 1727204590.97967: checking for max_fail_percentage 44071 1727204590.97969: done checking for max_fail_percentage 44071 1727204590.97969: checking to see if all hosts have 
failed and the running result is not ok 44071 1727204590.97970: done checking to see if all hosts have failed 44071 1727204590.97971: getting the remaining hosts for this loop 44071 1727204590.97973: done getting the remaining hosts for this loop 44071 1727204590.97978: getting the next task for host managed-node2 44071 1727204590.97985: done getting next task for host managed-node2 44071 1727204590.97988: ^ task is: TASK: Enable EPEL 8 44071 1727204590.97993: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204590.97998: getting variables 44071 1727204590.98000: in VariableManager get_vars() 44071 1727204590.98036: Calling all_inventory to load vars for managed-node2 44071 1727204590.98039: Calling groups_inventory to load vars for managed-node2 44071 1727204590.98043: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204590.98058: Calling all_plugins_play to load vars for managed-node2 44071 1727204590.98062: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204590.98170: Calling groups_plugins_play to load vars for managed-node2 44071 1727204590.98557: done sending task result for task 127b8e07-fff9-c964-7471-00000000004e 44071 1727204590.98562: WORKER PROCESS EXITING 44071 1727204590.98583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204590.99028: done with get_vars() 44071 1727204590.99042: done getting variables 44071 1727204590.99106: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 15:03:10 -0400 (0:00:00.033) 0:00:03.307 ***** 44071 1727204590.99141: entering _queue_task() for managed-node2/command 44071 1727204590.99899: worker is 1 (out of 1 available) 44071 1727204590.99910: exiting _queue_task() for managed-node2/command 44071 1727204590.99921: done queuing things up, now waiting for results queue to drain 44071 1727204590.99923: waiting for pending results... 
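The 'Install yum-utils package' task is skipped for the same reason as the EPEL tasks: the managed node is neither RedHat nor CentOS. A sketch matching the action plugin (package) and the condition reported above; `state: present` is an assumption:

```yaml
- name: Install yum-utils package
  ansible.builtin.package:
    name: yum-utils
    state: present   # assumed; only the package name and the when clause are visible in the log
  when: ansible_distribution in ['RedHat', 'CentOS']
```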
44071 1727204591.00208: running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 44071 1727204591.00213: in run() - task 127b8e07-fff9-c964-7471-00000000004f 44071 1727204591.00225: variable 'ansible_search_path' from source: unknown 44071 1727204591.00229: variable 'ansible_search_path' from source: unknown 44071 1727204591.00280: calling self._execute() 44071 1727204591.00359: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204591.00373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204591.00385: variable 'omit' from source: magic vars 44071 1727204591.00845: variable 'ansible_distribution' from source: facts 44071 1727204591.00858: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 44071 1727204591.00862: when evaluation is False, skipping this task 44071 1727204591.00867: _execute() done 44071 1727204591.00870: dumping result to json 44071 1727204591.00872: done dumping result, returning 44071 1727204591.00881: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 [127b8e07-fff9-c964-7471-00000000004f] 44071 1727204591.00886: sending task result for task 127b8e07-fff9-c964-7471-00000000004f 44071 1727204591.00998: done sending task result for task 127b8e07-fff9-c964-7471-00000000004f 44071 1727204591.01001: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 44071 1727204591.01074: no more pending results, returning what we have 44071 1727204591.01079: results queue empty 44071 1727204591.01080: checking for any_errors_fatal 44071 1727204591.01085: done checking for any_errors_fatal 44071 1727204591.01086: checking for max_fail_percentage 44071 1727204591.01088: done checking for max_fail_percentage 44071 1727204591.01089: checking to see if all hosts have failed and the running result is not ok 44071 1727204591.01089: done checking to see if all hosts have failed 44071 1727204591.01090: getting the remaining hosts for this loop 44071 1727204591.01092: done getting the remaining hosts for this loop 44071 1727204591.01098: getting the next task for host managed-node2 44071 1727204591.01108: done getting next task for host managed-node2 44071 1727204591.01111: ^ task is: TASK: Enable EPEL 6 44071 1727204591.01116: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204591.01121: getting variables 44071 1727204591.01123: in VariableManager get_vars() 44071 1727204591.01163: Calling all_inventory to load vars for managed-node2 44071 1727204591.01284: Calling groups_inventory to load vars for managed-node2 44071 1727204591.01288: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204591.01300: Calling all_plugins_play to load vars for managed-node2 44071 1727204591.01303: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204591.01306: Calling groups_plugins_play to load vars for managed-node2 44071 1727204591.01622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204591.01874: done with get_vars() 44071 1727204591.01887: done getting variables 44071 1727204591.01960: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 15:03:11 -0400 (0:00:00.028) 0:00:03.336 ***** 44071 1727204591.01995: entering _queue_task() for managed-node2/copy 44071 1727204591.02332: worker is 1 (out of 1 available) 44071 1727204591.02349: exiting _queue_task() for managed-node2/copy 44071 1727204591.02363: done queuing things up, now waiting for results queue to drain 44071 1727204591.02469: waiting for pending results... 44071 1727204591.02784: running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 44071 1727204591.02790: in run() - task 127b8e07-fff9-c964-7471-000000000051 44071 1727204591.02793: variable 'ansible_search_path' from source: unknown 44071 1727204591.02795: variable 'ansible_search_path' from source: unknown 44071 1727204591.02971: calling self._execute() 44071 1727204591.02975: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204591.02979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204591.02982: variable 'omit' from source: magic vars 44071 1727204591.03473: variable 'ansible_distribution' from source: facts 44071 1727204591.03486: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 44071 1727204591.03490: when evaluation is False, skipping this task 44071 1727204591.03493: _execute() done 44071 1727204591.03496: dumping result to json 44071 1727204591.03499: done dumping result, returning 44071 1727204591.03507: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 [127b8e07-fff9-c964-7471-000000000051] 44071 1727204591.03512: sending task result for task 127b8e07-fff9-c964-7471-000000000051 44071 1727204591.03620: done sending task result for task 127b8e07-fff9-c964-7471-000000000051 44071 1727204591.03624: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 44071 1727204591.03681: no more pending results, returning what we have 44071 1727204591.03685: results queue empty 44071 1727204591.03686: checking for any_errors_fatal 44071 1727204591.03693: done checking for any_errors_fatal 44071 1727204591.03694: checking for 
max_fail_percentage 44071 1727204591.03695: done checking for max_fail_percentage 44071 1727204591.03696: checking to see if all hosts have failed and the running result is not ok 44071 1727204591.03697: done checking to see if all hosts have failed 44071 1727204591.03697: getting the remaining hosts for this loop 44071 1727204591.03699: done getting the remaining hosts for this loop 44071 1727204591.03704: getting the next task for host managed-node2 44071 1727204591.03713: done getting next task for host managed-node2 44071 1727204591.03716: ^ task is: TASK: Set network provider to 'nm' 44071 1727204591.03719: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204591.03724: getting variables 44071 1727204591.03726: in VariableManager get_vars() 44071 1727204591.03762: Calling all_inventory to load vars for managed-node2 44071 1727204591.03767: Calling groups_inventory to load vars for managed-node2 44071 1727204591.03771: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204591.03787: Calling all_plugins_play to load vars for managed-node2 44071 1727204591.03790: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204591.03793: Calling groups_plugins_play to load vars for managed-node2 44071 1727204591.04282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204591.04536: done with get_vars() 44071 1727204591.04549: done getting variables 44071 1727204591.04615: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:13 Tuesday 24 September 2024 15:03:11 -0400 (0:00:00.026) 0:00:03.362 ***** 44071 1727204591.04653: entering _queue_task() for managed-node2/set_fact 44071 1727204591.05094: worker is 1 (out of 1 available) 44071 1727204591.05107: exiting _queue_task() for managed-node2/set_fact 44071 1727204591.05121: done queuing things up, now waiting for results queue to drain 44071 1727204591.05122: waiting for pending results... 
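The task just queued comes from tests_states_nm.yml:13 and, as its result a little further down shows, simply sets network_provider to "nm" via set_fact. A minimal sketch:

```yaml
- name: Set network provider to 'nm'
  ansible.builtin.set_fact:
    network_provider: "nm"
```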
44071 1727204591.05485: running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' 44071 1727204591.05492: in run() - task 127b8e07-fff9-c964-7471-000000000007 44071 1727204591.05495: variable 'ansible_search_path' from source: unknown 44071 1727204591.05498: calling self._execute() 44071 1727204591.05772: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204591.05776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204591.05779: variable 'omit' from source: magic vars 44071 1727204591.05781: variable 'omit' from source: magic vars 44071 1727204591.05784: variable 'omit' from source: magic vars 44071 1727204591.05786: variable 'omit' from source: magic vars 44071 1727204591.05973: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204591.05977: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204591.05979: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204591.05982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204591.05985: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204591.05988: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204591.05990: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204591.05993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204591.06058: Set connection var ansible_connection to ssh 44071 1727204591.06065: Set connection var ansible_timeout to 10 44071 1727204591.06072: Set connection var ansible_pipelining to False 44071 1727204591.06079: Set connection var ansible_shell_type to sh 44071 1727204591.06085: Set connection var ansible_shell_executable to /bin/sh 44071 1727204591.06093: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204591.06117: variable 'ansible_shell_executable' from source: unknown 44071 1727204591.06121: variable 'ansible_connection' from source: unknown 44071 1727204591.06124: variable 'ansible_module_compression' from source: unknown 44071 1727204591.06127: variable 'ansible_shell_type' from source: unknown 44071 1727204591.06129: variable 'ansible_shell_executable' from source: unknown 44071 1727204591.06132: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204591.06139: variable 'ansible_pipelining' from source: unknown 44071 1727204591.06142: variable 'ansible_timeout' from source: unknown 44071 1727204591.06147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204591.06473: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204591.06477: variable 'omit' from source: magic vars 44071 1727204591.06479: starting attempt loop 44071 1727204591.06481: running the handler 44071 1727204591.06484: handler run complete 44071 1727204591.06485: attempt loop complete, returning result 44071 1727204591.06487: _execute() done 44071 1727204591.06493: 
dumping result to json 44071 1727204591.06495: done dumping result, returning 44071 1727204591.06497: done running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' [127b8e07-fff9-c964-7471-000000000007] 44071 1727204591.06499: sending task result for task 127b8e07-fff9-c964-7471-000000000007 44071 1727204591.06567: done sending task result for task 127b8e07-fff9-c964-7471-000000000007 44071 1727204591.06570: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 44071 1727204591.06654: no more pending results, returning what we have 44071 1727204591.06657: results queue empty 44071 1727204591.06658: checking for any_errors_fatal 44071 1727204591.06665: done checking for any_errors_fatal 44071 1727204591.06668: checking for max_fail_percentage 44071 1727204591.06669: done checking for max_fail_percentage 44071 1727204591.06670: checking to see if all hosts have failed and the running result is not ok 44071 1727204591.06671: done checking to see if all hosts have failed 44071 1727204591.06672: getting the remaining hosts for this loop 44071 1727204591.06673: done getting the remaining hosts for this loop 44071 1727204591.06678: getting the next task for host managed-node2 44071 1727204591.06684: done getting next task for host managed-node2 44071 1727204591.06687: ^ task is: TASK: meta (flush_handlers) 44071 1727204591.06689: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204591.06693: getting variables 44071 1727204591.06694: in VariableManager get_vars() 44071 1727204591.06730: Calling all_inventory to load vars for managed-node2 44071 1727204591.06735: Calling groups_inventory to load vars for managed-node2 44071 1727204591.06739: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204591.06751: Calling all_plugins_play to load vars for managed-node2 44071 1727204591.06754: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204591.06757: Calling groups_plugins_play to load vars for managed-node2 44071 1727204591.07073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204591.07353: done with get_vars() 44071 1727204591.07367: done getting variables 44071 1727204591.07442: in VariableManager get_vars() 44071 1727204591.07453: Calling all_inventory to load vars for managed-node2 44071 1727204591.07460: Calling groups_inventory to load vars for managed-node2 44071 1727204591.07463: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204591.07470: Calling all_plugins_play to load vars for managed-node2 44071 1727204591.07473: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204591.07476: Calling groups_plugins_play to load vars for managed-node2 44071 1727204591.07652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204591.07881: done with get_vars() 44071 1727204591.07902: done queuing things up, now waiting for results queue to drain 44071 1727204591.07904: results queue empty 44071 1727204591.07905: checking for any_errors_fatal 44071 1727204591.07908: done checking for any_errors_fatal 44071 1727204591.07909: checking for 
max_fail_percentage 44071 1727204591.07910: done checking for max_fail_percentage 44071 1727204591.07911: checking to see if all hosts have failed and the running result is not ok 44071 1727204591.07912: done checking to see if all hosts have failed 44071 1727204591.07912: getting the remaining hosts for this loop 44071 1727204591.07913: done getting the remaining hosts for this loop 44071 1727204591.07916: getting the next task for host managed-node2 44071 1727204591.07920: done getting next task for host managed-node2 44071 1727204591.07922: ^ task is: TASK: meta (flush_handlers) 44071 1727204591.07924: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204591.07932: getting variables 44071 1727204591.07936: in VariableManager get_vars() 44071 1727204591.07945: Calling all_inventory to load vars for managed-node2 44071 1727204591.07947: Calling groups_inventory to load vars for managed-node2 44071 1727204591.07950: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204591.07955: Calling all_plugins_play to load vars for managed-node2 44071 1727204591.07957: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204591.07961: Calling groups_plugins_play to load vars for managed-node2 44071 1727204591.08142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204591.08394: done with get_vars() 44071 1727204591.08403: done getting variables 44071 1727204591.08468: in VariableManager get_vars() 44071 1727204591.08478: Calling all_inventory to load vars for managed-node2 44071 1727204591.08480: Calling groups_inventory to load vars for managed-node2 44071 1727204591.08483: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204591.08487: Calling all_plugins_play to load vars for managed-node2 44071 1727204591.08490: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204591.08493: Calling groups_plugins_play to load vars for managed-node2 44071 1727204591.08669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204591.08909: done with get_vars() 44071 1727204591.08922: done queuing things up, now waiting for results queue to drain 44071 1727204591.08924: results queue empty 44071 1727204591.08925: checking for any_errors_fatal 44071 1727204591.08926: done checking for any_errors_fatal 44071 1727204591.08927: checking for max_fail_percentage 44071 1727204591.08928: done checking for max_fail_percentage 44071 1727204591.08928: checking to see if all hosts have failed and the running result is not ok 44071 1727204591.08929: done checking to see if all hosts have failed 44071 1727204591.08930: getting the remaining hosts for this loop 44071 1727204591.08931: done getting the remaining hosts for this loop 44071 1727204591.08936: getting the next task for host managed-node2 44071 1727204591.08940: done getting next task for host managed-node2 44071 1727204591.08941: ^ task is: None 44071 1727204591.08942: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 44071 1727204591.08944: done queuing things up, now waiting for results queue to drain 44071 1727204591.08945: results queue empty 44071 1727204591.08945: checking for any_errors_fatal 44071 1727204591.08946: done checking for any_errors_fatal 44071 1727204591.08947: checking for max_fail_percentage 44071 1727204591.08948: done checking for max_fail_percentage 44071 1727204591.08948: checking to see if all hosts have failed and the running result is not ok 44071 1727204591.08949: done checking to see if all hosts have failed 44071 1727204591.08951: getting the next task for host managed-node2 44071 1727204591.08953: done getting next task for host managed-node2 44071 1727204591.08954: ^ task is: None 44071 1727204591.08956: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204591.09014: in VariableManager get_vars() 44071 1727204591.09030: done with get_vars() 44071 1727204591.09040: in VariableManager get_vars() 44071 1727204591.09052: done with get_vars() 44071 1727204591.09057: variable 'omit' from source: magic vars 44071 1727204591.09094: in VariableManager get_vars() 44071 1727204591.09108: done with get_vars() 44071 1727204591.09135: variable 'omit' from source: magic vars PLAY [Play for testing states] ************************************************* 44071 1727204591.10146: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 44071 1727204591.10186: getting the remaining hosts for this loop 44071 1727204591.10188: done getting the remaining hosts for this loop 44071 1727204591.10191: getting the next task for host managed-node2 44071 1727204591.10194: done getting next task for host managed-node2 44071 1727204591.10197: ^ task is: TASK: Gathering Facts 44071 1727204591.10198: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204591.10201: getting variables 44071 1727204591.10202: in VariableManager get_vars() 44071 1727204591.10211: Calling all_inventory to load vars for managed-node2 44071 1727204591.10214: Calling groups_inventory to load vars for managed-node2 44071 1727204591.10217: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204591.10223: Calling all_plugins_play to load vars for managed-node2 44071 1727204591.10357: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204591.10363: Calling groups_plugins_play to load vars for managed-node2 44071 1727204591.10659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204591.11117: done with get_vars() 44071 1727204591.11129: done getting variables 44071 1727204591.11393: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:3 Tuesday 24 September 2024 15:03:11 -0400 (0:00:00.067) 0:00:03.430 ***** 44071 1727204591.11419: entering _queue_task() for managed-node2/gather_facts 44071 1727204591.11943: worker is 1 (out of 1 available) 44071 1727204591.11958: exiting _queue_task() for managed-node2/gather_facts 44071 1727204591.12173: done queuing things up, now waiting for results queue to drain 44071 1727204591.12176: waiting for pending results... 
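After the handlers are flushed, the strategy moves on to the next play, 'Play for testing states', whose first task is the implicit fact gathering queued above. A hypothetical reconstruction of the play header; only the play name and the fact gathering are taken from the log, while the hosts pattern and task body are placeholders:

```yaml
- name: Play for testing states
  hosts: all          # placeholder; the log only shows managed-node2 being processed
  gather_facts: true  # produces the "Gathering Facts" task traced below
  tasks:
    - name: Placeholder task body
      ansible.builtin.debug:
        msg: "network_provider is {{ network_provider | d('undefined') }}"
```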
44071 1727204591.12683: running TaskExecutor() for managed-node2/TASK: Gathering Facts 44071 1727204591.12688: in run() - task 127b8e07-fff9-c964-7471-000000000077 44071 1727204591.12691: variable 'ansible_search_path' from source: unknown 44071 1727204591.13073: calling self._execute() 44071 1727204591.13077: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204591.13080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204591.13083: variable 'omit' from source: magic vars 44071 1727204591.13789: variable 'ansible_distribution_major_version' from source: facts 44071 1727204591.13810: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204591.13822: variable 'omit' from source: magic vars 44071 1727204591.13853: variable 'omit' from source: magic vars 44071 1727204591.14271: variable 'omit' from source: magic vars 44071 1727204591.14275: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204591.14279: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204591.14282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204591.14284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204591.14286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204591.14671: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204591.14675: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204591.14678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204591.14680: Set connection var ansible_connection to ssh 44071 1727204591.14682: Set connection var ansible_timeout to 10 44071 1727204591.14685: Set connection var ansible_pipelining to False 44071 1727204591.14687: Set connection var ansible_shell_type to sh 44071 1727204591.14689: Set connection var ansible_shell_executable to /bin/sh 44071 1727204591.14691: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204591.14693: variable 'ansible_shell_executable' from source: unknown 44071 1727204591.14695: variable 'ansible_connection' from source: unknown 44071 1727204591.14697: variable 'ansible_module_compression' from source: unknown 44071 1727204591.14699: variable 'ansible_shell_type' from source: unknown 44071 1727204591.14701: variable 'ansible_shell_executable' from source: unknown 44071 1727204591.14703: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204591.14705: variable 'ansible_pipelining' from source: unknown 44071 1727204591.14973: variable 'ansible_timeout' from source: unknown 44071 1727204591.14976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204591.15194: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204591.15212: variable 'omit' from source: magic vars 44071 1727204591.15222: starting attempt loop 44071 1727204591.15229: running the 
handler 44071 1727204591.15250: variable 'ansible_facts' from source: unknown 44071 1727204591.15393: _low_level_execute_command(): starting 44071 1727204591.15405: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204591.16780: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204591.17083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204591.17102: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204591.17124: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204591.17231: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204591.19003: stdout chunk (state=3): >>>/root <<< 44071 1727204591.19306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204591.19325: stdout chunk (state=3): >>><<< 44071 1727204591.19339: stderr chunk (state=3): >>><<< 44071 1727204591.19390: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204591.19443: _low_level_execute_command(): starting 44071 1727204591.19456: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204591.194269-44367-22270293570921 `" && echo ansible-tmp-1727204591.194269-44367-22270293570921="` echo 
/root/.ansible/tmp/ansible-tmp-1727204591.194269-44367-22270293570921 `" ) && sleep 0' 44071 1727204591.21161: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204591.21223: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204591.21265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204591.21374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204591.23437: stdout chunk (state=3): >>>ansible-tmp-1727204591.194269-44367-22270293570921=/root/.ansible/tmp/ansible-tmp-1727204591.194269-44367-22270293570921 <<< 44071 1727204591.23625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204591.23629: stdout chunk (state=3): >>><<< 44071 1727204591.23635: stderr chunk (state=3): >>><<< 44071 1727204591.23658: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204591.194269-44367-22270293570921=/root/.ansible/tmp/ansible-tmp-1727204591.194269-44367-22270293570921 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204591.23707: variable 'ansible_module_compression' from source: unknown 44071 1727204591.23974: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 44071 1727204591.23997: variable 'ansible_facts' from source: unknown 44071 1727204591.24430: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204591.194269-44367-22270293570921/AnsiballZ_setup.py 44071 1727204591.24860: Sending initial data 44071 1727204591.24873: Sent initial data (152 bytes) 44071 1727204591.25978: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204591.25982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204591.25986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204591.26242: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204591.26489: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204591.26575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204591.28196: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204591.28323: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204591.28353: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpnegj2w69 /root/.ansible/tmp/ansible-tmp-1727204591.194269-44367-22270293570921/AnsiballZ_setup.py <<< 44071 1727204591.28356: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204591.194269-44367-22270293570921/AnsiballZ_setup.py" <<< 44071 1727204591.28402: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpnegj2w69" to remote "/root/.ansible/tmp/ansible-tmp-1727204591.194269-44367-22270293570921/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204591.194269-44367-22270293570921/AnsiballZ_setup.py" <<< 44071 1727204591.31776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204591.32003: stderr chunk (state=3): >>><<< 44071 1727204591.32008: stdout chunk (state=3): >>><<< 44071 1727204591.32010: done transferring module to remote 44071 1727204591.32013: _low_level_execute_command(): starting 44071 1727204591.32015: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204591.194269-44367-22270293570921/ /root/.ansible/tmp/ansible-tmp-1727204591.194269-44367-22270293570921/AnsiballZ_setup.py && sleep 0' 44071 1727204591.33343: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204591.33482: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204591.33725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204591.33855: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204591.35809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204591.35929: stderr chunk (state=3): >>><<< 44071 1727204591.36073: stdout chunk (state=3): >>><<< 44071 1727204591.36078: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204591.36087: _low_level_execute_command(): starting 44071 1727204591.36090: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204591.194269-44367-22270293570921/AnsiballZ_setup.py && sleep 0' 44071 1727204591.37799: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204591.37984: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204591.38025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204591.38094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204591.38222: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204591.38304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204592.20592: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_apparmor": {"status": "disabled"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.958984375, "5m": 0.66162109375, "15m": 0.4033203125}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3036, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 680, "free": 3036}, "nocache": {"free": 3483, "used": 233}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4<<< 44071 1727204592.20615: stdout chunk (state=3): >>>", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, 
"sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 938, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251310776320, "block_size": 4096, "block_total": 64479564, "block_available": 61355170, "block_used": 3124394, "inode_total": 16384000, "inode_available": 16301246, "inode_used": 82754, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "12", "epoch": "1727204592", "epoch_int": "1727204592", "date": "2024-09-24", "time": "15:03:12", "iso8601_micro": "2024-09-24T19:03:12.159427Z", "iso8601": "2024-09-24T19:03:12Z", "iso8601_basic": "20240924T150312159427", "iso8601_basic_short": "20240924T150312", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_distribution": "Fedora", "ansible_distribution_release": 
"", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": 
"on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 44071 1727204592.22510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204592.22599: stderr chunk (state=3): >>><<< 44071 1727204592.22611: stdout chunk (state=3): >>><<< 44071 1727204592.22658: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_apparmor": {"status": "disabled"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.958984375, "5m": 0.66162109375, "15m": 0.4033203125}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3036, "ansible_swaptotal_mb": 3715, 
"ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 680, "free": 3036}, "nocache": {"free": 3483, "used": 233}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 938, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251310776320, "block_size": 4096, "block_total": 64479564, "block_available": 61355170, "block_used": 3124394, "inode_total": 16384000, "inode_available": 16301246, "inode_used": 82754, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", 
"SSH_TTY": "/dev/pts/0"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "12", "epoch": "1727204592", "epoch_int": "1727204592", "date": "2024-09-24", "time": "15:03:12", "iso8601_micro": "2024-09-24T19:03:12.159427Z", "iso8601": "2024-09-24T19:03:12Z", "iso8601_basic": "20240924T150312159427", "iso8601_basic_short": "20240924T150312", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", 
"tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": 
["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204592.23100: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204591.194269-44367-22270293570921/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204592.23322: _low_level_execute_command(): starting 44071 1727204592.23326: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204591.194269-44367-22270293570921/ > /dev/null 2>&1 && sleep 0' 44071 1727204592.24778: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204592.24782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44071 1727204592.25197: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204592.27210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204592.27292: stderr chunk (state=3): >>><<< 44071 1727204592.27492: stdout chunk (state=3): >>><<< 44071 1727204592.27496: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204592.27499: handler run complete 44071 1727204592.27925: variable 'ansible_facts' from source: unknown 44071 1727204592.27992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204592.28840: variable 'ansible_facts' from source: unknown 44071 1727204592.29124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204592.29395: attempt loop complete, returning result 44071 1727204592.29450: _execute() done 44071 1727204592.29458: dumping result to json 44071 1727204592.29673: done dumping result, returning 44071 1727204592.29677: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [127b8e07-fff9-c964-7471-000000000077] 44071 1727204592.29679: sending task result for task 127b8e07-fff9-c964-7471-000000000077 ok: [managed-node2] 44071 1727204592.30892: no more pending results, returning what we have 44071 1727204592.30896: results queue empty 44071 1727204592.30897: checking for any_errors_fatal 44071 1727204592.30899: done checking for any_errors_fatal 44071 1727204592.30900: checking for max_fail_percentage 44071 1727204592.30901: done checking for max_fail_percentage 44071 1727204592.30902: checking to see if all hosts have failed and the running result is not ok 44071 1727204592.30903: done checking to see if all hosts have failed 44071 1727204592.30904: getting the remaining hosts for this loop 44071 1727204592.30905: done getting the remaining hosts for this loop 44071 1727204592.30910: getting the next task for host managed-node2 44071 1727204592.30916: done getting next task for host managed-node2 44071 1727204592.30918: ^ task is: TASK: meta (flush_handlers) 44071 1727204592.30920: ^ state is: HOST STATE: block=1, 
task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204592.30924: getting variables 44071 1727204592.30926: in VariableManager get_vars() 44071 1727204592.30956: Calling all_inventory to load vars for managed-node2 44071 1727204592.30961: Calling groups_inventory to load vars for managed-node2 44071 1727204592.31321: done sending task result for task 127b8e07-fff9-c964-7471-000000000077 44071 1727204592.31324: WORKER PROCESS EXITING 44071 1727204592.30965: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204592.31339: Calling all_plugins_play to load vars for managed-node2 44071 1727204592.31341: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204592.31344: Calling groups_plugins_play to load vars for managed-node2 44071 1727204592.31533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204592.32400: done with get_vars() 44071 1727204592.32420: done getting variables 44071 1727204592.32516: in VariableManager get_vars() 44071 1727204592.32528: Calling all_inventory to load vars for managed-node2 44071 1727204592.32531: Calling groups_inventory to load vars for managed-node2 44071 1727204592.32536: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204592.32542: Calling all_plugins_play to load vars for managed-node2 44071 1727204592.32544: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204592.32547: Calling groups_plugins_play to load vars for managed-node2 44071 1727204592.32940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204592.33612: done with get_vars() 44071 1727204592.33630: done queuing things up, now waiting for results queue to drain 44071 1727204592.33632: results queue empty 44071 1727204592.33636: checking for any_errors_fatal 44071 1727204592.33640: done checking for any_errors_fatal 44071 1727204592.33641: checking for max_fail_percentage 44071 1727204592.33642: done checking for max_fail_percentage 44071 1727204592.33643: checking to see if all hosts have failed and the running result is not ok 44071 1727204592.33644: done checking to see if all hosts have failed 44071 1727204592.33645: getting the remaining hosts for this loop 44071 1727204592.33651: done getting the remaining hosts for this loop 44071 1727204592.33655: getting the next task for host managed-node2 44071 1727204592.33659: done getting next task for host managed-node2 44071 1727204592.33661: ^ task is: TASK: Show playbook name 44071 1727204592.33663: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204592.33667: getting variables 44071 1727204592.33668: in VariableManager get_vars() 44071 1727204592.33678: Calling all_inventory to load vars for managed-node2 44071 1727204592.33681: Calling groups_inventory to load vars for managed-node2 44071 1727204592.33683: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204592.33690: Calling all_plugins_play to load vars for managed-node2 44071 1727204592.33692: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204592.33695: Calling groups_plugins_play to load vars for managed-node2 44071 1727204592.34077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204592.34521: done with get_vars() 44071 1727204592.34536: done getting variables 44071 1727204592.34840: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show playbook name] ****************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:11 Tuesday 24 September 2024 15:03:12 -0400 (0:00:01.234) 0:00:04.665 ***** 44071 1727204592.34874: entering _queue_task() for managed-node2/debug 44071 1727204592.34876: Creating lock for debug 44071 1727204592.35641: worker is 1 (out of 1 available) 44071 1727204592.35659: exiting _queue_task() for managed-node2/debug 44071 1727204592.35676: done queuing things up, now waiting for results queue to drain 44071 1727204592.35678: waiting for pending results... 
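
Note on the fact gathering that completed above: the "invocation" block in the logged module result records the arguments the implicit setup run used (gather_subset ["all"], gather_timeout 10, empty filter, fact_path /etc/ansible/facts.d). For reference only, a minimal explicit task that would request the same facts could look like the sketch below; it is not part of tests_states.yml, just a hypothetical equivalent built from those logged module_args.

    # Hypothetical explicit form of the implicit "Gathering Facts" step seen above;
    # argument values are copied from the "invocation" block in the trace.
    - name: Gathering Facts
      ansible.builtin.setup:
        gather_subset:
          - all
        gather_timeout: 10
        filter: []
        fact_path: /etc/ansible/facts.d

Running such a task would return the same kind of `ansible_facts` payload (kernel, memory, devices, interfaces, and so on) that appears in the stdout chunks earlier in this trace.
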
44071 1727204592.36143: running TaskExecutor() for managed-node2/TASK: Show playbook name 44071 1727204592.36219: in run() - task 127b8e07-fff9-c964-7471-00000000000b 44071 1727204592.36237: variable 'ansible_search_path' from source: unknown 44071 1727204592.36481: calling self._execute() 44071 1727204592.36562: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.36572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.36584: variable 'omit' from source: magic vars 44071 1727204592.37199: variable 'ansible_distribution_major_version' from source: facts 44071 1727204592.37271: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204592.37276: variable 'omit' from source: magic vars 44071 1727204592.37278: variable 'omit' from source: magic vars 44071 1727204592.37318: variable 'omit' from source: magic vars 44071 1727204592.37372: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204592.37422: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204592.37448: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204592.37473: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204592.37670: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204592.37673: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204592.37676: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.37678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.37681: Set connection var ansible_connection to ssh 44071 1727204592.37683: Set connection var ansible_timeout to 10 44071 1727204592.37685: Set connection var ansible_pipelining to False 44071 1727204592.37687: Set connection var ansible_shell_type to sh 44071 1727204592.37690: Set connection var ansible_shell_executable to /bin/sh 44071 1727204592.37703: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204592.37732: variable 'ansible_shell_executable' from source: unknown 44071 1727204592.37741: variable 'ansible_connection' from source: unknown 44071 1727204592.37747: variable 'ansible_module_compression' from source: unknown 44071 1727204592.37754: variable 'ansible_shell_type' from source: unknown 44071 1727204592.37760: variable 'ansible_shell_executable' from source: unknown 44071 1727204592.37769: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.37778: variable 'ansible_pipelining' from source: unknown 44071 1727204592.37784: variable 'ansible_timeout' from source: unknown 44071 1727204592.37791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.37969: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204592.37986: variable 'omit' from source: magic vars 44071 1727204592.37996: starting attempt loop 44071 1727204592.38002: running the handler 44071 
1727204592.38060: handler run complete 44071 1727204592.38100: attempt loop complete, returning result 44071 1727204592.38108: _execute() done 44071 1727204592.38116: dumping result to json 44071 1727204592.38124: done dumping result, returning 44071 1727204592.38142: done running TaskExecutor() for managed-node2/TASK: Show playbook name [127b8e07-fff9-c964-7471-00000000000b] 44071 1727204592.38151: sending task result for task 127b8e07-fff9-c964-7471-00000000000b 44071 1727204592.38320: done sending task result for task 127b8e07-fff9-c964-7471-00000000000b 44071 1727204592.38324: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: this is: playbooks/tests_states.yml 44071 1727204592.38391: no more pending results, returning what we have 44071 1727204592.38395: results queue empty 44071 1727204592.38395: checking for any_errors_fatal 44071 1727204592.38397: done checking for any_errors_fatal 44071 1727204592.38397: checking for max_fail_percentage 44071 1727204592.38399: done checking for max_fail_percentage 44071 1727204592.38399: checking to see if all hosts have failed and the running result is not ok 44071 1727204592.38400: done checking to see if all hosts have failed 44071 1727204592.38401: getting the remaining hosts for this loop 44071 1727204592.38403: done getting the remaining hosts for this loop 44071 1727204592.38408: getting the next task for host managed-node2 44071 1727204592.38414: done getting next task for host managed-node2 44071 1727204592.38418: ^ task is: TASK: Include the task 'run_test.yml' 44071 1727204592.38421: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204592.38423: getting variables 44071 1727204592.38425: in VariableManager get_vars() 44071 1727204592.38458: Calling all_inventory to load vars for managed-node2 44071 1727204592.38461: Calling groups_inventory to load vars for managed-node2 44071 1727204592.38465: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204592.38478: Calling all_plugins_play to load vars for managed-node2 44071 1727204592.38481: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204592.38484: Calling groups_plugins_play to load vars for managed-node2 44071 1727204592.39025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204592.39495: done with get_vars() 44071 1727204592.39509: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:22 Tuesday 24 September 2024 15:03:12 -0400 (0:00:00.049) 0:00:04.714 ***** 44071 1727204592.39825: entering _queue_task() for managed-node2/include_tasks 44071 1727204592.41197: worker is 1 (out of 1 available) 44071 1727204592.41212: exiting _queue_task() for managed-node2/include_tasks 44071 1727204592.41227: done queuing things up, now waiting for results queue to drain 44071 1727204592.41229: waiting for pending results... 
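
The two task headers resolved above point at tests_states.yml lines 11 and 22 (see the logged task paths). The playbook source itself is not shown in this trace, but a sketch consistent with the logged action plugins (debug, include_tasks), the printed message, and the included file path would be roughly the following; the real file likely derives the message from a variable rather than hard-coding it.

    # Sketch reconstructed from the trace; assumed layout, not the verified
    # contents of tests_states.yml.
    - name: Show playbook name
      ansible.builtin.debug:
        msg: "this is: playbooks/tests_states.yml"

    - name: Include the task 'run_test.yml'
      ansible.builtin.include_tasks: tasks/run_test.yml

Tasks of this shape would produce the `ok: [managed-node2]` debug result above and the include processing of playbooks/tasks/run_test.yml that follows.
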
44071 1727204592.42388: running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' 44071 1727204592.42394: in run() - task 127b8e07-fff9-c964-7471-00000000000d 44071 1727204592.42398: variable 'ansible_search_path' from source: unknown 44071 1727204592.42875: calling self._execute() 44071 1727204592.42883: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.42886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.42889: variable 'omit' from source: magic vars 44071 1727204592.43728: variable 'ansible_distribution_major_version' from source: facts 44071 1727204592.44174: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204592.44178: _execute() done 44071 1727204592.44180: dumping result to json 44071 1727204592.44182: done dumping result, returning 44071 1727204592.44184: done running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' [127b8e07-fff9-c964-7471-00000000000d] 44071 1727204592.44186: sending task result for task 127b8e07-fff9-c964-7471-00000000000d 44071 1727204592.44315: no more pending results, returning what we have 44071 1727204592.44321: in VariableManager get_vars() 44071 1727204592.44359: Calling all_inventory to load vars for managed-node2 44071 1727204592.44362: Calling groups_inventory to load vars for managed-node2 44071 1727204592.44370: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204592.44383: Calling all_plugins_play to load vars for managed-node2 44071 1727204592.44386: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204592.44389: Calling groups_plugins_play to load vars for managed-node2 44071 1727204592.44792: done sending task result for task 127b8e07-fff9-c964-7471-00000000000d 44071 1727204592.44797: WORKER PROCESS EXITING 44071 1727204592.45085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204592.45733: done with get_vars() 44071 1727204592.45746: variable 'ansible_search_path' from source: unknown 44071 1727204592.45762: we have included files to process 44071 1727204592.45763: generating all_blocks data 44071 1727204592.45764: done generating all_blocks data 44071 1727204592.45767: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 44071 1727204592.45768: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 44071 1727204592.45771: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 44071 1727204592.48394: in VariableManager get_vars() 44071 1727204592.48417: done with get_vars() 44071 1727204592.48473: in VariableManager get_vars() 44071 1727204592.48491: done with get_vars() 44071 1727204592.48532: in VariableManager get_vars() 44071 1727204592.48550: done with get_vars() 44071 1727204592.48701: in VariableManager get_vars() 44071 1727204592.48718: done with get_vars() 44071 1727204592.48783: in VariableManager get_vars() 44071 1727204592.48800: done with get_vars() 44071 1727204592.49716: in VariableManager get_vars() 44071 1727204592.49740: done with get_vars() 44071 1727204592.49755: done processing included file 44071 1727204592.49757: iterating over new_blocks loaded from include file 44071 1727204592.49759: in VariableManager get_vars() 44071 
1727204592.49857: done with get_vars() 44071 1727204592.49860: filtering new block on tags 44071 1727204592.49987: done filtering new block on tags 44071 1727204592.49991: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed-node2 44071 1727204592.49997: extending task lists for all hosts with included blocks 44071 1727204592.50044: done extending task lists 44071 1727204592.50045: done processing included files 44071 1727204592.50046: results queue empty 44071 1727204592.50047: checking for any_errors_fatal 44071 1727204592.50053: done checking for any_errors_fatal 44071 1727204592.50054: checking for max_fail_percentage 44071 1727204592.50055: done checking for max_fail_percentage 44071 1727204592.50056: checking to see if all hosts have failed and the running result is not ok 44071 1727204592.50057: done checking to see if all hosts have failed 44071 1727204592.50058: getting the remaining hosts for this loop 44071 1727204592.50059: done getting the remaining hosts for this loop 44071 1727204592.50062: getting the next task for host managed-node2 44071 1727204592.50069: done getting next task for host managed-node2 44071 1727204592.50071: ^ task is: TASK: TEST: {{ lsr_description }} 44071 1727204592.50076: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204592.50079: getting variables 44071 1727204592.50080: in VariableManager get_vars() 44071 1727204592.50090: Calling all_inventory to load vars for managed-node2 44071 1727204592.50092: Calling groups_inventory to load vars for managed-node2 44071 1727204592.50094: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204592.50100: Calling all_plugins_play to load vars for managed-node2 44071 1727204592.50102: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204592.50105: Calling groups_plugins_play to load vars for managed-node2 44071 1727204592.50628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204592.51183: done with get_vars() 44071 1727204592.51198: done getting variables 44071 1727204592.51258: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204592.51630: variable 'lsr_description' from source: include params TASK [TEST: I can create a profile] ******************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Tuesday 24 September 2024 15:03:12 -0400 (0:00:00.118) 0:00:04.833 ***** 44071 1727204592.51685: entering _queue_task() for managed-node2/debug 44071 1727204592.52262: worker is 1 (out of 1 available) 44071 1727204592.52495: exiting _queue_task() for managed-node2/debug 44071 1727204592.52507: done queuing things up, now waiting for results queue to drain 44071 1727204592.52509: waiting for pending results... 
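The task just queued is the test banner at run_test.yml:5. Judging from its name and from the message it prints in the result that follows, it is most likely a plain debug task of roughly this shape (a hedged reconstruction, not the verbatim file):

# run_test.yml:5 -- hedged reconstruction of the banner task
- name: "TEST: {{ lsr_description }}"
  debug:
    msg: "########## {{ lsr_description }} ##########"

With lsr_description set to "I can create a profile" by the include parameters, this renders exactly the banner seen in the ok: result below.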
44071 1727204592.52771: running TaskExecutor() for managed-node2/TASK: TEST: I can create a profile 44071 1727204592.53066: in run() - task 127b8e07-fff9-c964-7471-000000000091 44071 1727204592.53142: variable 'ansible_search_path' from source: unknown 44071 1727204592.53151: variable 'ansible_search_path' from source: unknown 44071 1727204592.53202: calling self._execute() 44071 1727204592.53414: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.53572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.53666: variable 'omit' from source: magic vars 44071 1727204592.54032: variable 'ansible_distribution_major_version' from source: facts 44071 1727204592.54053: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204592.54064: variable 'omit' from source: magic vars 44071 1727204592.54123: variable 'omit' from source: magic vars 44071 1727204592.54259: variable 'lsr_description' from source: include params 44071 1727204592.54287: variable 'omit' from source: magic vars 44071 1727204592.54350: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204592.54571: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204592.54792: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204592.54797: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204592.54800: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204592.54803: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204592.54806: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.54809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.54979: Set connection var ansible_connection to ssh 44071 1727204592.55093: Set connection var ansible_timeout to 10 44071 1727204592.55104: Set connection var ansible_pipelining to False 44071 1727204592.55118: Set connection var ansible_shell_type to sh 44071 1727204592.55128: Set connection var ansible_shell_executable to /bin/sh 44071 1727204592.55142: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204592.55254: variable 'ansible_shell_executable' from source: unknown 44071 1727204592.55262: variable 'ansible_connection' from source: unknown 44071 1727204592.55271: variable 'ansible_module_compression' from source: unknown 44071 1727204592.55277: variable 'ansible_shell_type' from source: unknown 44071 1727204592.55283: variable 'ansible_shell_executable' from source: unknown 44071 1727204592.55289: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.55296: variable 'ansible_pipelining' from source: unknown 44071 1727204592.55302: variable 'ansible_timeout' from source: unknown 44071 1727204592.55309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.55770: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 44071 1727204592.55776: variable 'omit' from source: magic vars 44071 1727204592.55778: starting attempt loop 44071 1727204592.55781: running the handler 44071 1727204592.55783: handler run complete 44071 1727204592.55972: attempt loop complete, returning result 44071 1727204592.55976: _execute() done 44071 1727204592.55980: dumping result to json 44071 1727204592.55983: done dumping result, returning 44071 1727204592.55985: done running TaskExecutor() for managed-node2/TASK: TEST: I can create a profile [127b8e07-fff9-c964-7471-000000000091] 44071 1727204592.55987: sending task result for task 127b8e07-fff9-c964-7471-000000000091 44071 1727204592.56068: done sending task result for task 127b8e07-fff9-c964-7471-000000000091 44071 1727204592.56073: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: ########## I can create a profile ########## 44071 1727204592.56130: no more pending results, returning what we have 44071 1727204592.56137: results queue empty 44071 1727204592.56138: checking for any_errors_fatal 44071 1727204592.56140: done checking for any_errors_fatal 44071 1727204592.56141: checking for max_fail_percentage 44071 1727204592.56142: done checking for max_fail_percentage 44071 1727204592.56143: checking to see if all hosts have failed and the running result is not ok 44071 1727204592.56144: done checking to see if all hosts have failed 44071 1727204592.56144: getting the remaining hosts for this loop 44071 1727204592.56146: done getting the remaining hosts for this loop 44071 1727204592.56151: getting the next task for host managed-node2 44071 1727204592.56159: done getting next task for host managed-node2 44071 1727204592.56161: ^ task is: TASK: Show item 44071 1727204592.56167: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204592.56171: getting variables 44071 1727204592.56173: in VariableManager get_vars() 44071 1727204592.56205: Calling all_inventory to load vars for managed-node2 44071 1727204592.56208: Calling groups_inventory to load vars for managed-node2 44071 1727204592.56212: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204592.56224: Calling all_plugins_play to load vars for managed-node2 44071 1727204592.56226: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204592.56229: Calling groups_plugins_play to load vars for managed-node2 44071 1727204592.56635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204592.57191: done with get_vars() 44071 1727204592.57206: done getting variables 44071 1727204592.57415: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Tuesday 24 September 2024 15:03:12 -0400 (0:00:00.057) 0:00:04.890 ***** 44071 1727204592.57449: entering _queue_task() for managed-node2/debug 44071 1727204592.58103: worker is 1 (out of 1 available) 44071 1727204592.58117: exiting _queue_task() for managed-node2/debug 44071 1727204592.58129: done queuing things up, now waiting for results queue to drain 44071 1727204592.58131: waiting for pending results... 44071 1727204592.58917: running TaskExecutor() for managed-node2/TASK: Show item 44071 1727204592.58923: in run() - task 127b8e07-fff9-c964-7471-000000000092 44071 1727204592.58926: variable 'ansible_search_path' from source: unknown 44071 1727204592.58929: variable 'ansible_search_path' from source: unknown 44071 1727204592.58974: variable 'omit' from source: magic vars 44071 1727204592.59337: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.59356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.59375: variable 'omit' from source: magic vars 44071 1727204592.60311: variable 'ansible_distribution_major_version' from source: facts 44071 1727204592.60434: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204592.60438: variable 'omit' from source: magic vars 44071 1727204592.60651: variable 'omit' from source: magic vars 44071 1727204592.60655: variable 'item' from source: unknown 44071 1727204592.60805: variable 'item' from source: unknown 44071 1727204592.60831: variable 'omit' from source: magic vars 44071 1727204592.60913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204592.61013: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204592.61110: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204592.61136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204592.61189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 44071 1727204592.61226: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204592.61278: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.61286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.61516: Set connection var ansible_connection to ssh 44071 1727204592.61531: Set connection var ansible_timeout to 10 44071 1727204592.61579: Set connection var ansible_pipelining to False 44071 1727204592.61589: Set connection var ansible_shell_type to sh 44071 1727204592.61599: Set connection var ansible_shell_executable to /bin/sh 44071 1727204592.61611: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204592.61841: variable 'ansible_shell_executable' from source: unknown 44071 1727204592.61844: variable 'ansible_connection' from source: unknown 44071 1727204592.61847: variable 'ansible_module_compression' from source: unknown 44071 1727204592.61849: variable 'ansible_shell_type' from source: unknown 44071 1727204592.61851: variable 'ansible_shell_executable' from source: unknown 44071 1727204592.61853: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.61855: variable 'ansible_pipelining' from source: unknown 44071 1727204592.61858: variable 'ansible_timeout' from source: unknown 44071 1727204592.61861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.62054: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204592.62271: variable 'omit' from source: magic vars 44071 1727204592.62274: starting attempt loop 44071 1727204592.62281: running the handler 44071 1727204592.62284: variable 'lsr_description' from source: include params 44071 1727204592.62436: variable 'lsr_description' from source: include params 44071 1727204592.62454: handler run complete 44071 1727204592.62523: attempt loop complete, returning result 44071 1727204592.62718: variable 'item' from source: unknown 44071 1727204592.62744: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can create a profile" } 44071 1727204592.63252: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.63257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.63260: variable 'omit' from source: magic vars 44071 1727204592.63693: variable 'ansible_distribution_major_version' from source: facts 44071 1727204592.63697: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204592.63699: variable 'omit' from source: magic vars 44071 1727204592.63702: variable 'omit' from source: magic vars 44071 1727204592.63909: variable 'item' from source: unknown 44071 1727204592.63913: variable 'item' from source: unknown 44071 1727204592.63915: variable 'omit' from source: magic vars 44071 1727204592.63918: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204592.64025: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 44071 1727204592.64039: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204592.64146: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204592.64234: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.64238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.64360: Set connection var ansible_connection to ssh 44071 1727204592.64374: Set connection var ansible_timeout to 10 44071 1727204592.64384: Set connection var ansible_pipelining to False 44071 1727204592.64393: Set connection var ansible_shell_type to sh 44071 1727204592.64403: Set connection var ansible_shell_executable to /bin/sh 44071 1727204592.64415: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204592.64443: variable 'ansible_shell_executable' from source: unknown 44071 1727204592.64670: variable 'ansible_connection' from source: unknown 44071 1727204592.64676: variable 'ansible_module_compression' from source: unknown 44071 1727204592.64679: variable 'ansible_shell_type' from source: unknown 44071 1727204592.64681: variable 'ansible_shell_executable' from source: unknown 44071 1727204592.64683: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.64685: variable 'ansible_pipelining' from source: unknown 44071 1727204592.64687: variable 'ansible_timeout' from source: unknown 44071 1727204592.64689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.64973: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204592.64977: variable 'omit' from source: magic vars 44071 1727204592.64979: starting attempt loop 44071 1727204592.64982: running the handler 44071 1727204592.64984: variable 'lsr_setup' from source: include params 44071 1727204592.65048: variable 'lsr_setup' from source: include params 44071 1727204592.65216: handler run complete 44071 1727204592.65238: attempt loop complete, returning result 44071 1727204592.65260: variable 'item' from source: unknown 44071 1727204592.65344: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ "tasks/delete_interface.yml", "tasks/assert_device_absent.yml" ] } 44071 1727204592.65844: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.65848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.65851: variable 'omit' from source: magic vars 44071 1727204592.66147: variable 'ansible_distribution_major_version' from source: facts 44071 1727204592.66159: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204592.66177: variable 'omit' from source: magic vars 44071 1727204592.66392: variable 'omit' from source: magic vars 44071 1727204592.66395: variable 'item' from source: unknown 44071 1727204592.66521: variable 'item' from source: unknown 44071 1727204592.66541: variable 'omit' from source: magic vars 44071 1727204592.66569: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204592.66584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204592.66616: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204592.66636: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204592.66825: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.66828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.66831: Set connection var ansible_connection to ssh 44071 1727204592.66833: Set connection var ansible_timeout to 10 44071 1727204592.66835: Set connection var ansible_pipelining to False 44071 1727204592.66940: Set connection var ansible_shell_type to sh 44071 1727204592.66954: Set connection var ansible_shell_executable to /bin/sh 44071 1727204592.66969: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204592.66999: variable 'ansible_shell_executable' from source: unknown 44071 1727204592.67049: variable 'ansible_connection' from source: unknown 44071 1727204592.67057: variable 'ansible_module_compression' from source: unknown 44071 1727204592.67064: variable 'ansible_shell_type' from source: unknown 44071 1727204592.67073: variable 'ansible_shell_executable' from source: unknown 44071 1727204592.67079: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.67087: variable 'ansible_pipelining' from source: unknown 44071 1727204592.67093: variable 'ansible_timeout' from source: unknown 44071 1727204592.67100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.67251: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204592.67572: variable 'omit' from source: magic vars 44071 1727204592.67575: starting attempt loop 44071 1727204592.67578: running the handler 44071 1727204592.67580: variable 'lsr_test' from source: include params 44071 1727204592.67603: variable 'lsr_test' from source: include params 44071 1727204592.67631: handler run complete 44071 1727204592.67716: attempt loop complete, returning result 44071 1727204592.67742: variable 'item' from source: unknown 44071 1727204592.67938: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/create_bridge_profile.yml" ] } 44071 1727204592.68353: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.68357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.68359: variable 'omit' from source: magic vars 44071 1727204592.68622: variable 'ansible_distribution_major_version' from source: facts 44071 1727204592.68701: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204592.68711: variable 'omit' from source: magic vars 44071 1727204592.68733: variable 'omit' from source: magic vars 44071 1727204592.68786: variable 'item' from source: unknown 44071 1727204592.69053: 
variable 'item' from source: unknown 44071 1727204592.69080: variable 'omit' from source: magic vars 44071 1727204592.69106: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204592.69119: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204592.69132: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204592.69153: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204592.69370: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.69374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.69378: Set connection var ansible_connection to ssh 44071 1727204592.69380: Set connection var ansible_timeout to 10 44071 1727204592.69390: Set connection var ansible_pipelining to False 44071 1727204592.69604: Set connection var ansible_shell_type to sh 44071 1727204592.69608: Set connection var ansible_shell_executable to /bin/sh 44071 1727204592.69610: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204592.69612: variable 'ansible_shell_executable' from source: unknown 44071 1727204592.69614: variable 'ansible_connection' from source: unknown 44071 1727204592.69617: variable 'ansible_module_compression' from source: unknown 44071 1727204592.69619: variable 'ansible_shell_type' from source: unknown 44071 1727204592.69621: variable 'ansible_shell_executable' from source: unknown 44071 1727204592.69624: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.69626: variable 'ansible_pipelining' from source: unknown 44071 1727204592.69628: variable 'ansible_timeout' from source: unknown 44071 1727204592.69630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.69694: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204592.69711: variable 'omit' from source: magic vars 44071 1727204592.69720: starting attempt loop 44071 1727204592.69726: running the handler 44071 1727204592.69750: variable 'lsr_assert' from source: include params 44071 1727204592.69883: variable 'lsr_assert' from source: include params 44071 1727204592.69907: handler run complete 44071 1727204592.69930: attempt loop complete, returning result 44071 1727204592.69951: variable 'item' from source: unknown 44071 1727204592.70028: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_profile_present.yml" ] } 44071 1727204592.70404: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.70407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.70409: variable 'omit' from source: magic vars 44071 1727204592.70508: variable 'ansible_distribution_major_version' from source: facts 44071 1727204592.70522: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204592.70583: variable 'omit' from 
source: magic vars 44071 1727204592.70586: variable 'omit' from source: magic vars 44071 1727204592.70604: variable 'item' from source: unknown 44071 1727204592.70679: variable 'item' from source: unknown 44071 1727204592.70704: variable 'omit' from source: magic vars 44071 1727204592.70732: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204592.70745: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204592.70757: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204592.70776: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204592.70798: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.70801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.70880: Set connection var ansible_connection to ssh 44071 1727204592.70907: Set connection var ansible_timeout to 10 44071 1727204592.70910: Set connection var ansible_pipelining to False 44071 1727204592.70912: Set connection var ansible_shell_type to sh 44071 1727204592.70919: Set connection var ansible_shell_executable to /bin/sh 44071 1727204592.70931: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204592.71016: variable 'ansible_shell_executable' from source: unknown 44071 1727204592.71020: variable 'ansible_connection' from source: unknown 44071 1727204592.71022: variable 'ansible_module_compression' from source: unknown 44071 1727204592.71024: variable 'ansible_shell_type' from source: unknown 44071 1727204592.71026: variable 'ansible_shell_executable' from source: unknown 44071 1727204592.71028: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.71030: variable 'ansible_pipelining' from source: unknown 44071 1727204592.71032: variable 'ansible_timeout' from source: unknown 44071 1727204592.71034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.71112: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204592.71130: variable 'omit' from source: magic vars 44071 1727204592.71138: starting attempt loop 44071 1727204592.71145: running the handler 44071 1727204592.71175: variable 'lsr_assert_when' from source: include params 44071 1727204592.71257: variable 'lsr_assert_when' from source: include params 44071 1727204592.71362: variable 'network_provider' from source: set_fact 44071 1727204592.71451: handler run complete 44071 1727204592.71454: attempt loop complete, returning result 44071 1727204592.71457: variable 'item' from source: unknown 44071 1727204592.71524: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": [ { "condition": true, "what": "tasks/assert_device_present.yml" } ] } 44071 1727204592.71718: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.71874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 
44071 1727204592.71877: variable 'omit' from source: magic vars 44071 1727204592.71968: variable 'ansible_distribution_major_version' from source: facts 44071 1727204592.71983: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204592.71998: variable 'omit' from source: magic vars 44071 1727204592.72070: variable 'omit' from source: magic vars 44071 1727204592.72073: variable 'item' from source: unknown 44071 1727204592.72130: variable 'item' from source: unknown 44071 1727204592.72151: variable 'omit' from source: magic vars 44071 1727204592.72177: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204592.72191: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204592.72207: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204592.72230: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204592.72372: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.72375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.72424: Set connection var ansible_connection to ssh 44071 1727204592.72493: Set connection var ansible_timeout to 10 44071 1727204592.72504: Set connection var ansible_pipelining to False 44071 1727204592.72513: Set connection var ansible_shell_type to sh 44071 1727204592.72522: Set connection var ansible_shell_executable to /bin/sh 44071 1727204592.72534: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204592.72598: variable 'ansible_shell_executable' from source: unknown 44071 1727204592.72701: variable 'ansible_connection' from source: unknown 44071 1727204592.72704: variable 'ansible_module_compression' from source: unknown 44071 1727204592.72706: variable 'ansible_shell_type' from source: unknown 44071 1727204592.72709: variable 'ansible_shell_executable' from source: unknown 44071 1727204592.72711: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.72713: variable 'ansible_pipelining' from source: unknown 44071 1727204592.72716: variable 'ansible_timeout' from source: unknown 44071 1727204592.72718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.72906: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204592.73027: variable 'omit' from source: magic vars 44071 1727204592.73031: starting attempt loop 44071 1727204592.73033: running the handler 44071 1727204592.73036: variable 'lsr_fail_debug' from source: play vars 44071 1727204592.73184: variable 'lsr_fail_debug' from source: play vars 44071 1727204592.73302: handler run complete 44071 1727204592.73305: attempt loop complete, returning result 44071 1727204592.73411: variable 'item' from source: unknown 44071 1727204592.73538: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 44071 
1727204592.73823: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.73987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.73990: variable 'omit' from source: magic vars 44071 1727204592.74197: variable 'ansible_distribution_major_version' from source: facts 44071 1727204592.74209: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204592.74218: variable 'omit' from source: magic vars 44071 1727204592.74240: variable 'omit' from source: magic vars 44071 1727204592.74309: variable 'item' from source: unknown 44071 1727204592.74390: variable 'item' from source: unknown 44071 1727204592.74415: variable 'omit' from source: magic vars 44071 1727204592.74501: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204592.74513: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204592.74516: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204592.74518: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204592.74521: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.74523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.74596: Set connection var ansible_connection to ssh 44071 1727204592.74616: Set connection var ansible_timeout to 10 44071 1727204592.74632: Set connection var ansible_pipelining to False 44071 1727204592.74643: Set connection var ansible_shell_type to sh 44071 1727204592.74654: Set connection var ansible_shell_executable to /bin/sh 44071 1727204592.74669: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204592.74718: variable 'ansible_shell_executable' from source: unknown 44071 1727204592.74721: variable 'ansible_connection' from source: unknown 44071 1727204592.74724: variable 'ansible_module_compression' from source: unknown 44071 1727204592.74728: variable 'ansible_shell_type' from source: unknown 44071 1727204592.74733: variable 'ansible_shell_executable' from source: unknown 44071 1727204592.74736: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.74738: variable 'ansible_pipelining' from source: unknown 44071 1727204592.74771: variable 'ansible_timeout' from source: unknown 44071 1727204592.74774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.74875: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204592.74889: variable 'omit' from source: magic vars 44071 1727204592.74898: starting attempt loop 44071 1727204592.74937: running the handler 44071 1727204592.74940: variable 'lsr_cleanup' from source: include params 44071 1727204592.75021: variable 'lsr_cleanup' from source: include params 44071 1727204592.75049: handler run complete 44071 1727204592.75076: attempt loop complete, returning result 44071 1727204592.75098: variable 'item' from source: unknown 44071 1727204592.75263: variable 'item' from source: 
unknown ok: [managed-node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 44071 1727204592.75345: dumping result to json 44071 1727204592.75349: done dumping result, returning 44071 1727204592.75351: done running TaskExecutor() for managed-node2/TASK: Show item [127b8e07-fff9-c964-7471-000000000092] 44071 1727204592.75354: sending task result for task 127b8e07-fff9-c964-7471-000000000092 44071 1727204592.75625: done sending task result for task 127b8e07-fff9-c964-7471-000000000092 44071 1727204592.75628: WORKER PROCESS EXITING 44071 1727204592.75695: no more pending results, returning what we have 44071 1727204592.75699: results queue empty 44071 1727204592.75700: checking for any_errors_fatal 44071 1727204592.75705: done checking for any_errors_fatal 44071 1727204592.75706: checking for max_fail_percentage 44071 1727204592.75707: done checking for max_fail_percentage 44071 1727204592.75708: checking to see if all hosts have failed and the running result is not ok 44071 1727204592.75709: done checking to see if all hosts have failed 44071 1727204592.75710: getting the remaining hosts for this loop 44071 1727204592.75711: done getting the remaining hosts for this loop 44071 1727204592.75716: getting the next task for host managed-node2 44071 1727204592.75723: done getting next task for host managed-node2 44071 1727204592.75726: ^ task is: TASK: Include the task 'show_interfaces.yml' 44071 1727204592.75729: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204592.75732: getting variables 44071 1727204592.75734: in VariableManager get_vars() 44071 1727204592.75884: Calling all_inventory to load vars for managed-node2 44071 1727204592.75888: Calling groups_inventory to load vars for managed-node2 44071 1727204592.75891: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204592.75902: Calling all_plugins_play to load vars for managed-node2 44071 1727204592.75905: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204592.75908: Calling groups_plugins_play to load vars for managed-node2 44071 1727204592.76107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204592.76574: done with get_vars() 44071 1727204592.76587: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Tuesday 24 September 2024 15:03:12 -0400 (0:00:00.193) 0:00:05.084 ***** 44071 1727204592.76806: entering _queue_task() for managed-node2/include_tasks 44071 1727204592.77518: worker is 1 (out of 1 available) 44071 1727204592.77533: exiting _queue_task() for managed-node2/include_tasks 44071 1727204592.77547: done queuing things up, now waiting for results queue to drain 44071 1727204592.77550: waiting for pending results... 
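Taken together, the per-item results above show both the shape of the 'Show item' task at run_test.yml:9 and the scenario parameters this test run was invoked with. A hedged reconstruction follows; the loop order and the variable values are copied from the log, while the exact layout of the real files is an assumption.

# run_test.yml:9 -- hedged reconstruction of the loop that produced the
# (item=...) results above
- name: Show item
  debug:
    var: "{{ item }}"
  loop:
    - lsr_description
    - lsr_setup
    - lsr_test
    - lsr_assert
    - lsr_assert_when
    - lsr_fail_debug     # resolved from play vars in the trace
    - lsr_cleanup        # the others resolve from include params

# Scenario parameters as reported by that loop (values verbatim from the log;
# presenting them as one vars mapping is an illustration, not the actual
# structure of tests_states.yml):
lsr_description: I can create a profile
lsr_setup:
  - tasks/delete_interface.yml
  - tasks/assert_device_absent.yml
lsr_test:
  - tasks/create_bridge_profile.yml
lsr_assert:
  - tasks/assert_profile_present.yml
lsr_assert_when:
  - condition: true
    what: tasks/assert_device_present.yml
lsr_fail_debug:
  - __network_connections_result
lsr_cleanup:
  - tasks/cleanup_profile+device.yml

The next task queued, the include of 'show_interfaces.yml' at run_test.yml:21, is traced below.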
44071 1727204592.78177: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 44071 1727204592.78330: in run() - task 127b8e07-fff9-c964-7471-000000000093 44071 1727204592.78346: variable 'ansible_search_path' from source: unknown 44071 1727204592.78351: variable 'ansible_search_path' from source: unknown 44071 1727204592.78398: calling self._execute() 44071 1727204592.78599: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.78608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.78642: variable 'omit' from source: magic vars 44071 1727204592.79614: variable 'ansible_distribution_major_version' from source: facts 44071 1727204592.79628: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204592.79632: _execute() done 44071 1727204592.79635: dumping result to json 44071 1727204592.79735: done dumping result, returning 44071 1727204592.79739: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [127b8e07-fff9-c964-7471-000000000093] 44071 1727204592.79742: sending task result for task 127b8e07-fff9-c964-7471-000000000093 44071 1727204592.79979: no more pending results, returning what we have 44071 1727204592.79984: in VariableManager get_vars() 44071 1727204592.80021: Calling all_inventory to load vars for managed-node2 44071 1727204592.80024: Calling groups_inventory to load vars for managed-node2 44071 1727204592.80028: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204592.80043: Calling all_plugins_play to load vars for managed-node2 44071 1727204592.80046: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204592.80049: Calling groups_plugins_play to load vars for managed-node2 44071 1727204592.80586: done sending task result for task 127b8e07-fff9-c964-7471-000000000093 44071 1727204592.80590: WORKER PROCESS EXITING 44071 1727204592.80622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204592.81162: done with get_vars() 44071 1727204592.81175: variable 'ansible_search_path' from source: unknown 44071 1727204592.81177: variable 'ansible_search_path' from source: unknown 44071 1727204592.81224: we have included files to process 44071 1727204592.81225: generating all_blocks data 44071 1727204592.81227: done generating all_blocks data 44071 1727204592.81231: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44071 1727204592.81233: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44071 1727204592.81235: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44071 1727204592.81633: in VariableManager get_vars() 44071 1727204592.81655: done with get_vars() 44071 1727204592.82029: done processing included file 44071 1727204592.82032: iterating over new_blocks loaded from include file 44071 1727204592.82033: in VariableManager get_vars() 44071 1727204592.82051: done with get_vars() 44071 1727204592.82053: filtering new block on tags 44071 1727204592.82097: done filtering new block on tags 44071 1727204592.82100: done iterating over new_blocks loaded from include file included: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 44071 1727204592.82106: extending task lists for all hosts with included blocks 44071 1727204592.83206: done extending task lists 44071 1727204592.83208: done processing included files 44071 1727204592.83209: results queue empty 44071 1727204592.83210: checking for any_errors_fatal 44071 1727204592.83334: done checking for any_errors_fatal 44071 1727204592.83336: checking for max_fail_percentage 44071 1727204592.83337: done checking for max_fail_percentage 44071 1727204592.83338: checking to see if all hosts have failed and the running result is not ok 44071 1727204592.83339: done checking to see if all hosts have failed 44071 1727204592.83340: getting the remaining hosts for this loop 44071 1727204592.83342: done getting the remaining hosts for this loop 44071 1727204592.83345: getting the next task for host managed-node2 44071 1727204592.83351: done getting next task for host managed-node2 44071 1727204592.83353: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 44071 1727204592.83357: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204592.83360: getting variables 44071 1727204592.83361: in VariableManager get_vars() 44071 1727204592.83448: Calling all_inventory to load vars for managed-node2 44071 1727204592.83452: Calling groups_inventory to load vars for managed-node2 44071 1727204592.83455: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204592.83463: Calling all_plugins_play to load vars for managed-node2 44071 1727204592.83467: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204592.83471: Calling groups_plugins_play to load vars for managed-node2 44071 1727204592.83899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204592.84327: done with get_vars() 44071 1727204592.84342: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:03:12 -0400 (0:00:00.076) 0:00:05.161 ***** 44071 1727204592.84492: entering _queue_task() for managed-node2/include_tasks 44071 1727204592.85273: worker is 1 (out of 1 available) 44071 1727204592.85290: exiting _queue_task() for managed-node2/include_tasks 44071 1727204592.85381: done queuing things up, now waiting for results queue to drain 44071 1727204592.85384: waiting for pending results... 
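The include chain continues one level deeper: the only task of show_interfaces.yml visible in this excerpt is the include at show_interfaces.yml:3. A hedged sketch; anything show_interfaces.yml does after this include (for example printing the gathered list) lies outside this excerpt and is not shown.

# show_interfaces.yml:3 -- hedged reconstruction
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml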
44071 1727204592.85784: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 44071 1727204592.86476: in run() - task 127b8e07-fff9-c964-7471-0000000000ba 44071 1727204592.86481: variable 'ansible_search_path' from source: unknown 44071 1727204592.86484: variable 'ansible_search_path' from source: unknown 44071 1727204592.86486: calling self._execute() 44071 1727204592.86489: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.86491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.86493: variable 'omit' from source: magic vars 44071 1727204592.87463: variable 'ansible_distribution_major_version' from source: facts 44071 1727204592.87493: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204592.87506: _execute() done 44071 1727204592.87515: dumping result to json 44071 1727204592.87523: done dumping result, returning 44071 1727204592.87537: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [127b8e07-fff9-c964-7471-0000000000ba] 44071 1727204592.87549: sending task result for task 127b8e07-fff9-c964-7471-0000000000ba 44071 1727204592.87707: no more pending results, returning what we have 44071 1727204592.87715: in VariableManager get_vars() 44071 1727204592.87753: Calling all_inventory to load vars for managed-node2 44071 1727204592.87757: Calling groups_inventory to load vars for managed-node2 44071 1727204592.87761: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204592.87780: Calling all_plugins_play to load vars for managed-node2 44071 1727204592.87784: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204592.87787: Calling groups_plugins_play to load vars for managed-node2 44071 1727204592.88180: done sending task result for task 127b8e07-fff9-c964-7471-0000000000ba 44071 1727204592.88184: WORKER PROCESS EXITING 44071 1727204592.88216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204592.88782: done with get_vars() 44071 1727204592.88793: variable 'ansible_search_path' from source: unknown 44071 1727204592.88795: variable 'ansible_search_path' from source: unknown 44071 1727204592.88953: we have included files to process 44071 1727204592.88955: generating all_blocks data 44071 1727204592.88957: done generating all_blocks data 44071 1727204592.88959: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44071 1727204592.88960: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44071 1727204592.88964: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44071 1727204592.89596: done processing included file 44071 1727204592.89598: iterating over new_blocks loaded from include file 44071 1727204592.89600: in VariableManager get_vars() 44071 1727204592.89736: done with get_vars() 44071 1727204592.89739: filtering new block on tags 44071 1727204592.89850: done filtering new block on tags 44071 1727204592.89853: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed-node2 44071 1727204592.89860: extending task lists for all hosts with included blocks 44071 1727204592.90301: done extending task lists 44071 1727204592.90303: done processing included files 44071 1727204592.90304: results queue empty 44071 1727204592.90305: checking for any_errors_fatal 44071 1727204592.90309: done checking for any_errors_fatal 44071 1727204592.90310: checking for max_fail_percentage 44071 1727204592.90311: done checking for max_fail_percentage 44071 1727204592.90312: checking to see if all hosts have failed and the running result is not ok 44071 1727204592.90313: done checking to see if all hosts have failed 44071 1727204592.90314: getting the remaining hosts for this loop 44071 1727204592.90315: done getting the remaining hosts for this loop 44071 1727204592.90318: getting the next task for host managed-node2 44071 1727204592.90324: done getting next task for host managed-node2 44071 1727204592.90327: ^ task is: TASK: Gather current interface info 44071 1727204592.90330: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204592.90333: getting variables 44071 1727204592.90334: in VariableManager get_vars() 44071 1727204592.90347: Calling all_inventory to load vars for managed-node2 44071 1727204592.90350: Calling groups_inventory to load vars for managed-node2 44071 1727204592.90519: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204592.90527: Calling all_plugins_play to load vars for managed-node2 44071 1727204592.90530: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204592.90533: Calling groups_plugins_play to load vars for managed-node2 44071 1727204592.91037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204592.91402: done with get_vars() 44071 1727204592.91415: done getting variables 44071 1727204592.91460: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:03:12 -0400 (0:00:00.071) 0:00:05.232 ***** 44071 1727204592.91648: entering _queue_task() for managed-node2/command 44071 1727204592.92394: worker is 1 (out of 1 available) 44071 1727204592.92410: exiting _queue_task() for managed-node2/command 44071 1727204592.92426: done queuing things up, now waiting for results queue to drain 44071 1727204592.92428: waiting for pending results... 
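For context on the task just dispatched: the include that produces it most likely reads as in the sketch below. The task name and the resolved file path come from the log above; the enclosing file and the placement of the when: condition are assumptions (the log only records that the conditional ansible_distribution_major_version != '6' evaluated to True for this task).

    - name: Include the task 'get_current_interfaces.yml'
      include_tasks: get_current_interfaces.yml   # resolves to .../tests/network/playbooks/tasks/get_current_interfaces.yml per the log
      when: ansible_distribution_major_version != '6'   # assumed placement; the log shows only the evaluated conditional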
44071 1727204592.92823: running TaskExecutor() for managed-node2/TASK: Gather current interface info 44071 1727204592.93174: in run() - task 127b8e07-fff9-c964-7471-0000000000f5 44071 1727204592.93202: variable 'ansible_search_path' from source: unknown 44071 1727204592.93211: variable 'ansible_search_path' from source: unknown 44071 1727204592.93261: calling self._execute() 44071 1727204592.93363: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.93772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.93776: variable 'omit' from source: magic vars 44071 1727204592.94418: variable 'ansible_distribution_major_version' from source: facts 44071 1727204592.94446: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204592.94460: variable 'omit' from source: magic vars 44071 1727204592.94526: variable 'omit' from source: magic vars 44071 1727204592.94817: variable 'omit' from source: magic vars 44071 1727204592.95071: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204592.95075: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204592.95078: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204592.95081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204592.95083: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204592.95086: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204592.95088: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.95091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.95323: Set connection var ansible_connection to ssh 44071 1727204592.95341: Set connection var ansible_timeout to 10 44071 1727204592.95352: Set connection var ansible_pipelining to False 44071 1727204592.95363: Set connection var ansible_shell_type to sh 44071 1727204592.95377: Set connection var ansible_shell_executable to /bin/sh 44071 1727204592.95391: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204592.95422: variable 'ansible_shell_executable' from source: unknown 44071 1727204592.95772: variable 'ansible_connection' from source: unknown 44071 1727204592.95777: variable 'ansible_module_compression' from source: unknown 44071 1727204592.95780: variable 'ansible_shell_type' from source: unknown 44071 1727204592.95783: variable 'ansible_shell_executable' from source: unknown 44071 1727204592.95786: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204592.95789: variable 'ansible_pipelining' from source: unknown 44071 1727204592.95792: variable 'ansible_timeout' from source: unknown 44071 1727204592.95795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204592.95901: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204592.95920: variable 'omit' from source: magic vars 44071 
1727204592.95930: starting attempt loop 44071 1727204592.95940: running the handler 44071 1727204592.95960: _low_level_execute_command(): starting 44071 1727204592.96183: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204592.97504: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204592.97590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204592.97688: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204592.97771: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204592.97877: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204592.99655: stdout chunk (state=3): >>>/root <<< 44071 1727204592.99801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204592.99932: stderr chunk (state=3): >>><<< 44071 1727204592.99947: stdout chunk (state=3): >>><<< 44071 1727204592.99983: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204593.00278: _low_level_execute_command(): starting 44071 1727204593.00282: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204593.0017393-44420-54445266627609 `" && echo ansible-tmp-1727204593.0017393-44420-54445266627609="` echo 
/root/.ansible/tmp/ansible-tmp-1727204593.0017393-44420-54445266627609 `" ) && sleep 0' 44071 1727204593.01478: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204593.01785: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204593.01814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204593.01972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204593.03954: stdout chunk (state=3): >>>ansible-tmp-1727204593.0017393-44420-54445266627609=/root/.ansible/tmp/ansible-tmp-1727204593.0017393-44420-54445266627609 <<< 44071 1727204593.04064: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204593.04270: stderr chunk (state=3): >>><<< 44071 1727204593.04285: stdout chunk (state=3): >>><<< 44071 1727204593.04314: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204593.0017393-44420-54445266627609=/root/.ansible/tmp/ansible-tmp-1727204593.0017393-44420-54445266627609 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204593.04356: variable 'ansible_module_compression' from source: unknown 44071 1727204593.04671: ANSIBALLZ: Using generic lock for ansible.legacy.command 44071 1727204593.04675: ANSIBALLZ: Acquiring lock 44071 1727204593.04678: ANSIBALLZ: Lock acquired: 140077513493248 44071 1727204593.04680: ANSIBALLZ: Creating module 44071 1727204593.24153: ANSIBALLZ: 
Writing module into payload 44071 1727204593.24291: ANSIBALLZ: Writing module 44071 1727204593.24326: ANSIBALLZ: Renaming module 44071 1727204593.24338: ANSIBALLZ: Done creating module 44071 1727204593.24361: variable 'ansible_facts' from source: unknown 44071 1727204593.24452: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204593.0017393-44420-54445266627609/AnsiballZ_command.py 44071 1727204593.24618: Sending initial data 44071 1727204593.24650: Sent initial data (155 bytes) 44071 1727204593.25376: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204593.25422: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44071 1727204593.25437: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204593.25482: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204593.25547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204593.25602: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204593.25675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204593.27406: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204593.27470: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204593.27559: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp6khune92 /root/.ansible/tmp/ansible-tmp-1727204593.0017393-44420-54445266627609/AnsiballZ_command.py <<< 44071 1727204593.27563: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204593.0017393-44420-54445266627609/AnsiballZ_command.py" <<< 44071 1727204593.27623: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp6khune92" to remote "/root/.ansible/tmp/ansible-tmp-1727204593.0017393-44420-54445266627609/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204593.0017393-44420-54445266627609/AnsiballZ_command.py" <<< 44071 1727204593.28590: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204593.28595: stderr chunk (state=3): >>><<< 44071 1727204593.28730: stdout chunk (state=3): >>><<< 44071 1727204593.28737: done transferring module to remote 44071 1727204593.28740: _low_level_execute_command(): starting 44071 1727204593.28742: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204593.0017393-44420-54445266627609/ /root/.ansible/tmp/ansible-tmp-1727204593.0017393-44420-54445266627609/AnsiballZ_command.py && sleep 0' 44071 1727204593.29415: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204593.29431: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204593.29496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204593.29520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204593.29549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204593.29660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204593.31705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204593.31710: stderr chunk (state=3): >>><<< 44071 1727204593.31712: stdout chunk (state=3): >>><<< 44071 1727204593.31715: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204593.31718: _low_level_execute_command(): starting 44071 1727204593.31721: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204593.0017393-44420-54445266627609/AnsiballZ_command.py && sleep 0' 44071 1727204593.32340: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204593.32359: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204593.32386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204593.32406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204593.32514: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204593.32546: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204593.32667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204593.49544: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:13.490630", "end": "2024-09-24 15:03:13.494042", "delta": "0:00:00.003412", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44071 1727204593.51115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204593.51174: stderr chunk (state=3): >>><<< 44071 1727204593.51179: stdout chunk (state=3): >>><<< 44071 1727204593.51196: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:13.490630", "end": "2024-09-24 15:03:13.494042", "delta": "0:00:00.003412", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
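The module_args in the result above pin down the underlying command task almost completely; a minimal sketch, with the register name _current_interfaces inferred from the variable lookup later in this log:

    - name: Gather current interface info
      command: ls -1
      args:
        chdir: /sys/class/net   # matches "chdir": "/sys/class/net" in the invocation above
      register: _current_interfaces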
44071 1727204593.51227: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204593.0017393-44420-54445266627609/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204593.51235: _low_level_execute_command(): starting 44071 1727204593.51244: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204593.0017393-44420-54445266627609/ > /dev/null 2>&1 && sleep 0' 44071 1727204593.51974: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204593.51979: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204593.51981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204593.51984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204593.51987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204593.51989: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204593.51991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204593.51998: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204593.52002: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204593.52005: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44071 1727204593.52012: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204593.52019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204593.52033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204593.52045: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204593.52052: stderr chunk (state=3): >>>debug2: match found <<< 44071 1727204593.52062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204593.52199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204593.52203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204593.52280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204593.54373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204593.54378: stdout chunk (state=3): >>><<< 44071 1727204593.54381: stderr chunk (state=3): >>><<< 44071 1727204593.54384: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204593.54386: handler run complete 44071 1727204593.54389: Evaluated conditional (False): False 44071 1727204593.54391: attempt loop complete, returning result 44071 1727204593.54393: _execute() done 44071 1727204593.54396: dumping result to json 44071 1727204593.54402: done dumping result, returning 44071 1727204593.54425: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [127b8e07-fff9-c964-7471-0000000000f5] 44071 1727204593.54428: sending task result for task 127b8e07-fff9-c964-7471-0000000000f5 44071 1727204593.54551: done sending task result for task 127b8e07-fff9-c964-7471-0000000000f5 44071 1727204593.54554: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003412", "end": "2024-09-24 15:03:13.494042", "rc": 0, "start": "2024-09-24 15:03:13.490630" } STDOUT: bonding_masters eth0 lo 44071 1727204593.54773: no more pending results, returning what we have 44071 1727204593.54777: results queue empty 44071 1727204593.54778: checking for any_errors_fatal 44071 1727204593.54780: done checking for any_errors_fatal 44071 1727204593.54780: checking for max_fail_percentage 44071 1727204593.54782: done checking for max_fail_percentage 44071 1727204593.54783: checking to see if all hosts have failed and the running result is not ok 44071 1727204593.54783: done checking to see if all hosts have failed 44071 1727204593.54784: getting the remaining hosts for this loop 44071 1727204593.54786: done getting the remaining hosts for this loop 44071 1727204593.54791: getting the next task for host managed-node2 44071 1727204593.54799: done getting next task for host managed-node2 44071 1727204593.54802: ^ task is: TASK: Set current_interfaces 44071 1727204593.54807: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204593.54811: getting variables 44071 1727204593.54812: in VariableManager get_vars() 44071 1727204593.54846: Calling all_inventory to load vars for managed-node2 44071 1727204593.54850: Calling groups_inventory to load vars for managed-node2 44071 1727204593.54854: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204593.54985: Calling all_plugins_play to load vars for managed-node2 44071 1727204593.54991: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204593.54995: Calling groups_plugins_play to load vars for managed-node2 44071 1727204593.55321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204593.55463: done with get_vars() 44071 1727204593.55475: done getting variables 44071 1727204593.55521: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:03:13 -0400 (0:00:00.638) 0:00:05.871 ***** 44071 1727204593.55548: entering _queue_task() for managed-node2/set_fact 44071 1727204593.55793: worker is 1 (out of 1 available) 44071 1727204593.55807: exiting _queue_task() for managed-node2/set_fact 44071 1727204593.55821: done queuing things up, now waiting for results queue to drain 44071 1727204593.55823: waiting for pending results... 
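The set_fact task being queued here (get_current_interfaces.yml:9) evidently turns the registered command output into the current_interfaces list; a plausible sketch, where the stdout_lines expression is an assumption (the log only shows the resulting list ['bonding_masters', 'eth0', 'lo']):

    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"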
44071 1727204593.55997: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 44071 1727204593.56077: in run() - task 127b8e07-fff9-c964-7471-0000000000f6 44071 1727204593.56089: variable 'ansible_search_path' from source: unknown 44071 1727204593.56093: variable 'ansible_search_path' from source: unknown 44071 1727204593.56125: calling self._execute() 44071 1727204593.56196: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204593.56202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204593.56211: variable 'omit' from source: magic vars 44071 1727204593.56508: variable 'ansible_distribution_major_version' from source: facts 44071 1727204593.56520: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204593.56526: variable 'omit' from source: magic vars 44071 1727204593.56569: variable 'omit' from source: magic vars 44071 1727204593.56654: variable '_current_interfaces' from source: set_fact 44071 1727204593.56706: variable 'omit' from source: magic vars 44071 1727204593.56743: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204593.56774: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204593.56791: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204593.56807: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204593.56818: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204593.56846: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204593.56849: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204593.56852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204593.56925: Set connection var ansible_connection to ssh 44071 1727204593.56930: Set connection var ansible_timeout to 10 44071 1727204593.56940: Set connection var ansible_pipelining to False 44071 1727204593.56946: Set connection var ansible_shell_type to sh 44071 1727204593.56952: Set connection var ansible_shell_executable to /bin/sh 44071 1727204593.56958: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204593.56979: variable 'ansible_shell_executable' from source: unknown 44071 1727204593.56983: variable 'ansible_connection' from source: unknown 44071 1727204593.56986: variable 'ansible_module_compression' from source: unknown 44071 1727204593.56988: variable 'ansible_shell_type' from source: unknown 44071 1727204593.56991: variable 'ansible_shell_executable' from source: unknown 44071 1727204593.56993: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204593.56995: variable 'ansible_pipelining' from source: unknown 44071 1727204593.57000: variable 'ansible_timeout' from source: unknown 44071 1727204593.57004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204593.57120: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 44071 1727204593.57132: variable 'omit' from source: magic vars 44071 1727204593.57135: starting attempt loop 44071 1727204593.57138: running the handler 44071 1727204593.57157: handler run complete 44071 1727204593.57160: attempt loop complete, returning result 44071 1727204593.57162: _execute() done 44071 1727204593.57165: dumping result to json 44071 1727204593.57172: done dumping result, returning 44071 1727204593.57180: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [127b8e07-fff9-c964-7471-0000000000f6] 44071 1727204593.57183: sending task result for task 127b8e07-fff9-c964-7471-0000000000f6 44071 1727204593.57275: done sending task result for task 127b8e07-fff9-c964-7471-0000000000f6 44071 1727204593.57278: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 44071 1727204593.57336: no more pending results, returning what we have 44071 1727204593.57339: results queue empty 44071 1727204593.57340: checking for any_errors_fatal 44071 1727204593.57351: done checking for any_errors_fatal 44071 1727204593.57351: checking for max_fail_percentage 44071 1727204593.57353: done checking for max_fail_percentage 44071 1727204593.57353: checking to see if all hosts have failed and the running result is not ok 44071 1727204593.57354: done checking to see if all hosts have failed 44071 1727204593.57355: getting the remaining hosts for this loop 44071 1727204593.57357: done getting the remaining hosts for this loop 44071 1727204593.57361: getting the next task for host managed-node2 44071 1727204593.57371: done getting next task for host managed-node2 44071 1727204593.57374: ^ task is: TASK: Show current_interfaces 44071 1727204593.57377: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204593.57380: getting variables 44071 1727204593.57382: in VariableManager get_vars() 44071 1727204593.57409: Calling all_inventory to load vars for managed-node2 44071 1727204593.57412: Calling groups_inventory to load vars for managed-node2 44071 1727204593.57415: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204593.57425: Calling all_plugins_play to load vars for managed-node2 44071 1727204593.57427: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204593.57430: Calling groups_plugins_play to load vars for managed-node2 44071 1727204593.57590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204593.57731: done with get_vars() 44071 1727204593.57741: done getting variables 44071 1727204593.57787: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:03:13 -0400 (0:00:00.022) 0:00:05.894 ***** 44071 1727204593.57811: entering _queue_task() for managed-node2/debug 44071 1727204593.58048: worker is 1 (out of 1 available) 44071 1727204593.58063: exiting _queue_task() for managed-node2/debug 44071 1727204593.58079: done queuing things up, now waiting for results queue to drain 44071 1727204593.58081: waiting for pending results... 44071 1727204593.58252: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 44071 1727204593.58327: in run() - task 127b8e07-fff9-c964-7471-0000000000bb 44071 1727204593.58342: variable 'ansible_search_path' from source: unknown 44071 1727204593.58346: variable 'ansible_search_path' from source: unknown 44071 1727204593.58380: calling self._execute() 44071 1727204593.58449: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204593.58455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204593.58464: variable 'omit' from source: magic vars 44071 1727204593.58757: variable 'ansible_distribution_major_version' from source: facts 44071 1727204593.58769: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204593.58776: variable 'omit' from source: magic vars 44071 1727204593.58809: variable 'omit' from source: magic vars 44071 1727204593.58892: variable 'current_interfaces' from source: set_fact 44071 1727204593.58914: variable 'omit' from source: magic vars 44071 1727204593.58951: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204593.58985: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204593.59002: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204593.59018: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204593.59028: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204593.59055: 
variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204593.59059: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204593.59061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204593.59137: Set connection var ansible_connection to ssh 44071 1727204593.59144: Set connection var ansible_timeout to 10 44071 1727204593.59150: Set connection var ansible_pipelining to False 44071 1727204593.59156: Set connection var ansible_shell_type to sh 44071 1727204593.59161: Set connection var ansible_shell_executable to /bin/sh 44071 1727204593.59170: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204593.59195: variable 'ansible_shell_executable' from source: unknown 44071 1727204593.59199: variable 'ansible_connection' from source: unknown 44071 1727204593.59201: variable 'ansible_module_compression' from source: unknown 44071 1727204593.59204: variable 'ansible_shell_type' from source: unknown 44071 1727204593.59206: variable 'ansible_shell_executable' from source: unknown 44071 1727204593.59209: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204593.59211: variable 'ansible_pipelining' from source: unknown 44071 1727204593.59213: variable 'ansible_timeout' from source: unknown 44071 1727204593.59215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204593.59333: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204593.59346: variable 'omit' from source: magic vars 44071 1727204593.59349: starting attempt loop 44071 1727204593.59352: running the handler 44071 1727204593.59395: handler run complete 44071 1727204593.59412: attempt loop complete, returning result 44071 1727204593.59415: _execute() done 44071 1727204593.59418: dumping result to json 44071 1727204593.59420: done dumping result, returning 44071 1727204593.59427: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [127b8e07-fff9-c964-7471-0000000000bb] 44071 1727204593.59432: sending task result for task 127b8e07-fff9-c964-7471-0000000000bb 44071 1727204593.59527: done sending task result for task 127b8e07-fff9-c964-7471-0000000000bb 44071 1727204593.59530: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 44071 1727204593.59582: no more pending results, returning what we have 44071 1727204593.59585: results queue empty 44071 1727204593.59586: checking for any_errors_fatal 44071 1727204593.59592: done checking for any_errors_fatal 44071 1727204593.59592: checking for max_fail_percentage 44071 1727204593.59594: done checking for max_fail_percentage 44071 1727204593.59594: checking to see if all hosts have failed and the running result is not ok 44071 1727204593.59595: done checking to see if all hosts have failed 44071 1727204593.59596: getting the remaining hosts for this loop 44071 1727204593.59597: done getting the remaining hosts for this loop 44071 1727204593.59602: getting the next task for host managed-node2 44071 1727204593.59611: done getting next task for host managed-node2 44071 1727204593.59614: ^ task is: TASK: Setup 44071 1727204593.59617: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204593.59622: getting variables 44071 1727204593.59624: in VariableManager get_vars() 44071 1727204593.59653: Calling all_inventory to load vars for managed-node2 44071 1727204593.59656: Calling groups_inventory to load vars for managed-node2 44071 1727204593.59659: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204593.59672: Calling all_plugins_play to load vars for managed-node2 44071 1727204593.59674: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204593.59677: Calling groups_plugins_play to load vars for managed-node2 44071 1727204593.59865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204593.60007: done with get_vars() 44071 1727204593.60015: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Tuesday 24 September 2024 15:03:13 -0400 (0:00:00.022) 0:00:05.917 ***** 44071 1727204593.60086: entering _queue_task() for managed-node2/include_tasks 44071 1727204593.60324: worker is 1 (out of 1 available) 44071 1727204593.60337: exiting _queue_task() for managed-node2/include_tasks 44071 1727204593.60352: done queuing things up, now waiting for results queue to drain 44071 1727204593.60353: waiting for pending results... 
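The debug task whose output appears above (show_interfaces.yml:5) can be read back almost verbatim from the MSG line; only the exact Jinja expression is assumed:

    - name: Show current_interfaces
      debug:
        msg: "current_interfaces: {{ current_interfaces }}"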
44071 1727204593.60529: running TaskExecutor() for managed-node2/TASK: Setup 44071 1727204593.60602: in run() - task 127b8e07-fff9-c964-7471-000000000094 44071 1727204593.60615: variable 'ansible_search_path' from source: unknown 44071 1727204593.60619: variable 'ansible_search_path' from source: unknown 44071 1727204593.60659: variable 'lsr_setup' from source: include params 44071 1727204593.60831: variable 'lsr_setup' from source: include params 44071 1727204593.60888: variable 'omit' from source: magic vars 44071 1727204593.60982: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204593.60991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204593.61000: variable 'omit' from source: magic vars 44071 1727204593.61185: variable 'ansible_distribution_major_version' from source: facts 44071 1727204593.61194: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204593.61199: variable 'item' from source: unknown 44071 1727204593.61256: variable 'item' from source: unknown 44071 1727204593.61285: variable 'item' from source: unknown 44071 1727204593.61330: variable 'item' from source: unknown 44071 1727204593.61471: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204593.61475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204593.61477: variable 'omit' from source: magic vars 44071 1727204593.61557: variable 'ansible_distribution_major_version' from source: facts 44071 1727204593.61561: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204593.61569: variable 'item' from source: unknown 44071 1727204593.61617: variable 'item' from source: unknown 44071 1727204593.61641: variable 'item' from source: unknown 44071 1727204593.61686: variable 'item' from source: unknown 44071 1727204593.61762: dumping result to json 44071 1727204593.61765: done dumping result, returning 44071 1727204593.61769: done running TaskExecutor() for managed-node2/TASK: Setup [127b8e07-fff9-c964-7471-000000000094] 44071 1727204593.61771: sending task result for task 127b8e07-fff9-c964-7471-000000000094 44071 1727204593.61808: done sending task result for task 127b8e07-fff9-c964-7471-000000000094 44071 1727204593.61811: WORKER PROCESS EXITING 44071 1727204593.61840: no more pending results, returning what we have 44071 1727204593.61844: in VariableManager get_vars() 44071 1727204593.61883: Calling all_inventory to load vars for managed-node2 44071 1727204593.61887: Calling groups_inventory to load vars for managed-node2 44071 1727204593.61890: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204593.61904: Calling all_plugins_play to load vars for managed-node2 44071 1727204593.61907: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204593.61910: Calling groups_plugins_play to load vars for managed-node2 44071 1727204593.62090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204593.62224: done with get_vars() 44071 1727204593.62231: variable 'ansible_search_path' from source: unknown 44071 1727204593.62232: variable 'ansible_search_path' from source: unknown 44071 1727204593.62268: variable 'ansible_search_path' from source: unknown 44071 1727204593.62269: variable 'ansible_search_path' from source: unknown 44071 1727204593.62288: we have included files to process 44071 1727204593.62289: generating all_blocks data 44071 
1727204593.62290: done generating all_blocks data 44071 1727204593.62294: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 44071 1727204593.62295: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 44071 1727204593.62296: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 44071 1727204593.62467: done processing included file 44071 1727204593.62469: iterating over new_blocks loaded from include file 44071 1727204593.62470: in VariableManager get_vars() 44071 1727204593.62481: done with get_vars() 44071 1727204593.62482: filtering new block on tags 44071 1727204593.62498: done filtering new block on tags 44071 1727204593.62500: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node2 => (item=tasks/delete_interface.yml) 44071 1727204593.62503: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 44071 1727204593.62504: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 44071 1727204593.62506: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 44071 1727204593.62618: in VariableManager get_vars() 44071 1727204593.62632: done with get_vars() 44071 1727204593.62719: done processing included file 44071 1727204593.62720: iterating over new_blocks loaded from include file 44071 1727204593.62721: in VariableManager get_vars() 44071 1727204593.62729: done with get_vars() 44071 1727204593.62731: filtering new block on tags 44071 1727204593.62755: done filtering new block on tags 44071 1727204593.62756: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node2 => (item=tasks/assert_device_absent.yml) 44071 1727204593.62759: extending task lists for all hosts with included blocks 44071 1727204593.63146: done extending task lists 44071 1727204593.63147: done processing included files 44071 1727204593.63147: results queue empty 44071 1727204593.63148: checking for any_errors_fatal 44071 1727204593.63150: done checking for any_errors_fatal 44071 1727204593.63151: checking for max_fail_percentage 44071 1727204593.63152: done checking for max_fail_percentage 44071 1727204593.63152: checking to see if all hosts have failed and the running result is not ok 44071 1727204593.63153: done checking to see if all hosts have failed 44071 1727204593.63153: getting the remaining hosts for this loop 44071 1727204593.63154: done getting the remaining hosts for this loop 44071 1727204593.63156: getting the next task for host managed-node2 44071 1727204593.63159: done getting next task for host managed-node2 44071 1727204593.63161: ^ task is: TASK: Remove test interface if necessary 44071 1727204593.63163: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204593.63165: getting variables 44071 1727204593.63167: in VariableManager get_vars() 44071 1727204593.63179: Calling all_inventory to load vars for managed-node2 44071 1727204593.63180: Calling groups_inventory to load vars for managed-node2 44071 1727204593.63182: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204593.63186: Calling all_plugins_play to load vars for managed-node2 44071 1727204593.63188: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204593.63189: Calling groups_plugins_play to load vars for managed-node2 44071 1727204593.63309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204593.63447: done with get_vars() 44071 1727204593.63454: done getting variables 44071 1727204593.63486: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Tuesday 24 September 2024 15:03:13 -0400 (0:00:00.034) 0:00:05.951 ***** 44071 1727204593.63509: entering _queue_task() for managed-node2/command 44071 1727204593.63773: worker is 1 (out of 1 available) 44071 1727204593.63790: exiting _queue_task() for managed-node2/command 44071 1727204593.63804: done queuing things up, now waiting for results queue to drain 44071 1727204593.63805: waiting for pending results... 
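The Setup task at run_test.yml:24 is an include that iterates over lsr_setup, and the two (item=...) entries above identify its values for this run; a sketch under those assumptions (the exact loop keyword and where lsr_setup is defined are not visible in the log):

    - name: Setup
      include_tasks: "{{ item }}"
      loop: "{{ lsr_setup }}"

    # For this test run the log shows lsr_setup expanding to:
    #   - tasks/delete_interface.yml
    #   - tasks/assert_device_absent.yml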
44071 1727204593.63970: running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary 44071 1727204593.64048: in run() - task 127b8e07-fff9-c964-7471-00000000011b 44071 1727204593.64057: variable 'ansible_search_path' from source: unknown 44071 1727204593.64060: variable 'ansible_search_path' from source: unknown 44071 1727204593.64093: calling self._execute() 44071 1727204593.64159: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204593.64164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204593.64174: variable 'omit' from source: magic vars 44071 1727204593.64470: variable 'ansible_distribution_major_version' from source: facts 44071 1727204593.64483: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204593.64490: variable 'omit' from source: magic vars 44071 1727204593.64523: variable 'omit' from source: magic vars 44071 1727204593.64604: variable 'interface' from source: play vars 44071 1727204593.64618: variable 'omit' from source: magic vars 44071 1727204593.64654: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204593.64686: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204593.64706: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204593.64721: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204593.64732: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204593.64757: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204593.64760: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204593.64763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204593.64843: Set connection var ansible_connection to ssh 44071 1727204593.64849: Set connection var ansible_timeout to 10 44071 1727204593.64855: Set connection var ansible_pipelining to False 44071 1727204593.64860: Set connection var ansible_shell_type to sh 44071 1727204593.64868: Set connection var ansible_shell_executable to /bin/sh 44071 1727204593.64875: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204593.64895: variable 'ansible_shell_executable' from source: unknown 44071 1727204593.64898: variable 'ansible_connection' from source: unknown 44071 1727204593.64901: variable 'ansible_module_compression' from source: unknown 44071 1727204593.64903: variable 'ansible_shell_type' from source: unknown 44071 1727204593.64906: variable 'ansible_shell_executable' from source: unknown 44071 1727204593.64910: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204593.64912: variable 'ansible_pipelining' from source: unknown 44071 1727204593.64915: variable 'ansible_timeout' from source: unknown 44071 1727204593.64925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204593.65032: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 44071 1727204593.65049: variable 'omit' from source: magic vars 44071 1727204593.65053: starting attempt loop 44071 1727204593.65056: running the handler 44071 1727204593.65073: _low_level_execute_command(): starting 44071 1727204593.65080: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204593.65649: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204593.65654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204593.65657: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204593.65661: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204593.65713: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204593.65716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204593.65792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204593.67476: stdout chunk (state=3): >>>/root <<< 44071 1727204593.67584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204593.67652: stderr chunk (state=3): >>><<< 44071 1727204593.67658: stdout chunk (state=3): >>><<< 44071 1727204593.67680: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204593.67692: _low_level_execute_command(): starting 44071 1727204593.67699: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp 
`"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204593.6768012-44451-164408587066994 `" && echo ansible-tmp-1727204593.6768012-44451-164408587066994="` echo /root/.ansible/tmp/ansible-tmp-1727204593.6768012-44451-164408587066994 `" ) && sleep 0' 44071 1727204593.68204: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204593.68208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204593.68219: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204593.68222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204593.68282: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204593.68287: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204593.68289: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204593.68356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204593.70332: stdout chunk (state=3): >>>ansible-tmp-1727204593.6768012-44451-164408587066994=/root/.ansible/tmp/ansible-tmp-1727204593.6768012-44451-164408587066994 <<< 44071 1727204593.70453: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204593.70517: stderr chunk (state=3): >>><<< 44071 1727204593.70521: stdout chunk (state=3): >>><<< 44071 1727204593.70541: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204593.6768012-44451-164408587066994=/root/.ansible/tmp/ansible-tmp-1727204593.6768012-44451-164408587066994 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 44071 1727204593.70571: variable 'ansible_module_compression' from source: unknown 44071 1727204593.70619: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44071 1727204593.70653: variable 'ansible_facts' from source: unknown 44071 1727204593.70713: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204593.6768012-44451-164408587066994/AnsiballZ_command.py 44071 1727204593.70827: Sending initial data 44071 1727204593.70830: Sent initial data (156 bytes) 44071 1727204593.71341: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204593.71344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204593.71347: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204593.71349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204593.71351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204593.71418: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204593.71422: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204593.71429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204593.71498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204593.73121: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204593.73186: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204593.73256: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpqtbvhu77 /root/.ansible/tmp/ansible-tmp-1727204593.6768012-44451-164408587066994/AnsiballZ_command.py <<< 44071 1727204593.73260: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204593.6768012-44451-164408587066994/AnsiballZ_command.py" <<< 44071 1727204593.73322: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpqtbvhu77" to remote "/root/.ansible/tmp/ansible-tmp-1727204593.6768012-44451-164408587066994/AnsiballZ_command.py" <<< 44071 1727204593.73327: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204593.6768012-44451-164408587066994/AnsiballZ_command.py" <<< 44071 1727204593.73998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204593.74076: stderr chunk (state=3): >>><<< 44071 1727204593.74080: stdout chunk (state=3): >>><<< 44071 1727204593.74103: done transferring module to remote 44071 1727204593.74116: _low_level_execute_command(): starting 44071 1727204593.74119: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204593.6768012-44451-164408587066994/ /root/.ansible/tmp/ansible-tmp-1727204593.6768012-44451-164408587066994/AnsiballZ_command.py && sleep 0' 44071 1727204593.74610: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204593.74614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204593.74617: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204593.74624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204593.74626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204593.74674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204593.74687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204593.74694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204593.74755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204593.76576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204593.76639: stderr chunk (state=3): >>><<< 44071 1727204593.76643: stdout chunk (state=3): >>><<< 44071 1727204593.76655: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204593.76658: _low_level_execute_command(): starting 44071 1727204593.76664: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204593.6768012-44451-164408587066994/AnsiballZ_command.py && sleep 0' 44071 1727204593.77174: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204593.77178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204593.77181: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204593.77195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204593.77245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204593.77248: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204593.77250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204593.77330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204593.94708: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-24 15:03:13.937758", "end": "2024-09-24 15:03:13.945508", "delta": "0:00:00.007750", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44071 1727204593.96240: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 
10.31.47.73 closed. <<< 44071 1727204593.96245: stdout chunk (state=3): >>><<< 44071 1727204593.96248: stderr chunk (state=3): >>><<< 44071 1727204593.96406: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-24 15:03:13.937758", "end": "2024-09-24 15:03:13.945508", "delta": "0:00:00.007750", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.47.73 closed. 
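The JSON above is the raw return of the ansible.legacy.command module: rc=1 with stderr 'Cannot find device "statebr"' simply means the test interface was not present, so there was nothing to delete, and the play treats that as acceptable (see the "...ignoring" marker a few entries below). As a hypothetical alternative to ignoring the error outright, the same tolerance could be expressed by registering the result and gating on rc; none of the names below come from this log:

  # Hypothetical alternative, not the file used in this run.
  - name: Remove test interface if necessary
    command: ip link del "{{ interface }}"
    register: del_result                        # made-up register name
    failed_when: del_result.rc not in [0, 1]    # rc 1 == device already absent
    changed_when: del_result.rc == 0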
44071 1727204593.96410: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204593.6768012-44451-164408587066994/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204593.96413: _low_level_execute_command(): starting 44071 1727204593.96416: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204593.6768012-44451-164408587066994/ > /dev/null 2>&1 && sleep 0' 44071 1727204593.97004: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204593.97020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204593.97042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204593.97069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204593.97089: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204593.97177: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204593.97205: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204593.97227: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204593.97250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204593.97358: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204593.99273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204593.99383: stderr chunk (state=3): >>><<< 44071 1727204593.99397: stdout chunk (state=3): >>><<< 44071 1727204593.99423: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204593.99435: handler run complete 44071 1727204593.99471: Evaluated conditional (False): False 44071 1727204593.99488: attempt loop complete, returning result 44071 1727204593.99501: _execute() done 44071 1727204593.99509: dumping result to json 44071 1727204593.99570: done dumping result, returning 44071 1727204593.99573: done running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary [127b8e07-fff9-c964-7471-00000000011b] 44071 1727204593.99576: sending task result for task 127b8e07-fff9-c964-7471-00000000011b fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": [ "ip", "link", "del", "statebr" ], "delta": "0:00:00.007750", "end": "2024-09-24 15:03:13.945508", "rc": 1, "start": "2024-09-24 15:03:13.937758" } STDERR: Cannot find device "statebr" MSG: non-zero return code ...ignoring 44071 1727204593.99744: no more pending results, returning what we have 44071 1727204593.99748: results queue empty 44071 1727204593.99749: checking for any_errors_fatal 44071 1727204593.99751: done checking for any_errors_fatal 44071 1727204593.99751: checking for max_fail_percentage 44071 1727204593.99753: done checking for max_fail_percentage 44071 1727204593.99754: checking to see if all hosts have failed and the running result is not ok 44071 1727204593.99755: done checking to see if all hosts have failed 44071 1727204593.99756: getting the remaining hosts for this loop 44071 1727204593.99757: done getting the remaining hosts for this loop 44071 1727204593.99763: getting the next task for host managed-node2 44071 1727204593.99776: done getting next task for host managed-node2 44071 1727204593.99779: ^ task is: TASK: Include the task 'get_interface_stat.yml' 44071 1727204593.99785: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204593.99789: getting variables 44071 1727204593.99790: in VariableManager get_vars() 44071 1727204593.99822: Calling all_inventory to load vars for managed-node2 44071 1727204593.99825: Calling groups_inventory to load vars for managed-node2 44071 1727204593.99829: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204593.99844: Calling all_plugins_play to load vars for managed-node2 44071 1727204593.99848: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204593.99851: Calling groups_plugins_play to load vars for managed-node2 44071 1727204594.00432: done sending task result for task 127b8e07-fff9-c964-7471-00000000011b 44071 1727204594.00436: WORKER PROCESS EXITING 44071 1727204594.00463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204594.00729: done with get_vars() 44071 1727204594.00746: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 15:03:14 -0400 (0:00:00.373) 0:00:06.325 ***** 44071 1727204594.00854: entering _queue_task() for managed-node2/include_tasks 44071 1727204594.01299: worker is 1 (out of 1 available) 44071 1727204594.01312: exiting _queue_task() for managed-node2/include_tasks 44071 1727204594.01324: done queuing things up, now waiting for results queue to drain 44071 1727204594.01326: waiting for pending results... 44071 1727204594.01522: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 44071 1727204594.01665: in run() - task 127b8e07-fff9-c964-7471-00000000011f 44071 1727204594.01690: variable 'ansible_search_path' from source: unknown 44071 1727204594.01697: variable 'ansible_search_path' from source: unknown 44071 1727204594.01749: calling self._execute() 44071 1727204594.01848: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204594.01861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204594.01878: variable 'omit' from source: magic vars 44071 1727204594.02394: variable 'ansible_distribution_major_version' from source: facts 44071 1727204594.02414: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204594.02430: _execute() done 44071 1727204594.02441: dumping result to json 44071 1727204594.02450: done dumping result, returning 44071 1727204594.02468: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [127b8e07-fff9-c964-7471-00000000011f] 44071 1727204594.02478: sending task result for task 127b8e07-fff9-c964-7471-00000000011f 44071 1727204594.02701: done sending task result for task 127b8e07-fff9-c964-7471-00000000011f 44071 1727204594.02704: WORKER PROCESS EXITING 44071 1727204594.02736: no more pending results, returning what we have 44071 1727204594.02745: in VariableManager get_vars() 44071 1727204594.02872: Calling all_inventory to load vars for managed-node2 44071 1727204594.02876: Calling groups_inventory to load vars for managed-node2 44071 1727204594.02880: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204594.02898: Calling all_plugins_play to load vars for managed-node2 44071 1727204594.02901: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204594.02904: Calling groups_plugins_play 
to load vars for managed-node2 44071 1727204594.03313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204594.03563: done with get_vars() 44071 1727204594.03582: variable 'ansible_search_path' from source: unknown 44071 1727204594.03584: variable 'ansible_search_path' from source: unknown 44071 1727204594.03595: variable 'item' from source: include params 44071 1727204594.03741: variable 'item' from source: include params 44071 1727204594.03792: we have included files to process 44071 1727204594.03794: generating all_blocks data 44071 1727204594.03796: done generating all_blocks data 44071 1727204594.03801: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44071 1727204594.03802: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44071 1727204594.03805: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44071 1727204594.04102: done processing included file 44071 1727204594.04105: iterating over new_blocks loaded from include file 44071 1727204594.04106: in VariableManager get_vars() 44071 1727204594.04125: done with get_vars() 44071 1727204594.04127: filtering new block on tags 44071 1727204594.04160: done filtering new block on tags 44071 1727204594.04164: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 44071 1727204594.04173: extending task lists for all hosts with included blocks 44071 1727204594.04372: done extending task lists 44071 1727204594.04374: done processing included files 44071 1727204594.04374: results queue empty 44071 1727204594.04375: checking for any_errors_fatal 44071 1727204594.04381: done checking for any_errors_fatal 44071 1727204594.04381: checking for max_fail_percentage 44071 1727204594.04383: done checking for max_fail_percentage 44071 1727204594.04383: checking to see if all hosts have failed and the running result is not ok 44071 1727204594.04384: done checking to see if all hosts have failed 44071 1727204594.04385: getting the remaining hosts for this loop 44071 1727204594.04386: done getting the remaining hosts for this loop 44071 1727204594.04389: getting the next task for host managed-node2 44071 1727204594.04394: done getting next task for host managed-node2 44071 1727204594.04397: ^ task is: TASK: Get stat for interface {{ interface }} 44071 1727204594.04400: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204594.04403: getting variables 44071 1727204594.04404: in VariableManager get_vars() 44071 1727204594.04417: Calling all_inventory to load vars for managed-node2 44071 1727204594.04424: Calling groups_inventory to load vars for managed-node2 44071 1727204594.04427: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204594.04434: Calling all_plugins_play to load vars for managed-node2 44071 1727204594.04439: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204594.04444: Calling groups_plugins_play to load vars for managed-node2 44071 1727204594.04630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204594.05074: done with get_vars() 44071 1727204594.05090: done getting variables 44071 1727204594.05241: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:03:14 -0400 (0:00:00.044) 0:00:06.369 ***** 44071 1727204594.05274: entering _queue_task() for managed-node2/stat 44071 1727204594.05772: worker is 1 (out of 1 available) 44071 1727204594.05784: exiting _queue_task() for managed-node2/stat 44071 1727204594.05795: done queuing things up, now waiting for results queue to drain 44071 1727204594.05796: waiting for pending results... 
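The stat task queued here is defined at tasks/get_interface_stat.yml:3. Its module arguments are visible further down in the AnsiballZ_stat.py result (path /sys/class/net/statebr with get_attributes, get_checksum and get_mime all false), which points to a task roughly like this sketch; the register name is a guess, everything else mirrors the logged module args:

  # Inferred sketch of tasks/get_interface_stat.yml
  - name: Get stat for interface {{ interface }}
    stat:
      path: /sys/class/net/{{ interface }}
      get_attributes: false
      get_checksum: false
      get_mime: false
    register: interface_stat    # hypothetical name, not shown in this log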
44071 1727204594.05988: running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr 44071 1727204594.06102: in run() - task 127b8e07-fff9-c964-7471-00000000016e 44071 1727204594.06131: variable 'ansible_search_path' from source: unknown 44071 1727204594.06192: variable 'ansible_search_path' from source: unknown 44071 1727204594.06196: calling self._execute() 44071 1727204594.06285: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204594.06301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204594.06316: variable 'omit' from source: magic vars 44071 1727204594.06769: variable 'ansible_distribution_major_version' from source: facts 44071 1727204594.06794: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204594.06805: variable 'omit' from source: magic vars 44071 1727204594.06891: variable 'omit' from source: magic vars 44071 1727204594.07064: variable 'interface' from source: play vars 44071 1727204594.07070: variable 'omit' from source: magic vars 44071 1727204594.07093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204594.07142: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204594.07174: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204594.07198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204594.07221: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204594.07259: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204594.07269: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204594.07325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204594.07404: Set connection var ansible_connection to ssh 44071 1727204594.07415: Set connection var ansible_timeout to 10 44071 1727204594.07425: Set connection var ansible_pipelining to False 44071 1727204594.07498: Set connection var ansible_shell_type to sh 44071 1727204594.07501: Set connection var ansible_shell_executable to /bin/sh 44071 1727204594.07503: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204594.07506: variable 'ansible_shell_executable' from source: unknown 44071 1727204594.07508: variable 'ansible_connection' from source: unknown 44071 1727204594.07511: variable 'ansible_module_compression' from source: unknown 44071 1727204594.07512: variable 'ansible_shell_type' from source: unknown 44071 1727204594.07515: variable 'ansible_shell_executable' from source: unknown 44071 1727204594.07517: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204594.07525: variable 'ansible_pipelining' from source: unknown 44071 1727204594.07532: variable 'ansible_timeout' from source: unknown 44071 1727204594.07547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204594.07788: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204594.07823: variable 'omit' 
from source: magic vars 44071 1727204594.07826: starting attempt loop 44071 1727204594.07828: running the handler 44071 1727204594.07869: _low_level_execute_command(): starting 44071 1727204594.07872: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204594.08772: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204594.08797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204594.08818: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204594.08849: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204594.08968: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204594.10687: stdout chunk (state=3): >>>/root <<< 44071 1727204594.10773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204594.10970: stderr chunk (state=3): >>><<< 44071 1727204594.10978: stdout chunk (state=3): >>><<< 44071 1727204594.10982: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204594.10985: _low_level_execute_command(): starting 44071 1727204594.10989: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204594.1090822-44465-158518671209345 `" && echo ansible-tmp-1727204594.1090822-44465-158518671209345="` echo 
/root/.ansible/tmp/ansible-tmp-1727204594.1090822-44465-158518671209345 `" ) && sleep 0' 44071 1727204594.11673: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204594.11692: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204594.11705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204594.11723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204594.11743: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204594.11768: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204594.11809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204594.11821: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204594.11878: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204594.11895: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204594.11988: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204594.13997: stdout chunk (state=3): >>>ansible-tmp-1727204594.1090822-44465-158518671209345=/root/.ansible/tmp/ansible-tmp-1727204594.1090822-44465-158518671209345 <<< 44071 1727204594.14195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204594.14199: stdout chunk (state=3): >>><<< 44071 1727204594.14202: stderr chunk (state=3): >>><<< 44071 1727204594.14222: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204594.1090822-44465-158518671209345=/root/.ansible/tmp/ansible-tmp-1727204594.1090822-44465-158518671209345 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 44071 1727204594.14294: variable 'ansible_module_compression' from source: unknown 44071 1727204594.14372: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 44071 1727204594.14418: variable 'ansible_facts' from source: unknown 44071 1727204594.14556: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204594.1090822-44465-158518671209345/AnsiballZ_stat.py 44071 1727204594.14794: Sending initial data 44071 1727204594.14805: Sent initial data (153 bytes) 44071 1727204594.15318: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204594.15365: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204594.15424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204594.15427: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204594.15430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204594.15503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204594.17099: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204594.17174: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204594.17273: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpkm1r_kct /root/.ansible/tmp/ansible-tmp-1727204594.1090822-44465-158518671209345/AnsiballZ_stat.py <<< 44071 1727204594.17283: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204594.1090822-44465-158518671209345/AnsiballZ_stat.py" <<< 44071 1727204594.17347: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpkm1r_kct" to remote "/root/.ansible/tmp/ansible-tmp-1727204594.1090822-44465-158518671209345/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204594.1090822-44465-158518671209345/AnsiballZ_stat.py" <<< 44071 1727204594.18320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204594.18335: stderr chunk (state=3): >>><<< 44071 1727204594.18473: stdout chunk (state=3): >>><<< 44071 1727204594.18477: done transferring module to remote 44071 1727204594.18479: _low_level_execute_command(): starting 44071 1727204594.18482: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204594.1090822-44465-158518671209345/ /root/.ansible/tmp/ansible-tmp-1727204594.1090822-44465-158518671209345/AnsiballZ_stat.py && sleep 0' 44071 1727204594.19130: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204594.19162: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204594.19182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204594.19202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204594.19273: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204594.19329: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204594.19348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204594.19382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204594.19492: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204594.21477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204594.21482: stdout chunk (state=3): >>><<< 44071 1727204594.21484: stderr chunk (state=3): >>><<< 44071 1727204594.21486: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204594.21491: _low_level_execute_command(): starting 44071 1727204594.21503: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204594.1090822-44465-158518671209345/AnsiballZ_stat.py && sleep 0' 44071 1727204594.22212: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204594.22229: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204594.22257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204594.22375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204594.22403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204594.22520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204594.39009: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 44071 1727204594.40281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204594.40344: stderr chunk (state=3): >>><<< 44071 1727204594.40348: stdout chunk (state=3): >>><<< 44071 1727204594.40361: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204594.40390: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204594.1090822-44465-158518671209345/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204594.40400: _low_level_execute_command(): starting 44071 1727204594.40406: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204594.1090822-44465-158518671209345/ > /dev/null 2>&1 && sleep 0' 44071 1727204594.40923: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204594.40928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204594.40931: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204594.40933: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204594.40989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204594.40993: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204594.41075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204594.42981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204594.43046: stderr chunk (state=3): >>><<< 44071 1727204594.43050: stdout chunk (state=3): >>><<< 44071 1727204594.43067: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204594.43074: handler run complete 44071 1727204594.43091: attempt loop complete, returning result 44071 1727204594.43094: _execute() done 44071 1727204594.43099: dumping result to json 44071 1727204594.43105: done dumping result, returning 44071 1727204594.43113: done running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr [127b8e07-fff9-c964-7471-00000000016e] 44071 1727204594.43117: sending task result for task 127b8e07-fff9-c964-7471-00000000016e 44071 1727204594.43227: done sending task result for task 127b8e07-fff9-c964-7471-00000000016e 44071 1727204594.43230: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 44071 1727204594.43292: no more pending results, returning what we have 44071 1727204594.43296: results queue empty 44071 1727204594.43296: checking for any_errors_fatal 44071 1727204594.43298: done checking for any_errors_fatal 44071 1727204594.43299: checking for max_fail_percentage 44071 1727204594.43300: done checking for max_fail_percentage 44071 1727204594.43301: checking to see if all hosts have failed and the running result is not ok 44071 1727204594.43301: done checking to see if all hosts have failed 44071 1727204594.43302: getting the remaining hosts for this loop 44071 1727204594.43304: done getting the remaining hosts for this loop 44071 1727204594.43309: getting the next task for host managed-node2 44071 
1727204594.43318: done getting next task for host managed-node2 44071 1727204594.43321: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 44071 1727204594.43326: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204594.43329: getting variables 44071 1727204594.43331: in VariableManager get_vars() 44071 1727204594.43361: Calling all_inventory to load vars for managed-node2 44071 1727204594.43363: Calling groups_inventory to load vars for managed-node2 44071 1727204594.43369: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204594.43380: Calling all_plugins_play to load vars for managed-node2 44071 1727204594.43383: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204594.43385: Calling groups_plugins_play to load vars for managed-node2 44071 1727204594.43546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204594.43697: done with get_vars() 44071 1727204594.43707: done getting variables 44071 1727204594.43789: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 44071 1727204594.43889: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 15:03:14 -0400 (0:00:00.386) 0:00:06.755 ***** 44071 1727204594.43915: entering _queue_task() for managed-node2/assert 44071 1727204594.43916: Creating lock for assert 44071 1727204594.44175: worker is 1 (out of 1 available) 44071 1727204594.44190: exiting _queue_task() for managed-node2/assert 44071 1727204594.44204: done queuing things up, now waiting for results queue to drain 44071 1727204594.44205: waiting for pending results... 
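The stat result above ({"exists": false} for /sys/class/net/statebr) is what the queued assert task consumes: later records report the variable interface_stat and evaluate the condition (not interface_stat.stat.exists) as True. The task file itself is not reproduced in this log, so the following is only a minimal sketch of that stat-then-assert pattern, reusing the module arguments, path, variable name and condition visible in the trace; in the actual test the value reaches the assert via a set_fact (the log says interface_stat comes "from source: set_fact"), and the task names are copied from the TASK banners.

- name: Get stat for interface statebr            # sketch; the real task templates '{{ interface }}'
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat                        # simplified: the trace shows it set via set_fact

- name: Assert that the interface is absent - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - not interface_stat.stat.exists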
44071 1727204594.44384: running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'statebr' 44071 1727204594.44467: in run() - task 127b8e07-fff9-c964-7471-000000000120 44071 1727204594.44486: variable 'ansible_search_path' from source: unknown 44071 1727204594.44490: variable 'ansible_search_path' from source: unknown 44071 1727204594.44528: calling self._execute() 44071 1727204594.44593: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204594.44597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204594.44608: variable 'omit' from source: magic vars 44071 1727204594.44970: variable 'ansible_distribution_major_version' from source: facts 44071 1727204594.44982: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204594.44988: variable 'omit' from source: magic vars 44071 1727204594.45020: variable 'omit' from source: magic vars 44071 1727204594.45098: variable 'interface' from source: play vars 44071 1727204594.45112: variable 'omit' from source: magic vars 44071 1727204594.45148: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204594.45182: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204594.45202: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204594.45218: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204594.45228: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204594.45253: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204594.45256: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204594.45259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204594.45335: Set connection var ansible_connection to ssh 44071 1727204594.45342: Set connection var ansible_timeout to 10 44071 1727204594.45347: Set connection var ansible_pipelining to False 44071 1727204594.45353: Set connection var ansible_shell_type to sh 44071 1727204594.45358: Set connection var ansible_shell_executable to /bin/sh 44071 1727204594.45366: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204594.45387: variable 'ansible_shell_executable' from source: unknown 44071 1727204594.45391: variable 'ansible_connection' from source: unknown 44071 1727204594.45395: variable 'ansible_module_compression' from source: unknown 44071 1727204594.45397: variable 'ansible_shell_type' from source: unknown 44071 1727204594.45400: variable 'ansible_shell_executable' from source: unknown 44071 1727204594.45402: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204594.45406: variable 'ansible_pipelining' from source: unknown 44071 1727204594.45409: variable 'ansible_timeout' from source: unknown 44071 1727204594.45411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204594.45528: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 44071 1727204594.45535: variable 'omit' from source: magic vars 44071 1727204594.45541: starting attempt loop 44071 1727204594.45544: running the handler 44071 1727204594.45655: variable 'interface_stat' from source: set_fact 44071 1727204594.45664: Evaluated conditional (not interface_stat.stat.exists): True 44071 1727204594.45671: handler run complete 44071 1727204594.45683: attempt loop complete, returning result 44071 1727204594.45686: _execute() done 44071 1727204594.45688: dumping result to json 44071 1727204594.45692: done dumping result, returning 44071 1727204594.45698: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'statebr' [127b8e07-fff9-c964-7471-000000000120] 44071 1727204594.45705: sending task result for task 127b8e07-fff9-c964-7471-000000000120 44071 1727204594.45800: done sending task result for task 127b8e07-fff9-c964-7471-000000000120 44071 1727204594.45802: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 44071 1727204594.45855: no more pending results, returning what we have 44071 1727204594.45858: results queue empty 44071 1727204594.45859: checking for any_errors_fatal 44071 1727204594.45871: done checking for any_errors_fatal 44071 1727204594.45872: checking for max_fail_percentage 44071 1727204594.45873: done checking for max_fail_percentage 44071 1727204594.45874: checking to see if all hosts have failed and the running result is not ok 44071 1727204594.45875: done checking to see if all hosts have failed 44071 1727204594.45875: getting the remaining hosts for this loop 44071 1727204594.45877: done getting the remaining hosts for this loop 44071 1727204594.45881: getting the next task for host managed-node2 44071 1727204594.45890: done getting next task for host managed-node2 44071 1727204594.45892: ^ task is: TASK: Test 44071 1727204594.45895: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204594.45898: getting variables 44071 1727204594.45900: in VariableManager get_vars() 44071 1727204594.45927: Calling all_inventory to load vars for managed-node2 44071 1727204594.45930: Calling groups_inventory to load vars for managed-node2 44071 1727204594.45933: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204594.45945: Calling all_plugins_play to load vars for managed-node2 44071 1727204594.45947: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204594.45951: Calling groups_plugins_play to load vars for managed-node2 44071 1727204594.46152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204594.46293: done with get_vars() 44071 1727204594.46303: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Tuesday 24 September 2024 15:03:14 -0400 (0:00:00.024) 0:00:06.780 ***** 44071 1727204594.46374: entering _queue_task() for managed-node2/include_tasks 44071 1727204594.46607: worker is 1 (out of 1 available) 44071 1727204594.46621: exiting _queue_task() for managed-node2/include_tasks 44071 1727204594.46634: done queuing things up, now waiting for results queue to drain 44071 1727204594.46636: waiting for pending results... 44071 1727204594.46804: running TaskExecutor() for managed-node2/TASK: Test 44071 1727204594.46878: in run() - task 127b8e07-fff9-c964-7471-000000000095 44071 1727204594.46888: variable 'ansible_search_path' from source: unknown 44071 1727204594.46892: variable 'ansible_search_path' from source: unknown 44071 1727204594.46932: variable 'lsr_test' from source: include params 44071 1727204594.47103: variable 'lsr_test' from source: include params 44071 1727204594.47156: variable 'omit' from source: magic vars 44071 1727204594.47253: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204594.47262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204594.47272: variable 'omit' from source: magic vars 44071 1727204594.47456: variable 'ansible_distribution_major_version' from source: facts 44071 1727204594.47467: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204594.47473: variable 'item' from source: unknown 44071 1727204594.47526: variable 'item' from source: unknown 44071 1727204594.47552: variable 'item' from source: unknown 44071 1727204594.47599: variable 'item' from source: unknown 44071 1727204594.47732: dumping result to json 44071 1727204594.47735: done dumping result, returning 44071 1727204594.47740: done running TaskExecutor() for managed-node2/TASK: Test [127b8e07-fff9-c964-7471-000000000095] 44071 1727204594.47742: sending task result for task 127b8e07-fff9-c964-7471-000000000095 44071 1727204594.47783: done sending task result for task 127b8e07-fff9-c964-7471-000000000095 44071 1727204594.47786: WORKER PROCESS EXITING 44071 1727204594.47810: no more pending results, returning what we have 44071 1727204594.47815: in VariableManager get_vars() 44071 1727204594.47849: Calling all_inventory to load vars for managed-node2 44071 1727204594.47852: Calling groups_inventory to load vars for managed-node2 44071 1727204594.47855: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204594.47868: Calling all_plugins_play to load vars for managed-node2 
44071 1727204594.47871: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204594.47874: Calling groups_plugins_play to load vars for managed-node2 44071 1727204594.48031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204594.48199: done with get_vars() 44071 1727204594.48205: variable 'ansible_search_path' from source: unknown 44071 1727204594.48206: variable 'ansible_search_path' from source: unknown 44071 1727204594.48234: we have included files to process 44071 1727204594.48235: generating all_blocks data 44071 1727204594.48236: done generating all_blocks data 44071 1727204594.48241: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 44071 1727204594.48242: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 44071 1727204594.48244: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 44071 1727204594.48483: done processing included file 44071 1727204594.48485: iterating over new_blocks loaded from include file 44071 1727204594.48486: in VariableManager get_vars() 44071 1727204594.48498: done with get_vars() 44071 1727204594.48499: filtering new block on tags 44071 1727204594.48523: done filtering new block on tags 44071 1727204594.48525: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed-node2 => (item=tasks/create_bridge_profile.yml) 44071 1727204594.48529: extending task lists for all hosts with included blocks 44071 1727204594.49099: done extending task lists 44071 1727204594.49100: done processing included files 44071 1727204594.49101: results queue empty 44071 1727204594.49101: checking for any_errors_fatal 44071 1727204594.49104: done checking for any_errors_fatal 44071 1727204594.49104: checking for max_fail_percentage 44071 1727204594.49105: done checking for max_fail_percentage 44071 1727204594.49106: checking to see if all hosts have failed and the running result is not ok 44071 1727204594.49106: done checking to see if all hosts have failed 44071 1727204594.49107: getting the remaining hosts for this loop 44071 1727204594.49108: done getting the remaining hosts for this loop 44071 1727204594.49109: getting the next task for host managed-node2 44071 1727204594.49113: done getting next task for host managed-node2 44071 1727204594.49114: ^ task is: TASK: Include network role 44071 1727204594.49116: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 44071 1727204594.49118: getting variables 44071 1727204594.49119: in VariableManager get_vars() 44071 1727204594.49127: Calling all_inventory to load vars for managed-node2 44071 1727204594.49129: Calling groups_inventory to load vars for managed-node2 44071 1727204594.49131: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204594.49140: Calling all_plugins_play to load vars for managed-node2 44071 1727204594.49143: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204594.49145: Calling groups_plugins_play to load vars for managed-node2 44071 1727204594.49278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204594.49415: done with get_vars() 44071 1727204594.49423: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Tuesday 24 September 2024 15:03:14 -0400 (0:00:00.031) 0:00:06.811 ***** 44071 1727204594.49487: entering _queue_task() for managed-node2/include_role 44071 1727204594.49489: Creating lock for include_role 44071 1727204594.49760: worker is 1 (out of 1 available) 44071 1727204594.49775: exiting _queue_task() for managed-node2/include_role 44071 1727204594.49789: done queuing things up, now waiting for results queue to drain 44071 1727204594.49790: waiting for pending results... 44071 1727204594.49962: running TaskExecutor() for managed-node2/TASK: Include network role 44071 1727204594.50047: in run() - task 127b8e07-fff9-c964-7471-00000000018e 44071 1727204594.50060: variable 'ansible_search_path' from source: unknown 44071 1727204594.50064: variable 'ansible_search_path' from source: unknown 44071 1727204594.50103: calling self._execute() 44071 1727204594.50171: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204594.50241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204594.50245: variable 'omit' from source: magic vars 44071 1727204594.50498: variable 'ansible_distribution_major_version' from source: facts 44071 1727204594.50510: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204594.50517: _execute() done 44071 1727204594.50520: dumping result to json 44071 1727204594.50523: done dumping result, returning 44071 1727204594.50530: done running TaskExecutor() for managed-node2/TASK: Include network role [127b8e07-fff9-c964-7471-00000000018e] 44071 1727204594.50533: sending task result for task 127b8e07-fff9-c964-7471-00000000018e 44071 1727204594.50657: done sending task result for task 127b8e07-fff9-c964-7471-00000000018e 44071 1727204594.50660: WORKER PROCESS EXITING 44071 1727204594.50698: no more pending results, returning what we have 44071 1727204594.50703: in VariableManager get_vars() 44071 1727204594.50738: Calling all_inventory to load vars for managed-node2 44071 1727204594.50741: Calling groups_inventory to load vars for managed-node2 44071 1727204594.50745: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204594.50760: Calling all_plugins_play to load vars for managed-node2 44071 1727204594.50762: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204594.50767: Calling groups_plugins_play to load vars for managed-node2 44071 1727204594.50936: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204594.51081: done with get_vars() 44071 1727204594.51089: variable 'ansible_search_path' from source: unknown 44071 1727204594.51091: variable 'ansible_search_path' from source: unknown 44071 1727204594.51224: variable 'omit' from source: magic vars 44071 1727204594.51254: variable 'omit' from source: magic vars 44071 1727204594.51264: variable 'omit' from source: magic vars 44071 1727204594.51269: we have included files to process 44071 1727204594.51269: generating all_blocks data 44071 1727204594.51271: done generating all_blocks data 44071 1727204594.51271: processing included file: fedora.linux_system_roles.network 44071 1727204594.51288: in VariableManager get_vars() 44071 1727204594.51298: done with get_vars() 44071 1727204594.51356: in VariableManager get_vars() 44071 1727204594.51370: done with get_vars() 44071 1727204594.51407: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 44071 1727204594.51603: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 44071 1727204594.51702: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 44071 1727204594.52155: in VariableManager get_vars() 44071 1727204594.52173: done with get_vars() 44071 1727204594.52499: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204594.53715: iterating over new_blocks loaded from include file 44071 1727204594.53718: in VariableManager get_vars() 44071 1727204594.53731: done with get_vars() 44071 1727204594.53733: filtering new block on tags 44071 1727204594.53921: done filtering new block on tags 44071 1727204594.53925: in VariableManager get_vars() 44071 1727204594.53938: done with get_vars() 44071 1727204594.53940: filtering new block on tags 44071 1727204594.53952: done filtering new block on tags 44071 1727204594.53954: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 44071 1727204594.53958: extending task lists for all hosts with included blocks 44071 1727204594.54072: done extending task lists 44071 1727204594.54073: done processing included files 44071 1727204594.54074: results queue empty 44071 1727204594.54074: checking for any_errors_fatal 44071 1727204594.54077: done checking for any_errors_fatal 44071 1727204594.54078: checking for max_fail_percentage 44071 1727204594.54079: done checking for max_fail_percentage 44071 1727204594.54079: checking to see if all hosts have failed and the running result is not ok 44071 1727204594.54080: done checking to see if all hosts have failed 44071 1727204594.54080: getting the remaining hosts for this loop 44071 1727204594.54081: done getting the remaining hosts for this loop 44071 1727204594.54083: getting the next task for host managed-node2 44071 1727204594.54086: done getting next task for host managed-node2 44071 1727204594.54088: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204594.54090: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204594.54097: getting variables 44071 1727204594.54098: in VariableManager get_vars() 44071 1727204594.54107: Calling all_inventory to load vars for managed-node2 44071 1727204594.54109: Calling groups_inventory to load vars for managed-node2 44071 1727204594.54110: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204594.54115: Calling all_plugins_play to load vars for managed-node2 44071 1727204594.54116: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204594.54118: Calling groups_plugins_play to load vars for managed-node2 44071 1727204594.54242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204594.54384: done with get_vars() 44071 1727204594.54391: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:03:14 -0400 (0:00:00.049) 0:00:06.861 ***** 44071 1727204594.54453: entering _queue_task() for managed-node2/include_tasks 44071 1727204594.54717: worker is 1 (out of 1 available) 44071 1727204594.54731: exiting _queue_task() for managed-node2/include_tasks 44071 1727204594.54748: done queuing things up, now waiting for results queue to drain 44071 1727204594.54750: waiting for pending results... 
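The records above hand control from the test playbook into the role: create_bridge_profile.yml runs an include_role for fedora.linux_system_roles.network, and the role's tasks/main.yml then queues the "Ensure ansible_facts used by role" include, which the following records resolve to set_facts.yml. A rough sketch of that chain; the network_connections value is a placeholder, since the actual bridge profile settings are not visible in this part of the trace.

# tasks/create_bridge_profile.yml (sketch)
- name: Include network role
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network
  vars:
    network_connections:            # placeholder values; the real profile
      - name: "{{ interface }}"     # is defined elsewhere in the test
        type: bridge
        state: up

# roles/network/tasks/main.yml (sketch)
- name: Ensure ansible_facts used by role
  ansible.builtin.include_tasks: set_facts.yml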
44071 1727204594.54923: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204594.55008: in run() - task 127b8e07-fff9-c964-7471-00000000020c 44071 1727204594.55022: variable 'ansible_search_path' from source: unknown 44071 1727204594.55026: variable 'ansible_search_path' from source: unknown 44071 1727204594.55057: calling self._execute() 44071 1727204594.55125: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204594.55130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204594.55141: variable 'omit' from source: magic vars 44071 1727204594.55458: variable 'ansible_distribution_major_version' from source: facts 44071 1727204594.55674: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204594.55679: _execute() done 44071 1727204594.55682: dumping result to json 44071 1727204594.55684: done dumping result, returning 44071 1727204594.55687: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-c964-7471-00000000020c] 44071 1727204594.55689: sending task result for task 127b8e07-fff9-c964-7471-00000000020c 44071 1727204594.55764: done sending task result for task 127b8e07-fff9-c964-7471-00000000020c 44071 1727204594.55770: WORKER PROCESS EXITING 44071 1727204594.55855: no more pending results, returning what we have 44071 1727204594.55859: in VariableManager get_vars() 44071 1727204594.55898: Calling all_inventory to load vars for managed-node2 44071 1727204594.55900: Calling groups_inventory to load vars for managed-node2 44071 1727204594.55902: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204594.55912: Calling all_plugins_play to load vars for managed-node2 44071 1727204594.55915: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204594.55918: Calling groups_plugins_play to load vars for managed-node2 44071 1727204594.56178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204594.56482: done with get_vars() 44071 1727204594.56497: variable 'ansible_search_path' from source: unknown 44071 1727204594.56498: variable 'ansible_search_path' from source: unknown 44071 1727204594.56545: we have included files to process 44071 1727204594.56547: generating all_blocks data 44071 1727204594.56549: done generating all_blocks data 44071 1727204594.56551: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204594.56553: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204594.56556: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204594.57445: done processing included file 44071 1727204594.57448: iterating over new_blocks loaded from include file 44071 1727204594.57450: in VariableManager get_vars() 44071 1727204594.57498: done with get_vars() 44071 1727204594.57500: filtering new block on tags 44071 1727204594.57541: done filtering new block on tags 44071 1727204594.57544: in VariableManager get_vars() 44071 1727204594.57573: done with get_vars() 44071 1727204594.57575: filtering new block on tags 44071 1727204594.57650: done filtering new block on tags 44071 1727204594.57653: in 
VariableManager get_vars() 44071 1727204594.57681: done with get_vars() 44071 1727204594.57684: filtering new block on tags 44071 1727204594.57746: done filtering new block on tags 44071 1727204594.57749: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 44071 1727204594.57755: extending task lists for all hosts with included blocks 44071 1727204594.60403: done extending task lists 44071 1727204594.60406: done processing included files 44071 1727204594.60407: results queue empty 44071 1727204594.60407: checking for any_errors_fatal 44071 1727204594.60411: done checking for any_errors_fatal 44071 1727204594.60412: checking for max_fail_percentage 44071 1727204594.60413: done checking for max_fail_percentage 44071 1727204594.60413: checking to see if all hosts have failed and the running result is not ok 44071 1727204594.60414: done checking to see if all hosts have failed 44071 1727204594.60415: getting the remaining hosts for this loop 44071 1727204594.60416: done getting the remaining hosts for this loop 44071 1727204594.60420: getting the next task for host managed-node2 44071 1727204594.60426: done getting next task for host managed-node2 44071 1727204594.60429: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204594.60434: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204594.60447: getting variables 44071 1727204594.60448: in VariableManager get_vars() 44071 1727204594.60475: Calling all_inventory to load vars for managed-node2 44071 1727204594.60478: Calling groups_inventory to load vars for managed-node2 44071 1727204594.60480: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204594.60487: Calling all_plugins_play to load vars for managed-node2 44071 1727204594.60490: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204594.60493: Calling groups_plugins_play to load vars for managed-node2 44071 1727204594.60772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204594.60987: done with get_vars() 44071 1727204594.60997: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:03:14 -0400 (0:00:00.066) 0:00:06.927 ***** 44071 1727204594.61062: entering _queue_task() for managed-node2/setup 44071 1727204594.61323: worker is 1 (out of 1 available) 44071 1727204594.61339: exiting _queue_task() for managed-node2/setup 44071 1727204594.61352: done queuing things up, now waiting for results queue to drain 44071 1727204594.61354: waiting for pending results... 44071 1727204594.61538: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204594.61640: in run() - task 127b8e07-fff9-c964-7471-000000000269 44071 1727204594.61656: variable 'ansible_search_path' from source: unknown 44071 1727204594.61659: variable 'ansible_search_path' from source: unknown 44071 1727204594.61694: calling self._execute() 44071 1727204594.61761: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204594.61768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204594.61777: variable 'omit' from source: magic vars 44071 1727204594.62296: variable 'ansible_distribution_major_version' from source: facts 44071 1727204594.62306: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204594.62477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204594.65039: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204594.65113: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204594.65254: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204594.65258: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204594.65261: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204594.65344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204594.65390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 44071 1727204594.65422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204594.65480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204594.65500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204594.65562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204594.65601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204594.65631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204594.65683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204594.65706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204594.65899: variable '__network_required_facts' from source: role '' defaults 44071 1727204594.65909: variable 'ansible_facts' from source: unknown 44071 1727204594.66070: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 44071 1727204594.66074: when evaluation is False, skipping this task 44071 1727204594.66077: _execute() done 44071 1727204594.66079: dumping result to json 44071 1727204594.66081: done dumping result, returning 44071 1727204594.66083: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-c964-7471-000000000269] 44071 1727204594.66086: sending task result for task 127b8e07-fff9-c964-7471-000000000269 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204594.66328: no more pending results, returning what we have 44071 1727204594.66332: results queue empty 44071 1727204594.66333: checking for any_errors_fatal 44071 1727204594.66335: done checking for any_errors_fatal 44071 1727204594.66336: checking for max_fail_percentage 44071 1727204594.66339: done checking for max_fail_percentage 44071 1727204594.66340: checking to see if all hosts have failed and the running result is not ok 44071 1727204594.66340: done checking to see if all hosts have failed 44071 1727204594.66341: getting the remaining hosts for this loop 44071 1727204594.66343: done getting the remaining hosts for this loop 44071 1727204594.66348: getting the next task for host managed-node2 44071 1727204594.66359: done getting next task for host 
managed-node2 44071 1727204594.66363: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204594.66371: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204594.66387: getting variables 44071 1727204594.66389: in VariableManager get_vars() 44071 1727204594.66428: Calling all_inventory to load vars for managed-node2 44071 1727204594.66431: Calling groups_inventory to load vars for managed-node2 44071 1727204594.66434: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204594.66445: Calling all_plugins_play to load vars for managed-node2 44071 1727204594.66448: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204594.66451: Calling groups_plugins_play to load vars for managed-node2 44071 1727204594.67063: done sending task result for task 127b8e07-fff9-c964-7471-000000000269 44071 1727204594.67075: WORKER PROCESS EXITING 44071 1727204594.67207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204594.67471: done with get_vars() 44071 1727204594.67484: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:03:14 -0400 (0:00:00.065) 0:00:06.992 ***** 44071 1727204594.67598: entering _queue_task() for managed-node2/stat 44071 1727204594.67948: worker is 1 (out of 1 available) 44071 1727204594.67962: exiting _queue_task() for managed-node2/stat 44071 1727204594.68107: done queuing things up, now waiting for results queue to drain 44071 1727204594.68109: waiting for pending results... 
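The skip recorded above comes from a guard that only re-gathers facts when something the role needs is missing: the conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0) is quoted verbatim in the trace and evaluated False, and the result is censored because the task sets no_log. A sketch of that guarded setup task; the gather_subset value is a placeholder, as it is not shown in this trace.

# roles/network/tasks/set_facts.yml:3 (sketch of the skipped task)
- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
    gather_subset: min              # placeholder; the actual subset is not shown here
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true                      # matches the "output has been hidden" message above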
44071 1727204594.68309: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204594.68524: in run() - task 127b8e07-fff9-c964-7471-00000000026b 44071 1727204594.68562: variable 'ansible_search_path' from source: unknown 44071 1727204594.68575: variable 'ansible_search_path' from source: unknown 44071 1727204594.68625: calling self._execute() 44071 1727204594.68731: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204594.68745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204594.68770: variable 'omit' from source: magic vars 44071 1727204594.69241: variable 'ansible_distribution_major_version' from source: facts 44071 1727204594.69262: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204594.69460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204594.69847: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204594.69852: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204594.69870: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204594.69954: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204594.70064: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204594.70104: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204594.70140: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204594.70186: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204594.70308: variable '__network_is_ostree' from source: set_fact 44071 1727204594.70322: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204594.70330: when evaluation is False, skipping this task 44071 1727204594.70392: _execute() done 44071 1727204594.70395: dumping result to json 44071 1727204594.70399: done dumping result, returning 44071 1727204594.70402: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-c964-7471-00000000026b] 44071 1727204594.70404: sending task result for task 127b8e07-fff9-c964-7471-00000000026b 44071 1727204594.70480: done sending task result for task 127b8e07-fff9-c964-7471-00000000026b 44071 1727204594.70483: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204594.70542: no more pending results, returning what we have 44071 1727204594.70546: results queue empty 44071 1727204594.70547: checking for any_errors_fatal 44071 1727204594.70557: done checking for any_errors_fatal 44071 1727204594.70558: checking for 
max_fail_percentage 44071 1727204594.70560: done checking for max_fail_percentage 44071 1727204594.70561: checking to see if all hosts have failed and the running result is not ok 44071 1727204594.70562: done checking to see if all hosts have failed 44071 1727204594.70562: getting the remaining hosts for this loop 44071 1727204594.70566: done getting the remaining hosts for this loop 44071 1727204594.70572: getting the next task for host managed-node2 44071 1727204594.70581: done getting next task for host managed-node2 44071 1727204594.70585: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204594.70591: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204594.70609: getting variables 44071 1727204594.70611: in VariableManager get_vars() 44071 1727204594.70653: Calling all_inventory to load vars for managed-node2 44071 1727204594.70656: Calling groups_inventory to load vars for managed-node2 44071 1727204594.70658: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204594.70855: Calling all_plugins_play to load vars for managed-node2 44071 1727204594.70859: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204594.70864: Calling groups_plugins_play to load vars for managed-node2 44071 1727204594.71191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204594.71484: done with get_vars() 44071 1727204594.71498: done getting variables 44071 1727204594.71571: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:03:14 -0400 (0:00:00.040) 0:00:07.032 ***** 44071 1727204594.71613: entering _queue_task() for managed-node2/set_fact 44071 1727204594.72010: worker is 1 (out of 1 available) 44071 1727204594.72026: exiting _queue_task() for managed-node2/set_fact 44071 1727204594.72040: done queuing things up, now waiting for results queue to drain 44071 1727204594.72041: waiting for pending results... 44071 1727204594.72385: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204594.72522: in run() - task 127b8e07-fff9-c964-7471-00000000026c 44071 1727204594.72548: variable 'ansible_search_path' from source: unknown 44071 1727204594.72557: variable 'ansible_search_path' from source: unknown 44071 1727204594.72605: calling self._execute() 44071 1727204594.72707: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204594.72723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204594.72847: variable 'omit' from source: magic vars 44071 1727204594.73178: variable 'ansible_distribution_major_version' from source: facts 44071 1727204594.73199: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204594.73418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204594.73762: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204594.73830: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204594.73879: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204594.73922: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204594.74029: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204594.74072: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204594.74105: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204594.74136: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204594.74254: variable '__network_is_ostree' from source: set_fact 44071 1727204594.74278: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204594.74286: when evaluation is False, skipping this task 44071 1727204594.74293: _execute() done 44071 1727204594.74376: dumping result to json 44071 1727204594.74381: done dumping result, returning 44071 1727204594.74385: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-c964-7471-00000000026c] 44071 1727204594.74388: sending task result for task 127b8e07-fff9-c964-7471-00000000026c 44071 1727204594.74463: done sending task result for task 127b8e07-fff9-c964-7471-00000000026c 44071 1727204594.74468: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204594.74522: no more pending results, returning what we have 44071 1727204594.74526: results queue empty 44071 1727204594.74527: checking for any_errors_fatal 44071 1727204594.74536: done checking for any_errors_fatal 44071 1727204594.74537: checking for max_fail_percentage 44071 1727204594.74538: done checking for max_fail_percentage 44071 1727204594.74539: checking to see if all hosts have failed and the running result is not ok 44071 1727204594.74541: done checking to see if all hosts have failed 44071 1727204594.74541: getting the remaining hosts for this loop 44071 1727204594.74543: done getting the remaining hosts for this loop 44071 1727204594.74548: getting the next task for host managed-node2 44071 1727204594.74560: done getting next task for host managed-node2 44071 1727204594.74564: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204594.74576: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204594.74594: getting variables 44071 1727204594.74595: in VariableManager get_vars() 44071 1727204594.74636: Calling all_inventory to load vars for managed-node2 44071 1727204594.74639: Calling groups_inventory to load vars for managed-node2 44071 1727204594.74641: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204594.74654: Calling all_plugins_play to load vars for managed-node2 44071 1727204594.74657: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204594.74661: Calling groups_plugins_play to load vars for managed-node2 44071 1727204594.75167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204594.75453: done with get_vars() 44071 1727204594.75469: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:03:14 -0400 (0:00:00.039) 0:00:07.072 ***** 44071 1727204594.75577: entering _queue_task() for managed-node2/service_facts 44071 1727204594.75579: Creating lock for service_facts 44071 1727204594.76097: worker is 1 (out of 1 available) 44071 1727204594.76108: exiting _queue_task() for managed-node2/service_facts 44071 1727204594.76120: done queuing things up, now waiting for results queue to drain 44071 1727204594.76122: waiting for pending results... 
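
[Note on the task queued above: it runs the service_facts module, and the large result printed further below is ansible_facts["services"], a dict mapping each unit name to a dict with "name", "state", "status" and "source". The short Python sketch that follows is only an illustration of consuming that structure, for example to check whether NetworkManager.service is reported as running. The sample entries are copied from the output below; the helper names (SERVICES, running_services) are invented for the sketch and are not the role's own code.]

#!/usr/bin/env python3
"""Minimal sketch: filter a service_facts-style mapping down to running services.

Assumes only the structure visible in the module output captured below:
ansible_facts["services"] maps unit names to dicts carrying "name", "state",
"status" and "source". Sample data and helper names are illustrative.
"""

from typing import Dict

# A tiny sample in the same shape as the module's output (values taken from the log below).
SERVICES: Dict[str, Dict[str, str]] = {
    "NetworkManager.service": {"name": "NetworkManager.service", "state": "running",
                               "status": "enabled", "source": "systemd"},
    "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped",
                                 "status": "disabled", "source": "systemd"},
    "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                          "status": "disabled", "source": "systemd"},
}


def running_services(services: Dict[str, Dict[str, str]]) -> Dict[str, str]:
    """Return {unit_name: status} for every service reported with state "running"."""
    return {name: info["status"] for name, info in services.items()
            if info.get("state") == "running"}


if __name__ == "__main__":
    # With the sample above this prints: {'NetworkManager.service': 'enabled'}
    print(running_services(SERVICES))
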
44071 1727204594.76388: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204594.76395: in run() - task 127b8e07-fff9-c964-7471-00000000026e 44071 1727204594.76415: variable 'ansible_search_path' from source: unknown 44071 1727204594.76423: variable 'ansible_search_path' from source: unknown 44071 1727204594.76470: calling self._execute() 44071 1727204594.76590: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204594.76599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204594.76605: variable 'omit' from source: magic vars 44071 1727204594.77016: variable 'ansible_distribution_major_version' from source: facts 44071 1727204594.77137: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204594.77143: variable 'omit' from source: magic vars 44071 1727204594.77152: variable 'omit' from source: magic vars 44071 1727204594.77194: variable 'omit' from source: magic vars 44071 1727204594.77249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204594.77299: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204594.77328: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204594.77356: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204594.77382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204594.77418: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204594.77428: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204594.77437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204594.77553: Set connection var ansible_connection to ssh 44071 1727204594.77563: Set connection var ansible_timeout to 10 44071 1727204594.77580: Set connection var ansible_pipelining to False 44071 1727204594.77589: Set connection var ansible_shell_type to sh 44071 1727204594.77597: Set connection var ansible_shell_executable to /bin/sh 44071 1727204594.77607: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204594.77632: variable 'ansible_shell_executable' from source: unknown 44071 1727204594.77683: variable 'ansible_connection' from source: unknown 44071 1727204594.77686: variable 'ansible_module_compression' from source: unknown 44071 1727204594.77693: variable 'ansible_shell_type' from source: unknown 44071 1727204594.77695: variable 'ansible_shell_executable' from source: unknown 44071 1727204594.77697: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204594.77699: variable 'ansible_pipelining' from source: unknown 44071 1727204594.77701: variable 'ansible_timeout' from source: unknown 44071 1727204594.77703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204594.77923: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204594.77945: variable 'omit' from source: magic vars 44071 
1727204594.77955: starting attempt loop 44071 1727204594.77972: running the handler 44071 1727204594.78010: _low_level_execute_command(): starting 44071 1727204594.78013: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204594.78859: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204594.78888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204594.78909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204594.79011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204594.79044: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204594.79064: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204594.79081: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204594.79195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204594.80958: stdout chunk (state=3): >>>/root <<< 44071 1727204594.81089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204594.81145: stderr chunk (state=3): >>><<< 44071 1727204594.81148: stdout chunk (state=3): >>><<< 44071 1727204594.81163: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204594.81187: _low_level_execute_command(): starting 44071 1727204594.81194: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204594.8117287-44495-33555274612204 `" && echo ansible-tmp-1727204594.8117287-44495-33555274612204="` echo /root/.ansible/tmp/ansible-tmp-1727204594.8117287-44495-33555274612204 `" ) && sleep 0' 44071 1727204594.81711: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204594.81715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204594.81727: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204594.81729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204594.81777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204594.81782: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204594.81791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204594.81873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204594.83846: stdout chunk (state=3): >>>ansible-tmp-1727204594.8117287-44495-33555274612204=/root/.ansible/tmp/ansible-tmp-1727204594.8117287-44495-33555274612204 <<< 44071 1727204594.83971: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204594.84031: stderr chunk (state=3): >>><<< 44071 1727204594.84035: stdout chunk (state=3): >>><<< 44071 1727204594.84053: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204594.8117287-44495-33555274612204=/root/.ansible/tmp/ansible-tmp-1727204594.8117287-44495-33555274612204 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 
1727204594.84102: variable 'ansible_module_compression' from source: unknown 44071 1727204594.84139: ANSIBALLZ: Using lock for service_facts 44071 1727204594.84142: ANSIBALLZ: Acquiring lock 44071 1727204594.84146: ANSIBALLZ: Lock acquired: 140077511368416 44071 1727204594.84151: ANSIBALLZ: Creating module 44071 1727204594.96555: ANSIBALLZ: Writing module into payload 44071 1727204594.96628: ANSIBALLZ: Writing module 44071 1727204594.96650: ANSIBALLZ: Renaming module 44071 1727204594.96654: ANSIBALLZ: Done creating module 44071 1727204594.96674: variable 'ansible_facts' from source: unknown 44071 1727204594.96721: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204594.8117287-44495-33555274612204/AnsiballZ_service_facts.py 44071 1727204594.96840: Sending initial data 44071 1727204594.96844: Sent initial data (161 bytes) 44071 1727204594.97364: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204594.97370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204594.97373: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204594.97375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204594.97378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204594.97429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204594.97432: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204594.97435: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204594.97519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204594.99144: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204594.99209: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204594.99283: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpmze8523t /root/.ansible/tmp/ansible-tmp-1727204594.8117287-44495-33555274612204/AnsiballZ_service_facts.py <<< 44071 1727204594.99286: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204594.8117287-44495-33555274612204/AnsiballZ_service_facts.py" <<< 44071 1727204594.99351: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpmze8523t" to remote "/root/.ansible/tmp/ansible-tmp-1727204594.8117287-44495-33555274612204/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204594.8117287-44495-33555274612204/AnsiballZ_service_facts.py" <<< 44071 1727204595.00049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204595.00127: stderr chunk (state=3): >>><<< 44071 1727204595.00131: stdout chunk (state=3): >>><<< 44071 1727204595.00153: done transferring module to remote 44071 1727204595.00165: _low_level_execute_command(): starting 44071 1727204595.00173: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204594.8117287-44495-33555274612204/ /root/.ansible/tmp/ansible-tmp-1727204594.8117287-44495-33555274612204/AnsiballZ_service_facts.py && sleep 0' 44071 1727204595.00663: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204595.00670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204595.00702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204595.00710: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204595.00712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204595.00715: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204595.00773: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204595.00776: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204595.00779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204595.00858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204595.02692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204595.02751: stderr chunk (state=3): >>><<< 44071 1727204595.02755: stdout chunk (state=3): >>><<< 44071 1727204595.02770: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204595.02776: _low_level_execute_command(): starting 44071 1727204595.02782: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204594.8117287-44495-33555274612204/AnsiballZ_service_facts.py && sleep 0' 44071 1727204595.03296: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204595.03300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204595.03303: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204595.03305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204595.03364: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204595.03380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204595.03383: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204595.03454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204597.25121: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": 
"cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": 
{"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev<<< 44071 1727204597.25145: stdout chunk (state=3): >>>-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": 
{"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 44071 1727204597.26800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204597.26825: stderr chunk (state=3): >>><<< 44071 1727204597.26842: stdout chunk (state=3): >>><<< 44071 1727204597.26880: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": 
"sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": 
"running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": 
"inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": 
"mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": 
"sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204597.27857: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204594.8117287-44495-33555274612204/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204597.27877: _low_level_execute_command(): starting 44071 1727204597.27886: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204594.8117287-44495-33555274612204/ > /dev/null 2>&1 && sleep 0' 44071 1727204597.28633: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204597.28656: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204597.28675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204597.28716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204597.28731: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204597.28840: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204597.28865: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204597.28986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204597.30892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204597.30994: stderr chunk (state=3): >>><<< 44071 1727204597.31003: stdout chunk (state=3): >>><<< 44071 1727204597.31024: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204597.31036: handler run complete 44071 1727204597.31271: variable 'ansible_facts' from source: unknown 44071 1727204597.31571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204597.32111: variable 'ansible_facts' from source: unknown 44071 1727204597.32295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204597.32664: attempt loop complete, returning result 44071 1727204597.32669: _execute() done 44071 1727204597.32671: dumping result to json 44071 1727204597.32721: done dumping result, returning 44071 1727204597.32737: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-c964-7471-00000000026e] 44071 1727204597.32747: sending task result for task 127b8e07-fff9-c964-7471-00000000026e 44071 1727204597.34730: done sending task result for task 127b8e07-fff9-c964-7471-00000000026e 44071 1727204597.34734: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204597.34842: no more pending results, returning what we have 44071 1727204597.34845: results queue empty 44071 1727204597.34846: checking for any_errors_fatal 44071 1727204597.34850: done checking for any_errors_fatal 44071 1727204597.34851: checking for max_fail_percentage 44071 1727204597.34852: done checking for max_fail_percentage 44071 1727204597.34853: checking to see if all hosts have failed and the running result is not ok 44071 1727204597.34854: done checking to see if all hosts have failed 44071 1727204597.34854: getting the remaining hosts for this loop 44071 1727204597.34856: done getting the remaining hosts for this loop 44071 1727204597.34860: getting the next task for host managed-node2 44071 1727204597.34868: done getting next task for host managed-node2 44071 1727204597.34872: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204597.34879: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204597.34889: getting variables 44071 1727204597.34891: in VariableManager get_vars() 44071 1727204597.34918: Calling all_inventory to load vars for managed-node2 44071 1727204597.34921: Calling groups_inventory to load vars for managed-node2 44071 1727204597.34924: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204597.34933: Calling all_plugins_play to load vars for managed-node2 44071 1727204597.34936: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204597.34939: Calling groups_plugins_play to load vars for managed-node2 44071 1727204597.35598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204597.36197: done with get_vars() 44071 1727204597.36219: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:03:17 -0400 (0:00:02.607) 0:00:09.679 ***** 44071 1727204597.36335: entering _queue_task() for managed-node2/package_facts 44071 1727204597.36337: Creating lock for package_facts 44071 1727204597.36809: worker is 1 (out of 1 available) 44071 1727204597.36825: exiting _queue_task() for managed-node2/package_facts 44071 1727204597.36839: done queuing things up, now waiting for results queue to drain 44071 1727204597.36841: waiting for pending results... 
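For reference, the service_facts payload echoed in the stdout above is a flat mapping of unit name to {"name", "state", "status", "source"}, which downstream tasks can key on. What follows is a minimal, hypothetical Python sketch (not part of the playbook or of the fedora.linux_system_roles.network collection) of filtering such a payload for running units; the helper name running_services and the trimmed sample string are illustrative assumptions, while the field names match the JSON captured in the log.

    import json

    def running_services(facts_json: str) -> list[str]:
        # Parse the captured module output and keep only units whose reported
        # state is "running"; key names follow the JSON shown in the log above.
        facts = json.loads(facts_json)
        services = facts["ansible_facts"]["services"]
        return sorted(name for name, unit in services.items() if unit.get("state") == "running")

    # Trimmed, illustrative excerpt of the payload seen in the stdout above.
    sample = (
        '{"ansible_facts": {"services": {'
        '"sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, '
        '"firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}'
        '}}}'
    )
    print(running_services(sample))  # prints ['sshd.service']

Against the trimmed sample this prints only sshd.service, mirroring the states reported for those two units in the facts gathered above.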
44071 1727204597.37186: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204597.37253: in run() - task 127b8e07-fff9-c964-7471-00000000026f 44071 1727204597.37290: variable 'ansible_search_path' from source: unknown 44071 1727204597.37300: variable 'ansible_search_path' from source: unknown 44071 1727204597.37346: calling self._execute() 44071 1727204597.37611: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204597.37629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204597.37644: variable 'omit' from source: magic vars 44071 1727204597.38200: variable 'ansible_distribution_major_version' from source: facts 44071 1727204597.38221: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204597.38232: variable 'omit' from source: magic vars 44071 1727204597.38322: variable 'omit' from source: magic vars 44071 1727204597.38369: variable 'omit' from source: magic vars 44071 1727204597.38428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204597.38476: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204597.38506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204597.38535: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204597.38554: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204597.38592: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204597.38600: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204597.38606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204597.38717: Set connection var ansible_connection to ssh 44071 1727204597.38730: Set connection var ansible_timeout to 10 44071 1727204597.38746: Set connection var ansible_pipelining to False 44071 1727204597.38757: Set connection var ansible_shell_type to sh 44071 1727204597.38770: Set connection var ansible_shell_executable to /bin/sh 44071 1727204597.38784: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204597.38813: variable 'ansible_shell_executable' from source: unknown 44071 1727204597.38823: variable 'ansible_connection' from source: unknown 44071 1727204597.38832: variable 'ansible_module_compression' from source: unknown 44071 1727204597.38840: variable 'ansible_shell_type' from source: unknown 44071 1727204597.38852: variable 'ansible_shell_executable' from source: unknown 44071 1727204597.38860: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204597.38870: variable 'ansible_pipelining' from source: unknown 44071 1727204597.38877: variable 'ansible_timeout' from source: unknown 44071 1727204597.38883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204597.39108: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204597.39171: variable 'omit' from source: magic vars 44071 
1727204597.39175: starting attempt loop 44071 1727204597.39182: running the handler 44071 1727204597.39184: _low_level_execute_command(): starting 44071 1727204597.39186: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204597.39971: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204597.40077: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204597.40117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204597.40138: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204597.40165: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204597.40295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204597.41958: stdout chunk (state=3): >>>/root <<< 44071 1727204597.42222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204597.42226: stdout chunk (state=3): >>><<< 44071 1727204597.42229: stderr chunk (state=3): >>><<< 44071 1727204597.42247: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204597.42268: _low_level_execute_command(): starting 44071 1727204597.42404: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204597.4225385-44580-219776871851941 `" && echo 
ansible-tmp-1727204597.4225385-44580-219776871851941="` echo /root/.ansible/tmp/ansible-tmp-1727204597.4225385-44580-219776871851941 `" ) && sleep 0' 44071 1727204597.43585: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204597.43683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204597.43736: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204597.43820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204597.43839: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204597.44089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204597.46070: stdout chunk (state=3): >>>ansible-tmp-1727204597.4225385-44580-219776871851941=/root/.ansible/tmp/ansible-tmp-1727204597.4225385-44580-219776871851941 <<< 44071 1727204597.46270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204597.46300: stderr chunk (state=3): >>><<< 44071 1727204597.46304: stdout chunk (state=3): >>><<< 44071 1727204597.46328: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204597.4225385-44580-219776871851941=/root/.ansible/tmp/ansible-tmp-1727204597.4225385-44580-219776871851941 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204597.46674: variable 'ansible_module_compression' from source: unknown 44071 1727204597.46678: ANSIBALLZ: Using lock for package_facts 44071 1727204597.46681: ANSIBALLZ: Acquiring 
lock 44071 1727204597.46684: ANSIBALLZ: Lock acquired: 140077512061392 44071 1727204597.46686: ANSIBALLZ: Creating module 44071 1727204598.01230: ANSIBALLZ: Writing module into payload 44071 1727204598.01521: ANSIBALLZ: Writing module 44071 1727204598.01571: ANSIBALLZ: Renaming module 44071 1727204598.01651: ANSIBALLZ: Done creating module 44071 1727204598.01703: variable 'ansible_facts' from source: unknown 44071 1727204598.02442: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204597.4225385-44580-219776871851941/AnsiballZ_package_facts.py 44071 1727204598.02609: Sending initial data 44071 1727204598.02621: Sent initial data (162 bytes) 44071 1727204598.03898: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204598.03983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204598.04143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204598.04233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204598.05997: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204598.06062: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204598.06173: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp34a8mz1c /root/.ansible/tmp/ansible-tmp-1727204597.4225385-44580-219776871851941/AnsiballZ_package_facts.py <<< 44071 1727204598.06185: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204597.4225385-44580-219776871851941/AnsiballZ_package_facts.py" <<< 44071 1727204598.06324: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp34a8mz1c" to remote "/root/.ansible/tmp/ansible-tmp-1727204597.4225385-44580-219776871851941/AnsiballZ_package_facts.py" <<< 44071 1727204598.06340: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204597.4225385-44580-219776871851941/AnsiballZ_package_facts.py" <<< 44071 1727204598.09288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204598.09390: stderr chunk (state=3): >>><<< 44071 1727204598.09394: stdout chunk (state=3): >>><<< 44071 1727204598.09425: done transferring module to remote 44071 1727204598.09441: _low_level_execute_command(): starting 44071 1727204598.09444: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204597.4225385-44580-219776871851941/ /root/.ansible/tmp/ansible-tmp-1727204597.4225385-44580-219776871851941/AnsiballZ_package_facts.py && sleep 0' 44071 1727204598.11175: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204598.11313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204598.11326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204598.11603: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204598.11608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204598.11611: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204598.11622: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204598.11993: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204598.12240: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204598.14067: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204598.14080: stderr chunk (state=3): >>><<< 44071 1727204598.14083: stdout chunk (state=3): >>><<< 44071 1727204598.14099: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204598.14103: _low_level_execute_command(): starting 44071 1727204598.14109: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204597.4225385-44580-219776871851941/AnsiballZ_package_facts.py && sleep 0' 44071 1727204598.15519: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204598.15897: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204598.15919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204598.16212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204598.78677: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": 
"libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"na<<< 44071 1727204598.78699: stdout chunk (state=3): >>>me": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", 
"release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 44071 1727204598.78723: stdout chunk (state=3): >>>systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", 
"release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "li<<< 44071 1727204598.78740: stdout chunk (state=3): >>>breport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": 
"pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-l<<< 44071 1727204598.78774: stdout chunk (state=3): >>>ibs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", 
"release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lib<<< 44071 1727204598.78786: stdout chunk (state=3): >>>xmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_<<< 44071 1727204598.78839: stdout chunk (state=3): >>>64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", 
"release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "<<< 44071 1727204598.78858: stdout chunk (state=3): >>>rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": 
"perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", 
"version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}]<<< 44071 1727204598.78872: stdout chunk (state=3): >>>, "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": 
"cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", 
"source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoc<<< 44071 1727204598.78880: stdout chunk (state=3): >>>h": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": 
"python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "s<<< 44071 1727204598.78902: stdout chunk (state=3): >>>ource": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 44071 1727204598.80808: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204598.80871: stderr chunk (state=3): >>><<< 44071 1727204598.80875: stdout chunk (state=3): >>><<< 44071 1727204598.80922: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": 
"libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", 
"release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": 
[{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": 
"3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", 
"version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", 
"version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": 
"1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": 
"wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": 
"python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204598.83386: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204597.4225385-44580-219776871851941/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204598.83429: _low_level_execute_command(): starting 44071 1727204598.83440: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204597.4225385-44580-219776871851941/ > /dev/null 2>&1 && sleep 0' 44071 1727204598.84074: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204598.84078: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204598.84081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204598.84119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204598.84123: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204598.84126: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204598.84128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204598.84131: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204598.84134: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204598.84143: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44071 1727204598.84185: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204598.84188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204598.84191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204598.84194: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204598.84196: stderr chunk (state=3): >>>debug2: match found <<< 44071 1727204598.84199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204598.84270: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204598.84277: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204598.84297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204598.84396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204598.86403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204598.86408: stderr chunk (state=3): >>><<< 44071 1727204598.86411: stdout chunk (state=3): >>><<< 44071 1727204598.86419: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204598.86427: handler run complete 44071 1727204598.87721: variable 'ansible_facts' from source: unknown 44071 1727204598.88219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204598.90918: variable 'ansible_facts' from source: unknown 44071 1727204598.91489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204598.92506: attempt loop complete, returning result 44071 1727204598.92530: _execute() done 44071 1727204598.92533: dumping result to json 44071 1727204598.92815: done dumping result, returning 44071 1727204598.92871: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-c964-7471-00000000026f] 44071 1727204598.92874: sending task result for task 127b8e07-fff9-c964-7471-00000000026f 44071 1727204599.01425: done sending task result for task 
127b8e07-fff9-c964-7471-00000000026f 44071 1727204599.01429: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204599.01532: no more pending results, returning what we have 44071 1727204599.01534: results queue empty 44071 1727204599.01535: checking for any_errors_fatal 44071 1727204599.01540: done checking for any_errors_fatal 44071 1727204599.01541: checking for max_fail_percentage 44071 1727204599.01543: done checking for max_fail_percentage 44071 1727204599.01543: checking to see if all hosts have failed and the running result is not ok 44071 1727204599.01544: done checking to see if all hosts have failed 44071 1727204599.01545: getting the remaining hosts for this loop 44071 1727204599.01546: done getting the remaining hosts for this loop 44071 1727204599.01550: getting the next task for host managed-node2 44071 1727204599.01559: done getting next task for host managed-node2 44071 1727204599.01562: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204599.01572: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204599.01581: getting variables 44071 1727204599.01583: in VariableManager get_vars() 44071 1727204599.01616: Calling all_inventory to load vars for managed-node2 44071 1727204599.01625: Calling groups_inventory to load vars for managed-node2 44071 1727204599.01628: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204599.01638: Calling all_plugins_play to load vars for managed-node2 44071 1727204599.01641: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204599.01645: Calling groups_plugins_play to load vars for managed-node2 44071 1727204599.03389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204599.07330: done with get_vars() 44071 1727204599.07369: done getting variables 44071 1727204599.07440: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:03:19 -0400 (0:00:01.713) 0:00:11.393 ***** 44071 1727204599.07684: entering _queue_task() for managed-node2/debug 44071 1727204599.08264: worker is 1 (out of 1 available) 44071 1727204599.08484: exiting _queue_task() for managed-node2/debug 44071 1727204599.08498: done queuing things up, now waiting for results queue to drain 44071 1727204599.08500: waiting for pending results... 44071 1727204599.08909: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204599.09373: in run() - task 127b8e07-fff9-c964-7471-00000000020d 44071 1727204599.09378: variable 'ansible_search_path' from source: unknown 44071 1727204599.09380: variable 'ansible_search_path' from source: unknown 44071 1727204599.09383: calling self._execute() 44071 1727204599.09773: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204599.09777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204599.09780: variable 'omit' from source: magic vars 44071 1727204599.10573: variable 'ansible_distribution_major_version' from source: facts 44071 1727204599.10580: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204599.10583: variable 'omit' from source: magic vars 44071 1727204599.10587: variable 'omit' from source: magic vars 44071 1727204599.10814: variable 'network_provider' from source: set_fact 44071 1727204599.10834: variable 'omit' from source: magic vars 44071 1727204599.10879: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204599.10923: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204599.11200: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204599.11272: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204599.11276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 
1727204599.11282: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204599.11290: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204599.11298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204599.11410: Set connection var ansible_connection to ssh 44071 1727204599.11486: Set connection var ansible_timeout to 10 44071 1727204599.11495: Set connection var ansible_pipelining to False 44071 1727204599.11503: Set connection var ansible_shell_type to sh 44071 1727204599.11511: Set connection var ansible_shell_executable to /bin/sh 44071 1727204599.11520: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204599.11549: variable 'ansible_shell_executable' from source: unknown 44071 1727204599.11556: variable 'ansible_connection' from source: unknown 44071 1727204599.11772: variable 'ansible_module_compression' from source: unknown 44071 1727204599.11775: variable 'ansible_shell_type' from source: unknown 44071 1727204599.11778: variable 'ansible_shell_executable' from source: unknown 44071 1727204599.11781: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204599.11783: variable 'ansible_pipelining' from source: unknown 44071 1727204599.11785: variable 'ansible_timeout' from source: unknown 44071 1727204599.11788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204599.11886: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204599.12090: variable 'omit' from source: magic vars 44071 1727204599.12103: starting attempt loop 44071 1727204599.12110: running the handler 44071 1727204599.12167: handler run complete 44071 1727204599.12190: attempt loop complete, returning result 44071 1727204599.12277: _execute() done 44071 1727204599.12286: dumping result to json 44071 1727204599.12294: done dumping result, returning 44071 1727204599.12308: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-c964-7471-00000000020d] 44071 1727204599.12319: sending task result for task 127b8e07-fff9-c964-7471-00000000020d ok: [managed-node2] => {} MSG: Using network provider: nm 44071 1727204599.12505: no more pending results, returning what we have 44071 1727204599.12508: results queue empty 44071 1727204599.12509: checking for any_errors_fatal 44071 1727204599.12520: done checking for any_errors_fatal 44071 1727204599.12521: checking for max_fail_percentage 44071 1727204599.12523: done checking for max_fail_percentage 44071 1727204599.12523: checking to see if all hosts have failed and the running result is not ok 44071 1727204599.12524: done checking to see if all hosts have failed 44071 1727204599.12525: getting the remaining hosts for this loop 44071 1727204599.12526: done getting the remaining hosts for this loop 44071 1727204599.12533: getting the next task for host managed-node2 44071 1727204599.12544: done getting next task for host managed-node2 44071 1727204599.12548: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204599.12555: ^ state is: HOST STATE: block=3, 
task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204599.12567: getting variables 44071 1727204599.12570: in VariableManager get_vars() 44071 1727204599.12609: Calling all_inventory to load vars for managed-node2 44071 1727204599.12612: Calling groups_inventory to load vars for managed-node2 44071 1727204599.12614: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204599.12626: Calling all_plugins_play to load vars for managed-node2 44071 1727204599.12629: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204599.12632: Calling groups_plugins_play to load vars for managed-node2 44071 1727204599.13182: done sending task result for task 127b8e07-fff9-c964-7471-00000000020d 44071 1727204599.13186: WORKER PROCESS EXITING 44071 1727204599.16371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204599.21307: done with get_vars() 44071 1727204599.21358: done getting variables 44071 1727204599.21686: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:03:19 -0400 (0:00:00.140) 0:00:11.533 ***** 44071 1727204599.21741: entering _queue_task() for managed-node2/fail 44071 1727204599.21743: Creating lock for fail 44071 1727204599.22560: worker is 1 (out of 1 available) 44071 1727204599.22699: exiting _queue_task() for managed-node2/fail 44071 1727204599.22713: done queuing things up, now waiting for results queue to drain 44071 1727204599.22715: waiting for pending results... 
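For orientation, the "ok: [managed-node2]" result above with MSG "Using network provider: nm" is what a plain debug task over the previously set network_provider fact produces. A minimal sketch, assuming the role simply interpolates that fact (the actual wording at tasks/main.yml:7 may differ):

# tasks/main.yml:7 (sketch) - prints the provider chosen earlier via set_fact
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"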
44071 1727204599.23103: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204599.23456: in run() - task 127b8e07-fff9-c964-7471-00000000020e 44071 1727204599.23509: variable 'ansible_search_path' from source: unknown 44071 1727204599.23523: variable 'ansible_search_path' from source: unknown 44071 1727204599.23579: calling self._execute() 44071 1727204599.23681: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204599.23700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204599.23716: variable 'omit' from source: magic vars 44071 1727204599.24126: variable 'ansible_distribution_major_version' from source: facts 44071 1727204599.24146: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204599.24310: variable 'network_state' from source: role '' defaults 44071 1727204599.24313: Evaluated conditional (network_state != {}): False 44071 1727204599.24316: when evaluation is False, skipping this task 44071 1727204599.24319: _execute() done 44071 1727204599.24327: dumping result to json 44071 1727204599.24419: done dumping result, returning 44071 1727204599.24424: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-c964-7471-00000000020e] 44071 1727204599.24427: sending task result for task 127b8e07-fff9-c964-7471-00000000020e 44071 1727204599.24515: done sending task result for task 127b8e07-fff9-c964-7471-00000000020e 44071 1727204599.24519: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204599.24622: no more pending results, returning what we have 44071 1727204599.24627: results queue empty 44071 1727204599.24628: checking for any_errors_fatal 44071 1727204599.24636: done checking for any_errors_fatal 44071 1727204599.24637: checking for max_fail_percentage 44071 1727204599.24641: done checking for max_fail_percentage 44071 1727204599.24642: checking to see if all hosts have failed and the running result is not ok 44071 1727204599.24642: done checking to see if all hosts have failed 44071 1727204599.24643: getting the remaining hosts for this loop 44071 1727204599.24645: done getting the remaining hosts for this loop 44071 1727204599.24650: getting the next task for host managed-node2 44071 1727204599.24659: done getting next task for host managed-node2 44071 1727204599.24663: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204599.24772: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204599.24811: getting variables 44071 1727204599.24813: in VariableManager get_vars() 44071 1727204599.24901: Calling all_inventory to load vars for managed-node2 44071 1727204599.24905: Calling groups_inventory to load vars for managed-node2 44071 1727204599.24907: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204599.24916: Calling all_plugins_play to load vars for managed-node2 44071 1727204599.24919: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204599.24921: Calling groups_plugins_play to load vars for managed-node2 44071 1727204599.27430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204599.29975: done with get_vars() 44071 1727204599.30014: done getting variables 44071 1727204599.30099: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:03:19 -0400 (0:00:00.083) 0:00:11.617 ***** 44071 1727204599.30135: entering _queue_task() for managed-node2/fail 44071 1727204599.30822: worker is 1 (out of 1 available) 44071 1727204599.30836: exiting _queue_task() for managed-node2/fail 44071 1727204599.30849: done queuing things up, now waiting for results queue to drain 44071 1727204599.30851: waiting for pending results... 
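The skip above comes from a guarded fail task: network_state is loaded from the role defaults and is an empty dict in this run, so the logged clause network_state != {} is False and the task never reaches a failure. A sketch under those assumptions (the failure message and any further when clauses are illustrative, not the role's verbatim source):

# tasks/main.yml:11 (sketch) - only fires when network_state is actually used
- name: >-
    Abort applying the network state configuration if using the
    `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: The network state configuration is not supported with the initscripts provider  # illustrative wording
  when:
    - network_state != {}
    # the log reports only this first (False) clause as the false_condition;
    # the role presumably also tests for the initscripts provider before failing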
44071 1727204599.31310: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204599.31471: in run() - task 127b8e07-fff9-c964-7471-00000000020f 44071 1727204599.31499: variable 'ansible_search_path' from source: unknown 44071 1727204599.31571: variable 'ansible_search_path' from source: unknown 44071 1727204599.31576: calling self._execute() 44071 1727204599.31695: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204599.31707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204599.31721: variable 'omit' from source: magic vars 44071 1727204599.32172: variable 'ansible_distribution_major_version' from source: facts 44071 1727204599.32196: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204599.32338: variable 'network_state' from source: role '' defaults 44071 1727204599.32384: Evaluated conditional (network_state != {}): False 44071 1727204599.32387: when evaluation is False, skipping this task 44071 1727204599.32392: _execute() done 44071 1727204599.32399: dumping result to json 44071 1727204599.32402: done dumping result, returning 44071 1727204599.32405: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-c964-7471-00000000020f] 44071 1727204599.32492: sending task result for task 127b8e07-fff9-c964-7471-00000000020f 44071 1727204599.32671: done sending task result for task 127b8e07-fff9-c964-7471-00000000020f 44071 1727204599.32675: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204599.32734: no more pending results, returning what we have 44071 1727204599.32739: results queue empty 44071 1727204599.32740: checking for any_errors_fatal 44071 1727204599.32750: done checking for any_errors_fatal 44071 1727204599.32751: checking for max_fail_percentage 44071 1727204599.32753: done checking for max_fail_percentage 44071 1727204599.32754: checking to see if all hosts have failed and the running result is not ok 44071 1727204599.32755: done checking to see if all hosts have failed 44071 1727204599.32756: getting the remaining hosts for this loop 44071 1727204599.32758: done getting the remaining hosts for this loop 44071 1727204599.32763: getting the next task for host managed-node2 44071 1727204599.32775: done getting next task for host managed-node2 44071 1727204599.32780: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204599.32787: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204599.32806: getting variables 44071 1727204599.32808: in VariableManager get_vars() 44071 1727204599.32853: Calling all_inventory to load vars for managed-node2 44071 1727204599.32857: Calling groups_inventory to load vars for managed-node2 44071 1727204599.32859: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204599.32994: Calling all_plugins_play to load vars for managed-node2 44071 1727204599.32998: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204599.33003: Calling groups_plugins_play to load vars for managed-node2 44071 1727204599.34885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204599.37316: done with get_vars() 44071 1727204599.37358: done getting variables 44071 1727204599.37435: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:03:19 -0400 (0:00:00.073) 0:00:11.691 ***** 44071 1727204599.37488: entering _queue_task() for managed-node2/fail 44071 1727204599.38079: worker is 1 (out of 1 available) 44071 1727204599.38092: exiting _queue_task() for managed-node2/fail 44071 1727204599.38104: done queuing things up, now waiting for results queue to drain 44071 1727204599.38106: waiting for pending results... 
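The same pattern repeats for the "below 8" guard: each task first logs the shared ansible_distribution_major_version != '6' conditional as True, then evaluates its own when list, where network_state != {} is again False and short-circuits the rest. A sketch that assumes the task additionally compares the distribution major version, as its name suggests (only the network_state clause is visible in the log):

# tasks/main.yml:18 (sketch)
- name: >-
    Abort applying the network state configuration if the system version
    of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying a network state configuration requires EL8 or later on the managed host  # illustrative wording
  when:
    - network_state != {}
    - ansible_distribution_major_version | int < 8  # assumed from the task name; not reached in this run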
44071 1727204599.38352: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204599.38455: in run() - task 127b8e07-fff9-c964-7471-000000000210 44071 1727204599.38479: variable 'ansible_search_path' from source: unknown 44071 1727204599.38488: variable 'ansible_search_path' from source: unknown 44071 1727204599.38535: calling self._execute() 44071 1727204599.38641: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204599.38662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204599.38683: variable 'omit' from source: magic vars 44071 1727204599.39140: variable 'ansible_distribution_major_version' from source: facts 44071 1727204599.39210: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204599.39378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204599.42094: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204599.42195: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204599.42241: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204599.42368: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204599.42373: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204599.42415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204599.42455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204599.42507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204599.42557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204599.42582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204599.42705: variable 'ansible_distribution_major_version' from source: facts 44071 1727204599.42727: Evaluated conditional (ansible_distribution_major_version | int > 9): True 44071 1727204599.43086: variable 'ansible_distribution' from source: facts 44071 1727204599.43090: variable '__network_rh_distros' from source: role '' defaults 44071 1727204599.43092: Evaluated conditional (ansible_distribution in __network_rh_distros): False 44071 1727204599.43094: when evaluation is False, skipping this task 44071 1727204599.43096: _execute() done 44071 1727204599.43098: dumping result to json 44071 1727204599.43100: done dumping result, returning 44071 1727204599.43103: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-c964-7471-000000000210] 44071 1727204599.43105: sending task result for task 127b8e07-fff9-c964-7471-000000000210 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 44071 1727204599.43380: no more pending results, returning what we have 44071 1727204599.43385: results queue empty 44071 1727204599.43386: checking for any_errors_fatal 44071 1727204599.43395: done checking for any_errors_fatal 44071 1727204599.43396: checking for max_fail_percentage 44071 1727204599.43398: done checking for max_fail_percentage 44071 1727204599.43398: checking to see if all hosts have failed and the running result is not ok 44071 1727204599.43399: done checking to see if all hosts have failed 44071 1727204599.43400: getting the remaining hosts for this loop 44071 1727204599.43402: done getting the remaining hosts for this loop 44071 1727204599.43413: getting the next task for host managed-node2 44071 1727204599.43423: done getting next task for host managed-node2 44071 1727204599.43429: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204599.43434: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204599.43454: getting variables 44071 1727204599.43456: in VariableManager get_vars() 44071 1727204599.43502: Calling all_inventory to load vars for managed-node2 44071 1727204599.43505: Calling groups_inventory to load vars for managed-node2 44071 1727204599.43508: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204599.43774: done sending task result for task 127b8e07-fff9-c964-7471-000000000210 44071 1727204599.43778: WORKER PROCESS EXITING 44071 1727204599.43790: Calling all_plugins_play to load vars for managed-node2 44071 1727204599.43794: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204599.43798: Calling groups_plugins_play to load vars for managed-node2 44071 1727204599.46364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204599.48660: done with get_vars() 44071 1727204599.48707: done getting variables 44071 1727204599.48825: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:03:19 -0400 (0:00:00.113) 0:00:11.805 ***** 44071 1727204599.48863: entering _queue_task() for managed-node2/dnf 44071 1727204599.49241: worker is 1 (out of 1 available) 44071 1727204599.49254: exiting _queue_task() for managed-node2/dnf 44071 1727204599.49273: done queuing things up, now waiting for results queue to drain 44071 1727204599.49274: waiting for pending results... 
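For the teaming guard, two conditionals are visible in the log: the EL10-or-later check passes (major version > 9 is True) and the distribution check fails, since the managed host's distribution is not in the role's __network_rh_distros default list, so the abort is skipped. A sketch built only from those two logged clauses (the message, and any check that a team connection is actually requested, are assumptions):

# tasks/main.yml:25 (sketch)
- name: >-
    Abort applying teaming configuration if the system version of the
    managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later  # illustrative wording
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
    # the role presumably also checks whether network_connections defines a
    # team interface; that clause is not reached in this run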
44071 1727204599.49567: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204599.49718: in run() - task 127b8e07-fff9-c964-7471-000000000211 44071 1727204599.49739: variable 'ansible_search_path' from source: unknown 44071 1727204599.49747: variable 'ansible_search_path' from source: unknown 44071 1727204599.49798: calling self._execute() 44071 1727204599.49900: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204599.49912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204599.49925: variable 'omit' from source: magic vars 44071 1727204599.50330: variable 'ansible_distribution_major_version' from source: facts 44071 1727204599.50350: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204599.50589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204599.53222: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204599.53329: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204599.53392: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204599.53433: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204599.53500: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204599.53577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204599.53624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204599.53662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204599.53732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204599.53745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204599.53890: variable 'ansible_distribution' from source: facts 44071 1727204599.53901: variable 'ansible_distribution_major_version' from source: facts 44071 1727204599.53913: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 44071 1727204599.54068: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204599.54226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204599.54265: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204599.54302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204599.54354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204599.54384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204599.54440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204599.54483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204599.54513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204599.54592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204599.54599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204599.54642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204599.54676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204599.54718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204599.54772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204599.54793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204599.55000: variable 'network_connections' from source: include params 44071 1727204599.55025: variable 'interface' from source: play vars 44071 1727204599.55116: variable 'interface' from source: play vars 44071 1727204599.55210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204599.55423: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204599.55572: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204599.55578: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204599.55581: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204599.55609: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204599.55640: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204599.55682: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204599.55717: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204599.55786: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204599.56533: variable 'network_connections' from source: include params 44071 1727204599.56550: variable 'interface' from source: play vars 44071 1727204599.56686: variable 'interface' from source: play vars 44071 1727204599.56690: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204599.56698: when evaluation is False, skipping this task 44071 1727204599.56705: _execute() done 44071 1727204599.56712: dumping result to json 44071 1727204599.56720: done dumping result, returning 44071 1727204599.56733: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000000211] 44071 1727204599.56746: sending task result for task 127b8e07-fff9-c964-7471-000000000211 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204599.56927: no more pending results, returning what we have 44071 1727204599.56930: results queue empty 44071 1727204599.56931: checking for any_errors_fatal 44071 1727204599.56940: done checking for any_errors_fatal 44071 1727204599.56940: checking for max_fail_percentage 44071 1727204599.56942: done checking for max_fail_percentage 44071 1727204599.56943: checking to see if all hosts have failed and the running result is not ok 44071 1727204599.56943: done checking to see if all hosts have failed 44071 1727204599.56944: getting the remaining hosts for this loop 44071 1727204599.56945: done getting the remaining hosts for this loop 44071 1727204599.56950: getting the next task for host managed-node2 44071 1727204599.56961: done getting next task for host managed-node2 44071 1727204599.56964: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204599.56972: ^ state is: 
HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204599.56986: getting variables 44071 1727204599.56988: in VariableManager get_vars() 44071 1727204599.57025: Calling all_inventory to load vars for managed-node2 44071 1727204599.57028: Calling groups_inventory to load vars for managed-node2 44071 1727204599.57029: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204599.57041: Calling all_plugins_play to load vars for managed-node2 44071 1727204599.57044: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204599.57046: Calling groups_plugins_play to load vars for managed-node2 44071 1727204599.57588: done sending task result for task 127b8e07-fff9-c964-7471-000000000211 44071 1727204599.57592: WORKER PROCESS EXITING 44071 1727204599.59401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204599.61827: done with get_vars() 44071 1727204599.61882: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204599.61982: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:03:19 -0400 (0:00:00.131) 0:00:11.936 ***** 44071 1727204599.62018: entering _queue_task() for managed-node2/yum 44071 1727204599.62021: Creating lock for yum 44071 1727204599.62589: worker is 1 (out of 1 available) 44071 1727204599.62601: exiting _queue_task() for managed-node2/yum 44071 1727204599.62614: done queuing things up, now waiting for results queue to drain 44071 1727204599.62615: waiting for pending results... 
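The DNF update check is skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined evaluates True for the connections in play. The log only shows that network_connections comes from include params and references an interface play var; the sketch below is a hypothetical vars file of the kind implied, with invented values, showing why the wireless/team guards stay False:

# play vars (sketch) - hypothetical values; only the variable names and the
# absence of wireless/team interface types are implied by the log
interface: ethtest0            # the real interface name is not shown in this excerpt
network_connections:
  - name: "{{ interface }}"
    interface_name: "{{ interface }}"
    type: ethernet             # no 'wireless' or 'team' entries, hence the skips
    state: up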
44071 1727204599.62794: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204599.62983: in run() - task 127b8e07-fff9-c964-7471-000000000212 44071 1727204599.63005: variable 'ansible_search_path' from source: unknown 44071 1727204599.63014: variable 'ansible_search_path' from source: unknown 44071 1727204599.63074: calling self._execute() 44071 1727204599.63185: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204599.63199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204599.63215: variable 'omit' from source: magic vars 44071 1727204599.63669: variable 'ansible_distribution_major_version' from source: facts 44071 1727204599.63692: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204599.63924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204599.66779: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204599.66809: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204599.66862: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204599.66919: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204599.66955: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204599.67063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204599.67113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204599.67211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204599.67215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204599.67236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204599.67370: variable 'ansible_distribution_major_version' from source: facts 44071 1727204599.67395: Evaluated conditional (ansible_distribution_major_version | int < 8): False 44071 1727204599.67405: when evaluation is False, skipping this task 44071 1727204599.67412: _execute() done 44071 1727204599.67419: dumping result to json 44071 1727204599.67444: done dumping result, returning 44071 1727204599.67460: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000000212] 44071 
1727204599.67542: sending task result for task 127b8e07-fff9-c964-7471-000000000212 44071 1727204599.67636: done sending task result for task 127b8e07-fff9-c964-7471-000000000212 44071 1727204599.67644: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 44071 1727204599.67707: no more pending results, returning what we have 44071 1727204599.67711: results queue empty 44071 1727204599.67712: checking for any_errors_fatal 44071 1727204599.67718: done checking for any_errors_fatal 44071 1727204599.67719: checking for max_fail_percentage 44071 1727204599.67721: done checking for max_fail_percentage 44071 1727204599.67722: checking to see if all hosts have failed and the running result is not ok 44071 1727204599.67722: done checking to see if all hosts have failed 44071 1727204599.67723: getting the remaining hosts for this loop 44071 1727204599.67725: done getting the remaining hosts for this loop 44071 1727204599.67731: getting the next task for host managed-node2 44071 1727204599.67743: done getting next task for host managed-node2 44071 1727204599.67747: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204599.67971: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204599.67989: getting variables 44071 1727204599.67990: in VariableManager get_vars() 44071 1727204599.68030: Calling all_inventory to load vars for managed-node2 44071 1727204599.68033: Calling groups_inventory to load vars for managed-node2 44071 1727204599.68035: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204599.68049: Calling all_plugins_play to load vars for managed-node2 44071 1727204599.68053: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204599.68057: Calling groups_plugins_play to load vars for managed-node2 44071 1727204599.69254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204599.70792: done with get_vars() 44071 1727204599.70824: done getting variables 44071 1727204599.70896: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:03:19 -0400 (0:00:00.089) 0:00:12.025 ***** 44071 1727204599.70935: entering _queue_task() for managed-node2/fail 44071 1727204599.71308: worker is 1 (out of 1 available) 44071 1727204599.71321: exiting _queue_task() for managed-node2/fail 44071 1727204599.71335: done queuing things up, now waiting for results queue to drain 44071 1727204599.71339: waiting for pending results... 
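The YUM counterpart above is skipped on the version clause alone (the major version is not below 8), and the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line shows that even if it ran, the yum action would be served by the dnf action plugin on this host. A sketch from the logged clause; the module arguments are assumptions, as they do not appear in the log:

# tasks/main.yml:48 (sketch)
- name: >-
    Check if updates for network packages are available through the YUM
    package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ network_packages }}"  # assumed argument
    state: latest                   # assumed argument
  when:
    - ansible_distribution_major_version | int < 8
    # presumably guarded by the same wireless/team conditions as the DNF variant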
44071 1727204599.71649: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204599.71752: in run() - task 127b8e07-fff9-c964-7471-000000000213 44071 1727204599.71768: variable 'ansible_search_path' from source: unknown 44071 1727204599.71772: variable 'ansible_search_path' from source: unknown 44071 1727204599.71806: calling self._execute() 44071 1727204599.71886: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204599.71893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204599.71902: variable 'omit' from source: magic vars 44071 1727204599.72208: variable 'ansible_distribution_major_version' from source: facts 44071 1727204599.72221: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204599.72312: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204599.72468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204599.80268: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204599.80374: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204599.80408: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204599.80485: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204599.80564: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204599.80632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204599.80691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204599.80730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204599.80831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204599.80895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204599.80995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204599.81018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204599.81109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204599.81133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204599.81168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204599.81234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204599.81277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204599.81322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204599.81432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204599.81435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204599.81635: variable 'network_connections' from source: include params 44071 1727204599.81667: variable 'interface' from source: play vars 44071 1727204599.81779: variable 'interface' from source: play vars 44071 1727204599.81876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204599.82092: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204599.82147: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204599.82196: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204599.82232: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204599.82471: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204599.82475: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204599.82477: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204599.82480: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204599.82482: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204599.82862: variable 'network_connections' 
from source: include params 44071 1727204599.82886: variable 'interface' from source: play vars 44071 1727204599.82982: variable 'interface' from source: play vars 44071 1727204599.83049: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204599.83060: when evaluation is False, skipping this task 44071 1727204599.83071: _execute() done 44071 1727204599.83079: dumping result to json 44071 1727204599.83094: done dumping result, returning 44071 1727204599.83122: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000000213] 44071 1727204599.83131: sending task result for task 127b8e07-fff9-c964-7471-000000000213 44071 1727204599.83502: done sending task result for task 127b8e07-fff9-c964-7471-000000000213 44071 1727204599.83506: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204599.83584: no more pending results, returning what we have 44071 1727204599.83588: results queue empty 44071 1727204599.83589: checking for any_errors_fatal 44071 1727204599.83599: done checking for any_errors_fatal 44071 1727204599.83600: checking for max_fail_percentage 44071 1727204599.83601: done checking for max_fail_percentage 44071 1727204599.83602: checking to see if all hosts have failed and the running result is not ok 44071 1727204599.83603: done checking to see if all hosts have failed 44071 1727204599.83604: getting the remaining hosts for this loop 44071 1727204599.83606: done getting the remaining hosts for this loop 44071 1727204599.83611: getting the next task for host managed-node2 44071 1727204599.83621: done getting next task for host managed-node2 44071 1727204599.83625: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 44071 1727204599.83631: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204599.83650: getting variables 44071 1727204599.83652: in VariableManager get_vars() 44071 1727204599.83697: Calling all_inventory to load vars for managed-node2 44071 1727204599.83700: Calling groups_inventory to load vars for managed-node2 44071 1727204599.83702: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204599.83713: Calling all_plugins_play to load vars for managed-node2 44071 1727204599.83716: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204599.83719: Calling groups_plugins_play to load vars for managed-node2 44071 1727204599.88528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204599.89743: done with get_vars() 44071 1727204599.89774: done getting variables 44071 1727204599.89816: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:03:19 -0400 (0:00:00.189) 0:00:12.214 ***** 44071 1727204599.89840: entering _queue_task() for managed-node2/package 44071 1727204599.90122: worker is 1 (out of 1 available) 44071 1727204599.90138: exiting _queue_task() for managed-node2/package 44071 1727204599.90153: done queuing things up, now waiting for results queue to drain 44071 1727204599.90155: waiting for pending results... 44071 1727204599.90345: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 44071 1727204599.90453: in run() - task 127b8e07-fff9-c964-7471-000000000214 44071 1727204599.90465: variable 'ansible_search_path' from source: unknown 44071 1727204599.90470: variable 'ansible_search_path' from source: unknown 44071 1727204599.90506: calling self._execute() 44071 1727204599.90588: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204599.90593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204599.90607: variable 'omit' from source: magic vars 44071 1727204599.90919: variable 'ansible_distribution_major_version' from source: facts 44071 1727204599.90930: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204599.91087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204599.91308: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204599.91346: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204599.91400: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204599.91428: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204599.91524: variable 'network_packages' from source: role '' defaults 44071 1727204599.91614: variable '__network_provider_setup' from source: role '' defaults 44071 1727204599.91627: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204599.91687: variable 
'__network_service_name_default_nm' from source: role '' defaults 44071 1727204599.91697: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204599.91742: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204599.91882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204599.93559: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204599.93610: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204599.93652: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204599.93681: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204599.93701: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204599.93772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204599.93794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204599.93812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204599.93842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204599.93855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204599.93897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204599.93914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204599.93932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204599.93962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204599.93975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204599.94136: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204599.94225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204599.94245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204599.94263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204599.94294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204599.94306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204599.94379: variable 'ansible_python' from source: facts 44071 1727204599.94394: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204599.94467: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204599.94527: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204599.94623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204599.94646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204599.94664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204599.94692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204599.94703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204599.94740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204599.94768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204599.94786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204599.94812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204599.94824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204599.94931: variable 'network_connections' from source: include params 44071 1727204599.94937: variable 'interface' from source: play vars 44071 1727204599.95018: variable 'interface' from source: play vars 44071 1727204599.95083: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204599.95103: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204599.95125: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204599.95150: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204599.95191: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204599.95391: variable 'network_connections' from source: include params 44071 1727204599.95397: variable 'interface' from source: play vars 44071 1727204599.95470: variable 'interface' from source: play vars 44071 1727204599.95514: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204599.95574: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204599.95789: variable 'network_connections' from source: include params 44071 1727204599.95793: variable 'interface' from source: play vars 44071 1727204599.95846: variable 'interface' from source: play vars 44071 1727204599.95867: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204599.95923: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204599.96138: variable 'network_connections' from source: include params 44071 1727204599.96146: variable 'interface' from source: play vars 44071 1727204599.96198: variable 'interface' from source: play vars 44071 1727204599.96249: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204599.96296: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204599.96302: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204599.96348: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204599.96503: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204599.96845: variable 'network_connections' from source: include params 44071 1727204599.96848: variable 'interface' from source: play vars 44071 1727204599.96896: variable 'interface' from source: play vars 44071 1727204599.96907: variable 'ansible_distribution' from source: facts 44071 1727204599.96910: variable '__network_rh_distros' from source: role '' defaults 44071 1727204599.96912: variable 'ansible_distribution_major_version' from source: facts 44071 1727204599.96934: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204599.97055: variable 'ansible_distribution' from source: 
facts 44071 1727204599.97058: variable '__network_rh_distros' from source: role '' defaults 44071 1727204599.97063: variable 'ansible_distribution_major_version' from source: facts 44071 1727204599.97071: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204599.97189: variable 'ansible_distribution' from source: facts 44071 1727204599.97193: variable '__network_rh_distros' from source: role '' defaults 44071 1727204599.97197: variable 'ansible_distribution_major_version' from source: facts 44071 1727204599.97224: variable 'network_provider' from source: set_fact 44071 1727204599.97237: variable 'ansible_facts' from source: unknown 44071 1727204599.97783: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 44071 1727204599.97787: when evaluation is False, skipping this task 44071 1727204599.97790: _execute() done 44071 1727204599.97792: dumping result to json 44071 1727204599.97794: done dumping result, returning 44071 1727204599.97802: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-c964-7471-000000000214] 44071 1727204599.97807: sending task result for task 127b8e07-fff9-c964-7471-000000000214 44071 1727204599.97904: done sending task result for task 127b8e07-fff9-c964-7471-000000000214 44071 1727204599.97907: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 44071 1727204599.97959: no more pending results, returning what we have 44071 1727204599.97961: results queue empty 44071 1727204599.97962: checking for any_errors_fatal 44071 1727204599.97972: done checking for any_errors_fatal 44071 1727204599.97973: checking for max_fail_percentage 44071 1727204599.97974: done checking for max_fail_percentage 44071 1727204599.97975: checking to see if all hosts have failed and the running result is not ok 44071 1727204599.97976: done checking to see if all hosts have failed 44071 1727204599.97976: getting the remaining hosts for this loop 44071 1727204599.97978: done getting the remaining hosts for this loop 44071 1727204599.97983: getting the next task for host managed-node2 44071 1727204599.97991: done getting next task for host managed-node2 44071 1727204599.97995: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204599.98000: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204599.98016: getting variables 44071 1727204599.98017: in VariableManager get_vars() 44071 1727204599.98061: Calling all_inventory to load vars for managed-node2 44071 1727204599.98064: Calling groups_inventory to load vars for managed-node2 44071 1727204599.98074: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204599.98086: Calling all_plugins_play to load vars for managed-node2 44071 1727204599.98089: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204599.98092: Calling groups_plugins_play to load vars for managed-node2 44071 1727204599.99207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204600.00432: done with get_vars() 44071 1727204600.00461: done getting variables 44071 1727204600.00514: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:03:20 -0400 (0:00:00.107) 0:00:12.321 ***** 44071 1727204600.00552: entering _queue_task() for managed-node2/package 44071 1727204600.00831: worker is 1 (out of 1 available) 44071 1727204600.00847: exiting _queue_task() for managed-node2/package 44071 1727204600.00860: done queuing things up, now waiting for results queue to drain 44071 1727204600.00862: waiting for pending results... 
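
The skip recorded just above comes from the role's "Install packages" task at roles/network/tasks/main.yml:73: its when expression, "not network_packages is subset(ansible_facts.packages.keys())", evaluated to False because everything in network_packages is already present in the gathered package facts on managed-node2. A minimal sketch of a task gated the same way is shown here; the task name and the when condition are taken from the log, while the package module arguments are assumptions, not the role's actual source.

    # Sketch only: install the role's package list, but skip the task when every
    # requested package already appears in the gathered package facts.
    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"   # list assumed; the log shows it resolved from role defaults
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())
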
44071 1727204600.01056: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204600.01179: in run() - task 127b8e07-fff9-c964-7471-000000000215 44071 1727204600.01192: variable 'ansible_search_path' from source: unknown 44071 1727204600.01196: variable 'ansible_search_path' from source: unknown 44071 1727204600.01233: calling self._execute() 44071 1727204600.01313: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204600.01317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204600.01327: variable 'omit' from source: magic vars 44071 1727204600.01635: variable 'ansible_distribution_major_version' from source: facts 44071 1727204600.01646: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204600.01740: variable 'network_state' from source: role '' defaults 44071 1727204600.01747: Evaluated conditional (network_state != {}): False 44071 1727204600.01751: when evaluation is False, skipping this task 44071 1727204600.01754: _execute() done 44071 1727204600.01759: dumping result to json 44071 1727204600.01762: done dumping result, returning 44071 1727204600.01775: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-c964-7471-000000000215] 44071 1727204600.01778: sending task result for task 127b8e07-fff9-c964-7471-000000000215 44071 1727204600.01887: done sending task result for task 127b8e07-fff9-c964-7471-000000000215 44071 1727204600.01889: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204600.01941: no more pending results, returning what we have 44071 1727204600.01945: results queue empty 44071 1727204600.01946: checking for any_errors_fatal 44071 1727204600.01952: done checking for any_errors_fatal 44071 1727204600.01953: checking for max_fail_percentage 44071 1727204600.01954: done checking for max_fail_percentage 44071 1727204600.01955: checking to see if all hosts have failed and the running result is not ok 44071 1727204600.01956: done checking to see if all hosts have failed 44071 1727204600.01957: getting the remaining hosts for this loop 44071 1727204600.01958: done getting the remaining hosts for this loop 44071 1727204600.01963: getting the next task for host managed-node2 44071 1727204600.01979: done getting next task for host managed-node2 44071 1727204600.01984: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204600.01989: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204600.02005: getting variables 44071 1727204600.02007: in VariableManager get_vars() 44071 1727204600.02045: Calling all_inventory to load vars for managed-node2 44071 1727204600.02048: Calling groups_inventory to load vars for managed-node2 44071 1727204600.02049: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204600.02059: Calling all_plugins_play to load vars for managed-node2 44071 1727204600.02061: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204600.02064: Calling groups_plugins_play to load vars for managed-node2 44071 1727204600.03078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204600.04395: done with get_vars() 44071 1727204600.04419: done getting variables 44071 1727204600.04474: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:03:20 -0400 (0:00:00.039) 0:00:12.361 ***** 44071 1727204600.04502: entering _queue_task() for managed-node2/package 44071 1727204600.04783: worker is 1 (out of 1 available) 44071 1727204600.04798: exiting _queue_task() for managed-node2/package 44071 1727204600.04811: done queuing things up, now waiting for results queue to drain 44071 1727204600.04813: waiting for pending results... 
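
Both network_state-gated install tasks in this stretch of the log (main.yml:85 above and main.yml:96 queued next) are skipped for the same reason: network_state still holds the role default of {}, so the condition "network_state != {}" is False. A sketch of the pattern follows; the condition is taken from the log, while the package list is inferred from the task title rather than from the role's source.

    # Sketch only: extra packages that would be installed only when a declarative
    # network_state is passed to the role.
    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager
          - nmstate
        state: present
      when: network_state != {}
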
44071 1727204600.05000: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204600.05105: in run() - task 127b8e07-fff9-c964-7471-000000000216 44071 1727204600.05117: variable 'ansible_search_path' from source: unknown 44071 1727204600.05121: variable 'ansible_search_path' from source: unknown 44071 1727204600.05160: calling self._execute() 44071 1727204600.05233: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204600.05241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204600.05248: variable 'omit' from source: magic vars 44071 1727204600.05548: variable 'ansible_distribution_major_version' from source: facts 44071 1727204600.05560: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204600.05656: variable 'network_state' from source: role '' defaults 44071 1727204600.05667: Evaluated conditional (network_state != {}): False 44071 1727204600.05670: when evaluation is False, skipping this task 44071 1727204600.05673: _execute() done 44071 1727204600.05676: dumping result to json 44071 1727204600.05678: done dumping result, returning 44071 1727204600.05693: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-c964-7471-000000000216] 44071 1727204600.05697: sending task result for task 127b8e07-fff9-c964-7471-000000000216 44071 1727204600.05801: done sending task result for task 127b8e07-fff9-c964-7471-000000000216 44071 1727204600.05804: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204600.05857: no more pending results, returning what we have 44071 1727204600.05861: results queue empty 44071 1727204600.05862: checking for any_errors_fatal 44071 1727204600.05875: done checking for any_errors_fatal 44071 1727204600.05876: checking for max_fail_percentage 44071 1727204600.05877: done checking for max_fail_percentage 44071 1727204600.05878: checking to see if all hosts have failed and the running result is not ok 44071 1727204600.05879: done checking to see if all hosts have failed 44071 1727204600.05879: getting the remaining hosts for this loop 44071 1727204600.05881: done getting the remaining hosts for this loop 44071 1727204600.05887: getting the next task for host managed-node2 44071 1727204600.05895: done getting next task for host managed-node2 44071 1727204600.05900: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204600.05905: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204600.05921: getting variables 44071 1727204600.05922: in VariableManager get_vars() 44071 1727204600.05959: Calling all_inventory to load vars for managed-node2 44071 1727204600.05962: Calling groups_inventory to load vars for managed-node2 44071 1727204600.05964: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204600.05980: Calling all_plugins_play to load vars for managed-node2 44071 1727204600.05983: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204600.05986: Calling groups_plugins_play to load vars for managed-node2 44071 1727204600.06981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204600.08187: done with get_vars() 44071 1727204600.08213: done getting variables 44071 1727204600.08304: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:03:20 -0400 (0:00:00.038) 0:00:12.399 ***** 44071 1727204600.08331: entering _queue_task() for managed-node2/service 44071 1727204600.08332: Creating lock for service 44071 1727204600.08616: worker is 1 (out of 1 available) 44071 1727204600.08632: exiting _queue_task() for managed-node2/service 44071 1727204600.08648: done queuing things up, now waiting for results queue to drain 44071 1727204600.08650: waiting for pending results... 
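
The restart task queued above (main.yml:109) is evaluated next and, as its result below shows, is skipped because neither wireless nor team connections are defined for this play. A sketch of a restart task gated on that condition; the task name and when expression come from the log, while the service module arguments are assumptions rather than the role's actual source.

    # Sketch only: restart NetworkManager only when wireless or team profiles are
    # part of the requested connections.
    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined
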
44071 1727204600.08844: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204600.08941: in run() - task 127b8e07-fff9-c964-7471-000000000217 44071 1727204600.08952: variable 'ansible_search_path' from source: unknown 44071 1727204600.08956: variable 'ansible_search_path' from source: unknown 44071 1727204600.08997: calling self._execute() 44071 1727204600.09070: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204600.09073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204600.09082: variable 'omit' from source: magic vars 44071 1727204600.09388: variable 'ansible_distribution_major_version' from source: facts 44071 1727204600.09399: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204600.09494: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204600.09651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204600.11341: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204600.11693: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204600.11726: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204600.11753: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204600.11775: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204600.11849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204600.11872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204600.11891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204600.11920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204600.11934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204600.11974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204600.11991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204600.12008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 44071 1727204600.12043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204600.12051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204600.12085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204600.12101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204600.12118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204600.12149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204600.12160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204600.12287: variable 'network_connections' from source: include params 44071 1727204600.12298: variable 'interface' from source: play vars 44071 1727204600.12358: variable 'interface' from source: play vars 44071 1727204600.12416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204600.12544: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204600.12575: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204600.12602: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204600.12625: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204600.12675: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204600.12695: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204600.12713: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204600.12731: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204600.12784: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204600.12966: variable 'network_connections' from source: include params 44071 1727204600.12970: variable 'interface' 
from source: play vars 44071 1727204600.13022: variable 'interface' from source: play vars 44071 1727204600.13048: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204600.13052: when evaluation is False, skipping this task 44071 1727204600.13054: _execute() done 44071 1727204600.13057: dumping result to json 44071 1727204600.13060: done dumping result, returning 44071 1727204600.13070: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000000217] 44071 1727204600.13074: sending task result for task 127b8e07-fff9-c964-7471-000000000217 44071 1727204600.13176: done sending task result for task 127b8e07-fff9-c964-7471-000000000217 44071 1727204600.13186: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204600.13234: no more pending results, returning what we have 44071 1727204600.13240: results queue empty 44071 1727204600.13241: checking for any_errors_fatal 44071 1727204600.13250: done checking for any_errors_fatal 44071 1727204600.13251: checking for max_fail_percentage 44071 1727204600.13252: done checking for max_fail_percentage 44071 1727204600.13253: checking to see if all hosts have failed and the running result is not ok 44071 1727204600.13254: done checking to see if all hosts have failed 44071 1727204600.13254: getting the remaining hosts for this loop 44071 1727204600.13256: done getting the remaining hosts for this loop 44071 1727204600.13261: getting the next task for host managed-node2 44071 1727204600.13270: done getting next task for host managed-node2 44071 1727204600.13274: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204600.13280: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204600.13296: getting variables 44071 1727204600.13297: in VariableManager get_vars() 44071 1727204600.13334: Calling all_inventory to load vars for managed-node2 44071 1727204600.13339: Calling groups_inventory to load vars for managed-node2 44071 1727204600.13342: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204600.13352: Calling all_plugins_play to load vars for managed-node2 44071 1727204600.13355: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204600.13357: Calling groups_plugins_play to load vars for managed-node2 44071 1727204600.14524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204600.15784: done with get_vars() 44071 1727204600.15812: done getting variables 44071 1727204600.15885: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:03:20 -0400 (0:00:00.075) 0:00:12.475 ***** 44071 1727204600.15912: entering _queue_task() for managed-node2/service 44071 1727204600.16254: worker is 1 (out of 1 available) 44071 1727204600.16271: exiting _queue_task() for managed-node2/service 44071 1727204600.16285: done queuing things up, now waiting for results queue to drain 44071 1727204600.16287: waiting for pending results... 44071 1727204600.16485: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204600.16591: in run() - task 127b8e07-fff9-c964-7471-000000000218 44071 1727204600.16604: variable 'ansible_search_path' from source: unknown 44071 1727204600.16607: variable 'ansible_search_path' from source: unknown 44071 1727204600.16643: calling self._execute() 44071 1727204600.16716: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204600.16724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204600.16732: variable 'omit' from source: magic vars 44071 1727204600.17039: variable 'ansible_distribution_major_version' from source: facts 44071 1727204600.17049: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204600.17175: variable 'network_provider' from source: set_fact 44071 1727204600.17179: variable 'network_state' from source: role '' defaults 44071 1727204600.17188: Evaluated conditional (network_provider == "nm" or network_state != {}): True 44071 1727204600.17195: variable 'omit' from source: magic vars 44071 1727204600.17247: variable 'omit' from source: magic vars 44071 1727204600.17272: variable 'network_service_name' from source: role '' defaults 44071 1727204600.17327: variable 'network_service_name' from source: role '' defaults 44071 1727204600.17410: variable '__network_provider_setup' from source: role '' defaults 44071 1727204600.17414: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204600.17467: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204600.17474: variable '__network_packages_default_nm' from source: role '' defaults 
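
Unlike the preceding tasks, "Enable and start NetworkManager" (main.yml:122) passes its condition: "network_provider == \"nm\" or network_state != {}" evaluates to True, so the service action is actually executed over SSH in the lines that follow. A sketch of a service task matching what the log evaluates here; the condition and the network_service_name variable appear in the log, while the remaining module arguments are assumptions, not the role's actual source.

    # Sketch only: make sure the selected provider's service is enabled and running
    # when the NetworkManager provider (or a declarative network_state) is in use.
    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"   # resolved from role defaults in the log
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}
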
44071 1727204600.17521: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204600.17694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204600.20652: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204600.20725: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204600.20761: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204600.20789: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204600.20809: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204600.20883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204600.20905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204600.20925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204600.20960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204600.20973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204600.21040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204600.21075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204600.21270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204600.21274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204600.21277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204600.21442: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204600.21771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204600.21855: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204600.21889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204600.21944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204600.21963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204600.22085: variable 'ansible_python' from source: facts 44071 1727204600.22109: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204600.22217: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204600.22316: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204600.22469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204600.22514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204600.22544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204600.22602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204600.22621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204600.22691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204600.23007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204600.23010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204600.23013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204600.23015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204600.23430: variable 'network_connections' from 
source: include params 44071 1727204600.23453: variable 'interface' from source: play vars 44071 1727204600.23652: variable 'interface' from source: play vars 44071 1727204600.23952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204600.24476: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204600.24623: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204600.24675: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204600.24735: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204600.24933: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204600.24993: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204600.25105: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204600.25232: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204600.25358: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204600.26170: variable 'network_connections' from source: include params 44071 1727204600.26185: variable 'interface' from source: play vars 44071 1727204600.26350: variable 'interface' from source: play vars 44071 1727204600.26877: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204600.26915: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204600.28007: variable 'network_connections' from source: include params 44071 1727204600.28147: variable 'interface' from source: play vars 44071 1727204600.28355: variable 'interface' from source: play vars 44071 1727204600.28394: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204600.28608: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204600.29340: variable 'network_connections' from source: include params 44071 1727204600.29354: variable 'interface' from source: play vars 44071 1727204600.29442: variable 'interface' from source: play vars 44071 1727204600.29634: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204600.29883: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204600.29886: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204600.29916: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204600.30777: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204600.31720: variable 'network_connections' from source: include params 44071 1727204600.31737: variable 'interface' from source: play vars 44071 1727204600.31826: variable 'interface' from 
source: play vars 44071 1727204600.31986: variable 'ansible_distribution' from source: facts 44071 1727204600.31995: variable '__network_rh_distros' from source: role '' defaults 44071 1727204600.32005: variable 'ansible_distribution_major_version' from source: facts 44071 1727204600.32036: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204600.32471: variable 'ansible_distribution' from source: facts 44071 1727204600.32481: variable '__network_rh_distros' from source: role '' defaults 44071 1727204600.32522: variable 'ansible_distribution_major_version' from source: facts 44071 1727204600.32533: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204600.32969: variable 'ansible_distribution' from source: facts 44071 1727204600.32980: variable '__network_rh_distros' from source: role '' defaults 44071 1727204600.32990: variable 'ansible_distribution_major_version' from source: facts 44071 1727204600.33031: variable 'network_provider' from source: set_fact 44071 1727204600.33194: variable 'omit' from source: magic vars 44071 1727204600.33230: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204600.33265: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204600.33491: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204600.33494: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204600.33497: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204600.33499: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204600.33502: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204600.33504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204600.33720: Set connection var ansible_connection to ssh 44071 1727204600.33732: Set connection var ansible_timeout to 10 44071 1727204600.33742: Set connection var ansible_pipelining to False 44071 1727204600.33752: Set connection var ansible_shell_type to sh 44071 1727204600.33924: Set connection var ansible_shell_executable to /bin/sh 44071 1727204600.33927: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204600.33930: variable 'ansible_shell_executable' from source: unknown 44071 1727204600.33932: variable 'ansible_connection' from source: unknown 44071 1727204600.33934: variable 'ansible_module_compression' from source: unknown 44071 1727204600.33936: variable 'ansible_shell_type' from source: unknown 44071 1727204600.33938: variable 'ansible_shell_executable' from source: unknown 44071 1727204600.33940: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204600.33942: variable 'ansible_pipelining' from source: unknown 44071 1727204600.33944: variable 'ansible_timeout' from source: unknown 44071 1727204600.33946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204600.34190: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204600.34214: variable 'omit' from source: magic vars 44071 1727204600.34257: starting attempt loop 44071 1727204600.34266: running the handler 44071 1727204600.34489: variable 'ansible_facts' from source: unknown 44071 1727204600.37112: _low_level_execute_command(): starting 44071 1727204600.37117: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204600.38705: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204600.38725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204600.38834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204600.38872: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204600.38964: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204600.39398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204600.40975: stdout chunk (state=3): >>>/root <<< 44071 1727204600.41076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204600.41151: stderr chunk (state=3): >>><<< 44071 1727204600.41352: stdout chunk (state=3): >>><<< 44071 1727204600.41356: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 
1727204600.41359: _low_level_execute_command(): starting 44071 1727204600.41362: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204600.412777-44665-186824594019364 `" && echo ansible-tmp-1727204600.412777-44665-186824594019364="` echo /root/.ansible/tmp/ansible-tmp-1727204600.412777-44665-186824594019364 `" ) && sleep 0' 44071 1727204600.42572: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204600.42589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204600.42803: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204600.42808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204600.42879: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204600.43018: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204600.43095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204600.45089: stdout chunk (state=3): >>>ansible-tmp-1727204600.412777-44665-186824594019364=/root/.ansible/tmp/ansible-tmp-1727204600.412777-44665-186824594019364 <<< 44071 1727204600.45196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204600.45281: stderr chunk (state=3): >>><<< 44071 1727204600.45284: stdout chunk (state=3): >>><<< 44071 1727204600.45298: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204600.412777-44665-186824594019364=/root/.ansible/tmp/ansible-tmp-1727204600.412777-44665-186824594019364 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204600.45575: variable 'ansible_module_compression' from source: unknown 44071 1727204600.45580: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 44071 1727204600.45584: ANSIBALLZ: Acquiring lock 44071 1727204600.45586: ANSIBALLZ: Lock acquired: 140077513493248 44071 1727204600.45588: ANSIBALLZ: Creating module 44071 1727204600.99841: ANSIBALLZ: Writing module into payload 44071 1727204601.00051: ANSIBALLZ: Writing module 44071 1727204601.00094: ANSIBALLZ: Renaming module 44071 1727204601.00100: ANSIBALLZ: Done creating module 44071 1727204601.00124: variable 'ansible_facts' from source: unknown 44071 1727204601.00327: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204600.412777-44665-186824594019364/AnsiballZ_systemd.py 44071 1727204601.00537: Sending initial data 44071 1727204601.00540: Sent initial data (155 bytes) 44071 1727204601.01185: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204601.01196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204601.01282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204601.01301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204601.01319: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204601.01426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204601.03208: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204601.03306: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204601.03388: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpej8b18on /root/.ansible/tmp/ansible-tmp-1727204600.412777-44665-186824594019364/AnsiballZ_systemd.py <<< 44071 1727204601.03392: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204600.412777-44665-186824594019364/AnsiballZ_systemd.py" <<< 44071 1727204601.03449: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpej8b18on" to remote "/root/.ansible/tmp/ansible-tmp-1727204600.412777-44665-186824594019364/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204600.412777-44665-186824594019364/AnsiballZ_systemd.py" <<< 44071 1727204601.05217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204601.05495: stderr chunk (state=3): >>><<< 44071 1727204601.05574: stdout chunk (state=3): >>><<< 44071 1727204601.05578: done transferring module to remote 44071 1727204601.05580: _low_level_execute_command(): starting 44071 1727204601.05582: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204600.412777-44665-186824594019364/ /root/.ansible/tmp/ansible-tmp-1727204600.412777-44665-186824594019364/AnsiballZ_systemd.py && sleep 0' 44071 1727204601.06520: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204601.06545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204601.06624: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204601.06655: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204601.06735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204601.08671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204601.08730: stdout chunk (state=3): >>><<< 44071 1727204601.08734: stderr chunk (state=3): >>><<< 44071 1727204601.08880: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204601.08884: _low_level_execute_command(): starting 44071 1727204601.08886: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204600.412777-44665-186824594019364/AnsiballZ_systemd.py && sleep 0' 44071 1727204601.09613: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204601.09772: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204601.09775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204601.09778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204601.09780: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204601.09782: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204601.09785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204601.09787: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204601.09789: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204601.09791: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44071 1727204601.09793: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204601.09795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204601.09813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204601.09831: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204601.09952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204601.41940: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", 
"RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4505600", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3534811136", "CPUUsageNSec": "1433891000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitC<<< 44071 
1727204601.41953: stdout chunk (state=3): >>>ORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket 
system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext":<<< 44071 1727204601.41971: stdout chunk (state=3): >>> "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 44071 1727204601.43843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204601.43901: stderr chunk (state=3): >>><<< 44071 1727204601.43905: stdout chunk (state=3): >>><<< 44071 1727204601.43922: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4505600", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3534811136", "CPUUsageNSec": "1433891000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": 
"infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204601.44067: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204600.412777-44665-186824594019364/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204601.44074: _low_level_execute_command(): starting 44071 1727204601.44080: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204600.412777-44665-186824594019364/ > /dev/null 2>&1 && sleep 0' 44071 1727204601.44554: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204601.44558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204601.44591: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204601.44594: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204601.44597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204601.44645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204601.44656: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 44071 1727204601.44735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204601.46634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204601.46698: stderr chunk (state=3): >>><<< 44071 1727204601.46701: stdout chunk (state=3): >>><<< 44071 1727204601.46713: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204601.46721: handler run complete 44071 1727204601.46770: attempt loop complete, returning result 44071 1727204601.46773: _execute() done 44071 1727204601.46776: dumping result to json 44071 1727204601.46793: done dumping result, returning 44071 1727204601.46802: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-c964-7471-000000000218] 44071 1727204601.46807: sending task result for task 127b8e07-fff9-c964-7471-000000000218 44071 1727204601.47067: done sending task result for task 127b8e07-fff9-c964-7471-000000000218 44071 1727204601.47070: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204601.47122: no more pending results, returning what we have 44071 1727204601.47125: results queue empty 44071 1727204601.47126: checking for any_errors_fatal 44071 1727204601.47133: done checking for any_errors_fatal 44071 1727204601.47134: checking for max_fail_percentage 44071 1727204601.47135: done checking for max_fail_percentage 44071 1727204601.47136: checking to see if all hosts have failed and the running result is not ok 44071 1727204601.47137: done checking to see if all hosts have failed 44071 1727204601.47137: getting the remaining hosts for this loop 44071 1727204601.47139: done getting the remaining hosts for this loop 44071 1727204601.47144: getting the next task for host managed-node2 44071 1727204601.47151: done getting next task for host managed-node2 44071 1727204601.47155: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204601.47160: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204601.47177: getting variables 44071 1727204601.47178: in VariableManager get_vars() 44071 1727204601.47212: Calling all_inventory to load vars for managed-node2 44071 1727204601.47214: Calling groups_inventory to load vars for managed-node2 44071 1727204601.47216: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204601.47228: Calling all_plugins_play to load vars for managed-node2 44071 1727204601.47231: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204601.47233: Calling groups_plugins_play to load vars for managed-node2 44071 1727204601.48363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204601.49549: done with get_vars() 44071 1727204601.49581: done getting variables 44071 1727204601.49631: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:03:21 -0400 (0:00:01.337) 0:00:13.813 ***** 44071 1727204601.49665: entering _queue_task() for managed-node2/service 44071 1727204601.49945: worker is 1 (out of 1 available) 44071 1727204601.49960: exiting _queue_task() for managed-node2/service 44071 1727204601.49975: done queuing things up, now waiting for results queue to drain 44071 1727204601.49977: waiting for pending results... 
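
For readers following the trace: the "Enable and start NetworkManager" entries above show Ansible's per-task remote execution lifecycle in full - discover the remote home directory (echo ~), create a timestamped temp directory under ~/.ansible/tmp, transfer the generated AnsiballZ_systemd.py payload over sftp, mark it executable, run it with /usr/bin/python3.12 (the large JSON blob on stdout is the systemd module's result), then delete the temp directory. Below is a minimal Python sketch of that same sequence for orientation only; the host alias, the payload path, and the use of scp instead of the sftp subsystem are placeholders and assumptions, not what Ansible literally does internally.

# Sketch only: mirrors the _low_level_execute_command() steps logged above.
# HOST and PAYLOAD are hypothetical; the real run used ControlPersist
# multiplexing and the sftp subsystem rather than plain scp.
import shlex
import subprocess
import time

HOST = "managed-node2"            # placeholder ssh alias
PAYLOAD = "AnsiballZ_systemd.py"  # locally built module payload

def ssh(cmd: str) -> str:
    # Equivalent of _low_level_execute_command(): run "/bin/sh -c '<cmd>'" remotely.
    result = subprocess.run(
        ["ssh", HOST, "/bin/sh -c " + shlex.quote(cmd)],
        check=True, capture_output=True, text=True,
    )
    return result.stdout.strip()

home = ssh("echo ~ && sleep 0")                                   # -> /root in the log
tmp = "%s/.ansible/tmp/ansible-tmp-%f" % (home, time.time())      # per-task temp dir
ssh('umask 77 && mkdir -p "%s" && sleep 0' % tmp)
subprocess.run(["scp", PAYLOAD, "%s:%s/" % (HOST, tmp)], check=True)
ssh('chmod u+x "%s/" "%s/%s" && sleep 0' % (tmp, tmp, PAYLOAD))
module_json = ssh("/usr/bin/python3.12 %s/%s && sleep 0" % (tmp, PAYLOAD))
ssh('rm -f -r "%s/" > /dev/null 2>&1 && sleep 0' % tmp)
print(module_json)  # e.g. {"name": "NetworkManager", "changed": false, ...}
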
44071 1727204601.50181: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204601.50288: in run() - task 127b8e07-fff9-c964-7471-000000000219 44071 1727204601.50303: variable 'ansible_search_path' from source: unknown 44071 1727204601.50308: variable 'ansible_search_path' from source: unknown 44071 1727204601.50342: calling self._execute() 44071 1727204601.50421: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204601.50425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204601.50437: variable 'omit' from source: magic vars 44071 1727204601.50751: variable 'ansible_distribution_major_version' from source: facts 44071 1727204601.50765: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204601.50855: variable 'network_provider' from source: set_fact 44071 1727204601.50861: Evaluated conditional (network_provider == "nm"): True 44071 1727204601.50932: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204601.51002: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204601.51134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204601.52798: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204601.52855: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204601.52886: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204601.52916: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204601.52940: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204601.53016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204601.53041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204601.53064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204601.53094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204601.53105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204601.53147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204601.53167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 44071 1727204601.53186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204601.53213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204601.53225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204601.53262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204601.53282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204601.53300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204601.53326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204601.53337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204601.53449: variable 'network_connections' from source: include params 44071 1727204601.53461: variable 'interface' from source: play vars 44071 1727204601.53524: variable 'interface' from source: play vars 44071 1727204601.53587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204601.53709: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204601.53738: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204601.53763: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204601.53789: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204601.53826: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204601.53844: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204601.53864: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204601.53884: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
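
The "Enable and start wpa_supplicant" task being prepared here, and the two tasks after it ("Enable network service", "Ensure initscripts network file dependency is present"), are gated on the same pattern: the role resolves its default booleans and skips the task when the conditional is false. In this run ansible_distribution_major_version != '6' and network_provider == "nm" both evaluate True, but __network_wpa_supplicant_required evaluates False, so the task is skipped. A rough Python rendering of that gate follows; deriving the "required" flag from the 802.1X/wireless lookups is an assumption inferred from the role-default variables the log resolves (the trace does not spell out the expression), and the major-version value is a placeholder.

# Sketch of the conditional gating shown in the entries that follow.
# Values mirror this run; how wpa_supplicant_required is derived from the
# 802.1X / wireless checks is an assumption about the role defaults.
ansible_distribution_major_version = "40"   # placeholder; real value comes from facts
network_provider = "nm"                     # from set_fact earlier in the play
ieee802_1x_connections_defined = False      # __network_ieee802_1x_connections_defined
wireless_connections_defined = False        # __network_wireless_connections_defined

wpa_supplicant_required = (
    ieee802_1x_connections_defined or wireless_connections_defined
)

run_wpa_supplicant_task = (
    ansible_distribution_major_version != "6"   # Evaluated conditional (... != '6'): True
    and network_provider == "nm"                # Evaluated conditional (... == "nm"): True
    and wpa_supplicant_required                 # Evaluated conditional (...): False
)
print(run_wpa_supplicant_task)  # False -> the "skipping: [managed-node2]" result below
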
44071 1727204601.53925: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204601.54112: variable 'network_connections' from source: include params 44071 1727204601.54117: variable 'interface' from source: play vars 44071 1727204601.54170: variable 'interface' from source: play vars 44071 1727204601.54201: Evaluated conditional (__network_wpa_supplicant_required): False 44071 1727204601.54204: when evaluation is False, skipping this task 44071 1727204601.54207: _execute() done 44071 1727204601.54209: dumping result to json 44071 1727204601.54214: done dumping result, returning 44071 1727204601.54222: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-c964-7471-000000000219] 44071 1727204601.54235: sending task result for task 127b8e07-fff9-c964-7471-000000000219 44071 1727204601.54326: done sending task result for task 127b8e07-fff9-c964-7471-000000000219 44071 1727204601.54329: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 44071 1727204601.54395: no more pending results, returning what we have 44071 1727204601.54398: results queue empty 44071 1727204601.54399: checking for any_errors_fatal 44071 1727204601.54430: done checking for any_errors_fatal 44071 1727204601.54431: checking for max_fail_percentage 44071 1727204601.54433: done checking for max_fail_percentage 44071 1727204601.54434: checking to see if all hosts have failed and the running result is not ok 44071 1727204601.54434: done checking to see if all hosts have failed 44071 1727204601.54435: getting the remaining hosts for this loop 44071 1727204601.54437: done getting the remaining hosts for this loop 44071 1727204601.54444: getting the next task for host managed-node2 44071 1727204601.54452: done getting next task for host managed-node2 44071 1727204601.54456: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204601.54461: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204601.54478: getting variables 44071 1727204601.54480: in VariableManager get_vars() 44071 1727204601.54513: Calling all_inventory to load vars for managed-node2 44071 1727204601.54516: Calling groups_inventory to load vars for managed-node2 44071 1727204601.54518: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204601.54528: Calling all_plugins_play to load vars for managed-node2 44071 1727204601.54530: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204601.54533: Calling groups_plugins_play to load vars for managed-node2 44071 1727204601.55546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204601.56746: done with get_vars() 44071 1727204601.56777: done getting variables 44071 1727204601.56829: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:03:21 -0400 (0:00:00.071) 0:00:13.885 ***** 44071 1727204601.56859: entering _queue_task() for managed-node2/service 44071 1727204601.57133: worker is 1 (out of 1 available) 44071 1727204601.57149: exiting _queue_task() for managed-node2/service 44071 1727204601.57163: done queuing things up, now waiting for results queue to drain 44071 1727204601.57167: waiting for pending results... 44071 1727204601.57363: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204601.57461: in run() - task 127b8e07-fff9-c964-7471-00000000021a 44071 1727204601.57477: variable 'ansible_search_path' from source: unknown 44071 1727204601.57481: variable 'ansible_search_path' from source: unknown 44071 1727204601.57612: calling self._execute() 44071 1727204601.57618: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204601.57623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204601.57627: variable 'omit' from source: magic vars 44071 1727204601.57919: variable 'ansible_distribution_major_version' from source: facts 44071 1727204601.57931: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204601.58019: variable 'network_provider' from source: set_fact 44071 1727204601.58024: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204601.58027: when evaluation is False, skipping this task 44071 1727204601.58030: _execute() done 44071 1727204601.58035: dumping result to json 44071 1727204601.58038: done dumping result, returning 44071 1727204601.58049: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-c964-7471-00000000021a] 44071 1727204601.58054: sending task result for task 127b8e07-fff9-c964-7471-00000000021a 44071 1727204601.58155: done sending task result for task 127b8e07-fff9-c964-7471-00000000021a 44071 1727204601.58157: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 
1727204601.58206: no more pending results, returning what we have 44071 1727204601.58210: results queue empty 44071 1727204601.58211: checking for any_errors_fatal 44071 1727204601.58219: done checking for any_errors_fatal 44071 1727204601.58220: checking for max_fail_percentage 44071 1727204601.58222: done checking for max_fail_percentage 44071 1727204601.58222: checking to see if all hosts have failed and the running result is not ok 44071 1727204601.58223: done checking to see if all hosts have failed 44071 1727204601.58224: getting the remaining hosts for this loop 44071 1727204601.58225: done getting the remaining hosts for this loop 44071 1727204601.58230: getting the next task for host managed-node2 44071 1727204601.58239: done getting next task for host managed-node2 44071 1727204601.58244: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204601.58249: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204601.58268: getting variables 44071 1727204601.58270: in VariableManager get_vars() 44071 1727204601.58306: Calling all_inventory to load vars for managed-node2 44071 1727204601.58309: Calling groups_inventory to load vars for managed-node2 44071 1727204601.58311: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204601.58321: Calling all_plugins_play to load vars for managed-node2 44071 1727204601.58324: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204601.58326: Calling groups_plugins_play to load vars for managed-node2 44071 1727204601.59451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204601.60633: done with get_vars() 44071 1727204601.60664: done getting variables 44071 1727204601.60715: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:03:21 -0400 (0:00:00.038) 0:00:13.923 ***** 44071 1727204601.60742: entering _queue_task() for managed-node2/copy 44071 1727204601.61017: worker is 1 (out of 1 available) 44071 1727204601.61033: exiting _queue_task() for managed-node2/copy 44071 1727204601.61046: done queuing things up, now waiting for results queue to drain 44071 1727204601.61048: waiting for pending results... 44071 1727204601.61244: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204601.61342: in run() - task 127b8e07-fff9-c964-7471-00000000021b 44071 1727204601.61358: variable 'ansible_search_path' from source: unknown 44071 1727204601.61362: variable 'ansible_search_path' from source: unknown 44071 1727204601.61400: calling self._execute() 44071 1727204601.61483: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204601.61489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204601.61502: variable 'omit' from source: magic vars 44071 1727204601.61804: variable 'ansible_distribution_major_version' from source: facts 44071 1727204601.61815: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204601.61905: variable 'network_provider' from source: set_fact 44071 1727204601.61910: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204601.61914: when evaluation is False, skipping this task 44071 1727204601.61917: _execute() done 44071 1727204601.61919: dumping result to json 44071 1727204601.61926: done dumping result, returning 44071 1727204601.61938: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-c964-7471-00000000021b] 44071 1727204601.61941: sending task result for task 127b8e07-fff9-c964-7471-00000000021b 44071 1727204601.62045: done sending task result for task 127b8e07-fff9-c964-7471-00000000021b 44071 1727204601.62048: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 44071 1727204601.62100: no more pending results, returning what we have 44071 1727204601.62104: results queue empty 44071 1727204601.62105: checking for any_errors_fatal 44071 1727204601.62113: done checking for any_errors_fatal 44071 1727204601.62114: checking for max_fail_percentage 44071 1727204601.62115: done checking for max_fail_percentage 44071 1727204601.62116: checking to see if all hosts have failed and the running result is not ok 44071 1727204601.62116: done checking to see if all hosts have failed 44071 1727204601.62117: getting the remaining hosts for this loop 44071 1727204601.62119: done getting the remaining hosts for this loop 44071 1727204601.62124: getting the next task for host managed-node2 44071 1727204601.62133: done getting next task for host managed-node2 44071 1727204601.62136: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204601.62142: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204601.62160: getting variables 44071 1727204601.62161: in VariableManager get_vars() 44071 1727204601.62199: Calling all_inventory to load vars for managed-node2 44071 1727204601.62201: Calling groups_inventory to load vars for managed-node2 44071 1727204601.62203: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204601.62213: Calling all_plugins_play to load vars for managed-node2 44071 1727204601.62215: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204601.62217: Calling groups_plugins_play to load vars for managed-node2 44071 1727204601.63322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204601.64519: done with get_vars() 44071 1727204601.64547: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:03:21 -0400 (0:00:00.038) 0:00:13.962 ***** 44071 1727204601.64621: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204601.64622: Creating lock for fedora.linux_system_roles.network_connections 44071 1727204601.64906: worker is 1 (out of 1 available) 44071 1727204601.64921: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204601.64935: done queuing things up, now waiting for results queue to drain 44071 1727204601.64936: waiting for pending results... 44071 1727204601.65131: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204601.65239: in run() - task 127b8e07-fff9-c964-7471-00000000021c 44071 1727204601.65255: variable 'ansible_search_path' from source: unknown 44071 1727204601.65259: variable 'ansible_search_path' from source: unknown 44071 1727204601.65297: calling self._execute() 44071 1727204601.65379: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204601.65384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204601.65397: variable 'omit' from source: magic vars 44071 1727204601.65704: variable 'ansible_distribution_major_version' from source: facts 44071 1727204601.65717: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204601.65721: variable 'omit' from source: magic vars 44071 1727204601.65774: variable 'omit' from source: magic vars 44071 1727204601.65901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204601.67553: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204601.67610: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204601.67639: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204601.67671: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204601.67696: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204601.67763: variable 'network_provider' from source: set_fact 44071 1727204601.67873: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204601.67896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204601.67918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204601.67950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204601.67961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204601.68026: variable 'omit' from source: magic vars 44071 1727204601.68107: variable 'omit' from source: magic vars 44071 1727204601.68202: variable 'network_connections' from source: include params 44071 1727204601.68213: variable 'interface' from source: play vars 44071 1727204601.68271: variable 'interface' from source: play vars 44071 1727204601.68390: variable 'omit' from source: magic vars 44071 1727204601.68398: variable '__lsr_ansible_managed' from source: task vars 44071 1727204601.68446: variable '__lsr_ansible_managed' from source: task vars 44071 1727204601.68589: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 44071 1727204601.68759: Loaded config def from plugin (lookup/template) 44071 1727204601.68763: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 44071 1727204601.68791: File lookup term: get_ansible_managed.j2 44071 1727204601.68794: variable 'ansible_search_path' from source: unknown 44071 1727204601.68798: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 44071 1727204601.68811: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 44071 1727204601.68826: variable 'ansible_search_path' from source: unknown 44071 1727204601.73267: variable 'ansible_managed' from source: unknown 44071 1727204601.73386: variable 'omit' from 
source: magic vars 44071 1727204601.73411: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204601.73433: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204601.73452: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204601.73475: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204601.73484: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204601.73508: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204601.73511: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204601.73514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204601.73592: Set connection var ansible_connection to ssh 44071 1727204601.73598: Set connection var ansible_timeout to 10 44071 1727204601.73604: Set connection var ansible_pipelining to False 44071 1727204601.73609: Set connection var ansible_shell_type to sh 44071 1727204601.73615: Set connection var ansible_shell_executable to /bin/sh 44071 1727204601.73622: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204601.73641: variable 'ansible_shell_executable' from source: unknown 44071 1727204601.73649: variable 'ansible_connection' from source: unknown 44071 1727204601.73652: variable 'ansible_module_compression' from source: unknown 44071 1727204601.73656: variable 'ansible_shell_type' from source: unknown 44071 1727204601.73659: variable 'ansible_shell_executable' from source: unknown 44071 1727204601.73662: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204601.73664: variable 'ansible_pipelining' from source: unknown 44071 1727204601.73678: variable 'ansible_timeout' from source: unknown 44071 1727204601.73681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204601.73784: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204601.73799: variable 'omit' from source: magic vars 44071 1727204601.73802: starting attempt loop 44071 1727204601.73805: running the handler 44071 1727204601.73807: _low_level_execute_command(): starting 44071 1727204601.73815: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204601.74373: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204601.74378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204601.74381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204601.74384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204601.74433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204601.74437: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204601.74439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204601.74542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204601.76310: stdout chunk (state=3): >>>/root <<< 44071 1727204601.76424: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204601.76479: stderr chunk (state=3): >>><<< 44071 1727204601.76483: stdout chunk (state=3): >>><<< 44071 1727204601.76502: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204601.76518: _low_level_execute_command(): starting 44071 1727204601.76524: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204601.7650313-44716-81792968269260 `" && echo ansible-tmp-1727204601.7650313-44716-81792968269260="` echo /root/.ansible/tmp/ansible-tmp-1727204601.7650313-44716-81792968269260 `" ) && sleep 0' 44071 1727204601.77019: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204601.77023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204601.77025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44071 1727204601.77028: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204601.77030: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204601.77085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204601.77089: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204601.77170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204601.79149: stdout chunk (state=3): >>>ansible-tmp-1727204601.7650313-44716-81792968269260=/root/.ansible/tmp/ansible-tmp-1727204601.7650313-44716-81792968269260 <<< 44071 1727204601.79262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204601.79328: stderr chunk (state=3): >>><<< 44071 1727204601.79332: stdout chunk (state=3): >>><<< 44071 1727204601.79349: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204601.7650313-44716-81792968269260=/root/.ansible/tmp/ansible-tmp-1727204601.7650313-44716-81792968269260 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204601.79393: variable 'ansible_module_compression' from source: unknown 44071 1727204601.79437: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 44071 1727204601.79444: ANSIBALLZ: Acquiring lock 44071 1727204601.79447: ANSIBALLZ: Lock acquired: 140077507723584 44071 1727204601.79449: ANSIBALLZ: Creating module 44071 1727204601.98283: ANSIBALLZ: Writing module into payload 44071 1727204601.98521: ANSIBALLZ: Writing module 44071 1727204601.98546: ANSIBALLZ: Renaming module 44071 1727204601.98553: ANSIBALLZ: Done creating module 44071 1727204601.98576: variable 'ansible_facts' from source: unknown 44071 1727204601.98647: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204601.7650313-44716-81792968269260/AnsiballZ_network_connections.py 44071 1727204601.98764: Sending initial data 44071 1727204601.98771: Sent initial data (167 bytes) 44071 1727204601.99274: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204601.99283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204601.99285: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204601.99288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204601.99339: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204601.99342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204601.99345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204601.99426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204602.01166: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204602.01230: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204602.01296: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpbmrhp86u /root/.ansible/tmp/ansible-tmp-1727204601.7650313-44716-81792968269260/AnsiballZ_network_connections.py <<< 44071 1727204602.01305: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204601.7650313-44716-81792968269260/AnsiballZ_network_connections.py" <<< 44071 1727204602.01367: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpbmrhp86u" to remote "/root/.ansible/tmp/ansible-tmp-1727204601.7650313-44716-81792968269260/AnsiballZ_network_connections.py" <<< 44071 1727204602.01374: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204601.7650313-44716-81792968269260/AnsiballZ_network_connections.py" <<< 44071 1727204602.02214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204602.02294: stderr chunk (state=3): >>><<< 44071 1727204602.02298: stdout chunk (state=3): >>><<< 44071 1727204602.02318: done transferring module to remote 44071 1727204602.02335: _low_level_execute_command(): starting 44071 1727204602.02339: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204601.7650313-44716-81792968269260/ /root/.ansible/tmp/ansible-tmp-1727204601.7650313-44716-81792968269260/AnsiballZ_network_connections.py && sleep 0' 44071 1727204602.02825: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204602.02828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204602.02831: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204602.02833: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204602.02835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204602.02887: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204602.02891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204602.02969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204602.04825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204602.04886: stderr chunk (state=3): >>><<< 44071 1727204602.04890: stdout chunk (state=3): >>><<< 44071 1727204602.04904: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204602.04907: _low_level_execute_command(): starting 44071 1727204602.04913: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204601.7650313-44716-81792968269260/AnsiballZ_network_connections.py && sleep 0' 44071 1727204602.05400: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204602.05404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204602.05407: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204602.05409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204602.05468: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204602.05476: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204602.05556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204602.35503: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, f7d54dee-54f0-42d3-8296-dcee7d3104de\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# 
system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 44071 1727204602.37530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204602.37535: stdout chunk (state=3): >>><<< 44071 1727204602.37540: stderr chunk (state=3): >>><<< 44071 1727204602.37713: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, f7d54dee-54f0-42d3-8296-dcee7d3104de\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
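For orientation, the module_args recorded in the result above map back to a role invocation roughly like the following. This is a minimal sketch reconstructed from the logged arguments; the play layout, and the use of the interface play var as the connection name, are assumptions based only on the variable sources the log reports (play vars / include params).

# Sketch only -- reconstructed from the logged module_args, not the test's actual playbook.
- hosts: managed-node2
  vars:
    interface: statebr                 # log: "variable 'interface' from source: play vars"
  tasks:
    - name: Configure the statebr bridge profile via the network role
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.network
      vars:
        network_connections:           # log: "variable 'network_connections' from source: include params"
          - name: "{{ interface }}"
            persistent_state: present
            type: bridge
            ip:
              dhcp4: false
              auto6: false
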
44071 1727204602.37718: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204601.7650313-44716-81792968269260/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204602.37720: _low_level_execute_command(): starting 44071 1727204602.37724: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204601.7650313-44716-81792968269260/ > /dev/null 2>&1 && sleep 0' 44071 1727204602.38297: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204602.38313: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204602.38328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204602.38345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204602.38449: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204602.38479: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204602.38497: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204602.38704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204602.42530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204602.42541: stdout chunk (state=3): >>><<< 44071 1727204602.42553: stderr chunk (state=3): >>><<< 44071 1727204602.42584: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204602.42600: handler run complete 44071 1727204602.42638: attempt loop complete, returning result 44071 1727204602.42645: _execute() done 44071 1727204602.42653: dumping result to json 44071 1727204602.42662: done dumping result, returning 44071 1727204602.42688: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-c964-7471-00000000021c] 44071 1727204602.42703: sending task result for task 127b8e07-fff9-c964-7471-00000000021c changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, f7d54dee-54f0-42d3-8296-dcee7d3104de 44071 1727204602.42951: no more pending results, returning what we have 44071 1727204602.42955: results queue empty 44071 1727204602.42956: checking for any_errors_fatal 44071 1727204602.42964: done checking for any_errors_fatal 44071 1727204602.42967: checking for max_fail_percentage 44071 1727204602.42968: done checking for max_fail_percentage 44071 1727204602.42969: checking to see if all hosts have failed and the running result is not ok 44071 1727204602.42970: done checking to see if all hosts have failed 44071 1727204602.42971: getting the remaining hosts for this loop 44071 1727204602.42972: done getting the remaining hosts for this loop 44071 1727204602.42977: getting the next task for host managed-node2 44071 1727204602.42987: done getting next task for host managed-node2 44071 1727204602.42992: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204602.42997: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204602.43010: getting variables 44071 1727204602.43012: in VariableManager get_vars() 44071 1727204602.43049: Calling all_inventory to load vars for managed-node2 44071 1727204602.43052: Calling groups_inventory to load vars for managed-node2 44071 1727204602.43054: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204602.43275: Calling all_plugins_play to load vars for managed-node2 44071 1727204602.43281: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204602.43288: done sending task result for task 127b8e07-fff9-c964-7471-00000000021c 44071 1727204602.43290: WORKER PROCESS EXITING 44071 1727204602.43295: Calling groups_plugins_play to load vars for managed-node2 44071 1727204602.45458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204602.47816: done with get_vars() 44071 1727204602.47862: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:03:22 -0400 (0:00:00.833) 0:00:14.796 ***** 44071 1727204602.47971: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204602.47973: Creating lock for fedora.linux_system_roles.network_state 44071 1727204602.48392: worker is 1 (out of 1 available) 44071 1727204602.48408: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204602.48428: done queuing things up, now waiting for results queue to drain 44071 1727204602.48431: waiting for pending results... 
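The "Configure networking state" task queued above is evaluated against the role default network_state (an empty dict) and is skipped in the trace that follows. For contrast, a non-empty network_state would be an nmstate-style description roughly like the sketch below; the values are purely illustrative (this run never sets network_state) and the schema shown is an assumption, not something demonstrated in this log.

# Illustrative only -- this run leaves network_state at its default ({}).
network_state:
  interfaces:
    - name: statebr
      type: linux-bridge
      state: up
      ipv4:
        enabled: false
      ipv6:
        enabled: false
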
44071 1727204602.48686: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204602.48871: in run() - task 127b8e07-fff9-c964-7471-00000000021d 44071 1727204602.48898: variable 'ansible_search_path' from source: unknown 44071 1727204602.48914: variable 'ansible_search_path' from source: unknown 44071 1727204602.49025: calling self._execute() 44071 1727204602.49075: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204602.49091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204602.49108: variable 'omit' from source: magic vars 44071 1727204602.49557: variable 'ansible_distribution_major_version' from source: facts 44071 1727204602.49586: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204602.49712: variable 'network_state' from source: role '' defaults 44071 1727204602.49728: Evaluated conditional (network_state != {}): False 44071 1727204602.49734: when evaluation is False, skipping this task 44071 1727204602.49740: _execute() done 44071 1727204602.49747: dumping result to json 44071 1727204602.49754: done dumping result, returning 44071 1727204602.49784: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-c964-7471-00000000021d] 44071 1727204602.49791: sending task result for task 127b8e07-fff9-c964-7471-00000000021d skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204602.49961: no more pending results, returning what we have 44071 1727204602.49966: results queue empty 44071 1727204602.49968: checking for any_errors_fatal 44071 1727204602.49979: done checking for any_errors_fatal 44071 1727204602.49980: checking for max_fail_percentage 44071 1727204602.49982: done checking for max_fail_percentage 44071 1727204602.49983: checking to see if all hosts have failed and the running result is not ok 44071 1727204602.49983: done checking to see if all hosts have failed 44071 1727204602.49984: getting the remaining hosts for this loop 44071 1727204602.49985: done getting the remaining hosts for this loop 44071 1727204602.49990: getting the next task for host managed-node2 44071 1727204602.50000: done getting next task for host managed-node2 44071 1727204602.50004: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204602.50011: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204602.50029: getting variables 44071 1727204602.50031: in VariableManager get_vars() 44071 1727204602.50287: Calling all_inventory to load vars for managed-node2 44071 1727204602.50291: Calling groups_inventory to load vars for managed-node2 44071 1727204602.50294: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204602.50309: Calling all_plugins_play to load vars for managed-node2 44071 1727204602.50313: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204602.50317: Calling groups_plugins_play to load vars for managed-node2 44071 1727204602.51069: done sending task result for task 127b8e07-fff9-c964-7471-00000000021d 44071 1727204602.51074: WORKER PROCESS EXITING 44071 1727204602.52348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204602.54597: done with get_vars() 44071 1727204602.54734: done getting variables 44071 1727204602.54816: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:03:22 -0400 (0:00:00.068) 0:00:14.865 ***** 44071 1727204602.54863: entering _queue_task() for managed-node2/debug 44071 1727204602.55294: worker is 1 (out of 1 available) 44071 1727204602.55310: exiting _queue_task() for managed-node2/debug 44071 1727204602.55326: done queuing things up, now waiting for results queue to drain 44071 1727204602.55328: waiting for pending results... 
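The debug task queued above, and the companion "Show debug messages for the network_connections" task that follows, print fields of the __network_connections_result fact (the log shows it coming from set_fact). The ok: output further down matches what var-style debug tasks of roughly this shape produce; this is the typical pattern, not necessarily the role's verbatim source.

# Sketch of the typical debug pattern behind the ok: output below.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines
- name: Show debug messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result
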
44071 1727204602.55612: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204602.55787: in run() - task 127b8e07-fff9-c964-7471-00000000021e 44071 1727204602.55812: variable 'ansible_search_path' from source: unknown 44071 1727204602.55821: variable 'ansible_search_path' from source: unknown 44071 1727204602.55876: calling self._execute() 44071 1727204602.55986: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204602.56001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204602.56018: variable 'omit' from source: magic vars 44071 1727204602.56455: variable 'ansible_distribution_major_version' from source: facts 44071 1727204602.56480: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204602.56492: variable 'omit' from source: magic vars 44071 1727204602.56564: variable 'omit' from source: magic vars 44071 1727204602.56619: variable 'omit' from source: magic vars 44071 1727204602.56674: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204602.56724: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204602.56753: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204602.56783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204602.56802: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204602.56870: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204602.56874: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204602.56877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204602.56980: Set connection var ansible_connection to ssh 44071 1727204602.56993: Set connection var ansible_timeout to 10 44071 1727204602.57005: Set connection var ansible_pipelining to False 44071 1727204602.57046: Set connection var ansible_shell_type to sh 44071 1727204602.57050: Set connection var ansible_shell_executable to /bin/sh 44071 1727204602.57052: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204602.57074: variable 'ansible_shell_executable' from source: unknown 44071 1727204602.57083: variable 'ansible_connection' from source: unknown 44071 1727204602.57092: variable 'ansible_module_compression' from source: unknown 44071 1727204602.57170: variable 'ansible_shell_type' from source: unknown 44071 1727204602.57174: variable 'ansible_shell_executable' from source: unknown 44071 1727204602.57176: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204602.57179: variable 'ansible_pipelining' from source: unknown 44071 1727204602.57181: variable 'ansible_timeout' from source: unknown 44071 1727204602.57184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204602.57304: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 
1727204602.57322: variable 'omit' from source: magic vars 44071 1727204602.57333: starting attempt loop 44071 1727204602.57341: running the handler 44071 1727204602.57496: variable '__network_connections_result' from source: set_fact 44071 1727204602.57627: handler run complete 44071 1727204602.57630: attempt loop complete, returning result 44071 1727204602.57633: _execute() done 44071 1727204602.57636: dumping result to json 44071 1727204602.57638: done dumping result, returning 44071 1727204602.57641: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-c964-7471-00000000021e] 44071 1727204602.57644: sending task result for task 127b8e07-fff9-c964-7471-00000000021e 44071 1727204602.57892: done sending task result for task 127b8e07-fff9-c964-7471-00000000021e 44071 1727204602.57897: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, f7d54dee-54f0-42d3-8296-dcee7d3104de" ] } 44071 1727204602.58082: no more pending results, returning what we have 44071 1727204602.58086: results queue empty 44071 1727204602.58086: checking for any_errors_fatal 44071 1727204602.58092: done checking for any_errors_fatal 44071 1727204602.58093: checking for max_fail_percentage 44071 1727204602.58094: done checking for max_fail_percentage 44071 1727204602.58095: checking to see if all hosts have failed and the running result is not ok 44071 1727204602.58096: done checking to see if all hosts have failed 44071 1727204602.58097: getting the remaining hosts for this loop 44071 1727204602.58098: done getting the remaining hosts for this loop 44071 1727204602.58102: getting the next task for host managed-node2 44071 1727204602.58109: done getting next task for host managed-node2 44071 1727204602.58113: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204602.58118: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204602.58129: getting variables 44071 1727204602.58131: in VariableManager get_vars() 44071 1727204602.58178: Calling all_inventory to load vars for managed-node2 44071 1727204602.58181: Calling groups_inventory to load vars for managed-node2 44071 1727204602.58183: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204602.58194: Calling all_plugins_play to load vars for managed-node2 44071 1727204602.58197: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204602.58200: Calling groups_plugins_play to load vars for managed-node2 44071 1727204602.59688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204602.60953: done with get_vars() 44071 1727204602.60991: done getting variables 44071 1727204602.61060: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:03:22 -0400 (0:00:00.062) 0:00:14.927 ***** 44071 1727204602.61107: entering _queue_task() for managed-node2/debug 44071 1727204602.61570: worker is 1 (out of 1 available) 44071 1727204602.61586: exiting _queue_task() for managed-node2/debug 44071 1727204602.61602: done queuing things up, now waiting for results queue to drain 44071 1727204602.61603: waiting for pending results... 44071 1727204602.61887: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204602.62043: in run() - task 127b8e07-fff9-c964-7471-00000000021f 44071 1727204602.62065: variable 'ansible_search_path' from source: unknown 44071 1727204602.62069: variable 'ansible_search_path' from source: unknown 44071 1727204602.62109: calling self._execute() 44071 1727204602.62199: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204602.62206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204602.62216: variable 'omit' from source: magic vars 44071 1727204602.62532: variable 'ansible_distribution_major_version' from source: facts 44071 1727204602.62547: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204602.62550: variable 'omit' from source: magic vars 44071 1727204602.62600: variable 'omit' from source: magic vars 44071 1727204602.62631: variable 'omit' from source: magic vars 44071 1727204602.62674: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204602.62702: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204602.62724: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204602.62740: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204602.62754: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204602.62782: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204602.62785: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204602.62788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204602.62870: Set connection var ansible_connection to ssh 44071 1727204602.62877: Set connection var ansible_timeout to 10 44071 1727204602.62882: Set connection var ansible_pipelining to False 44071 1727204602.62888: Set connection var ansible_shell_type to sh 44071 1727204602.62894: Set connection var ansible_shell_executable to /bin/sh 44071 1727204602.62901: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204602.62920: variable 'ansible_shell_executable' from source: unknown 44071 1727204602.62923: variable 'ansible_connection' from source: unknown 44071 1727204602.62926: variable 'ansible_module_compression' from source: unknown 44071 1727204602.62930: variable 'ansible_shell_type' from source: unknown 44071 1727204602.62933: variable 'ansible_shell_executable' from source: unknown 44071 1727204602.62936: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204602.62938: variable 'ansible_pipelining' from source: unknown 44071 1727204602.62948: variable 'ansible_timeout' from source: unknown 44071 1727204602.62950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204602.63070: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204602.63079: variable 'omit' from source: magic vars 44071 1727204602.63085: starting attempt loop 44071 1727204602.63088: running the handler 44071 1727204602.63129: variable '__network_connections_result' from source: set_fact 44071 1727204602.63200: variable '__network_connections_result' from source: set_fact 44071 1727204602.63295: handler run complete 44071 1727204602.63317: attempt loop complete, returning result 44071 1727204602.63320: _execute() done 44071 1727204602.63322: dumping result to json 44071 1727204602.63325: done dumping result, returning 44071 1727204602.63334: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-c964-7471-00000000021f] 44071 1727204602.63337: sending task result for task 127b8e07-fff9-c964-7471-00000000021f 44071 1727204602.63440: done sending task result for task 127b8e07-fff9-c964-7471-00000000021f 44071 1727204602.63443: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, f7d54dee-54f0-42d3-8296-dcee7d3104de\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, f7d54dee-54f0-42d3-8296-dcee7d3104de" ] } } 44071 1727204602.63538: no more pending results, returning what we have 44071 
1727204602.63541: results queue empty 44071 1727204602.63542: checking for any_errors_fatal 44071 1727204602.63550: done checking for any_errors_fatal 44071 1727204602.63551: checking for max_fail_percentage 44071 1727204602.63559: done checking for max_fail_percentage 44071 1727204602.63560: checking to see if all hosts have failed and the running result is not ok 44071 1727204602.63561: done checking to see if all hosts have failed 44071 1727204602.63561: getting the remaining hosts for this loop 44071 1727204602.63563: done getting the remaining hosts for this loop 44071 1727204602.63570: getting the next task for host managed-node2 44071 1727204602.63578: done getting next task for host managed-node2 44071 1727204602.63582: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204602.63586: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204602.63597: getting variables 44071 1727204602.63598: in VariableManager get_vars() 44071 1727204602.63640: Calling all_inventory to load vars for managed-node2 44071 1727204602.63642: Calling groups_inventory to load vars for managed-node2 44071 1727204602.63644: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204602.63654: Calling all_plugins_play to load vars for managed-node2 44071 1727204602.63656: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204602.63659: Calling groups_plugins_play to load vars for managed-node2 44071 1727204602.65080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204602.66681: done with get_vars() 44071 1727204602.66712: done getting variables 44071 1727204602.66763: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:03:22 -0400 (0:00:00.056) 0:00:14.984 ***** 44071 1727204602.66795: entering _queue_task() for managed-node2/debug 44071 1727204602.67082: worker is 1 (out of 1 available) 44071 1727204602.67099: exiting _queue_task() for managed-node2/debug 44071 1727204602.67113: done queuing things up, now waiting for results queue to drain 44071 1727204602.67115: waiting for pending results... 44071 1727204602.67321: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204602.67426: in run() - task 127b8e07-fff9-c964-7471-000000000220 44071 1727204602.67440: variable 'ansible_search_path' from source: unknown 44071 1727204602.67447: variable 'ansible_search_path' from source: unknown 44071 1727204602.67485: calling self._execute() 44071 1727204602.67566: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204602.67583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204602.67587: variable 'omit' from source: magic vars 44071 1727204602.68076: variable 'ansible_distribution_major_version' from source: facts 44071 1727204602.68080: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204602.68232: variable 'network_state' from source: role '' defaults 44071 1727204602.68253: Evaluated conditional (network_state != {}): False 44071 1727204602.68288: when evaluation is False, skipping this task 44071 1727204602.68292: _execute() done 44071 1727204602.68299: dumping result to json 44071 1727204602.68303: done dumping result, returning 44071 1727204602.68306: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-c964-7471-000000000220] 44071 1727204602.68377: sending task result for task 127b8e07-fff9-c964-7471-000000000220 skipping: [managed-node2] => { "false_condition": "network_state != {}" } 44071 1727204602.68530: no more pending results, returning what we have 44071 1727204602.68535: results queue empty 44071 1727204602.68540: checking for any_errors_fatal 44071 1727204602.68552: done checking 
for any_errors_fatal 44071 1727204602.68552: checking for max_fail_percentage 44071 1727204602.68554: done checking for max_fail_percentage 44071 1727204602.68555: checking to see if all hosts have failed and the running result is not ok 44071 1727204602.68555: done checking to see if all hosts have failed 44071 1727204602.68556: getting the remaining hosts for this loop 44071 1727204602.68558: done getting the remaining hosts for this loop 44071 1727204602.68563: getting the next task for host managed-node2 44071 1727204602.68580: done getting next task for host managed-node2 44071 1727204602.68586: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204602.68592: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204602.68613: getting variables 44071 1727204602.68615: in VariableManager get_vars() 44071 1727204602.68659: Calling all_inventory to load vars for managed-node2 44071 1727204602.68662: Calling groups_inventory to load vars for managed-node2 44071 1727204602.68664: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204602.68681: Calling all_plugins_play to load vars for managed-node2 44071 1727204602.68684: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204602.68687: Calling groups_plugins_play to load vars for managed-node2 44071 1727204602.69356: done sending task result for task 127b8e07-fff9-c964-7471-000000000220 44071 1727204602.69360: WORKER PROCESS EXITING 44071 1727204602.70398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204602.71605: done with get_vars() 44071 1727204602.71634: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:03:22 -0400 (0:00:00.049) 0:00:15.033 ***** 44071 1727204602.71729: entering _queue_task() for managed-node2/ping 44071 1727204602.71730: Creating lock for ping 44071 1727204602.72125: worker is 1 (out of 1 available) 44071 1727204602.72141: exiting _queue_task() for managed-node2/ping 44071 1727204602.72154: done queuing things up, now waiting for results queue to drain 44071 1727204602.72155: waiting for pending results... 
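
The entries that follow show the "Re-test connectivity" task resolving to the `ping` action: the module is wrapped into an AnsiballZ payload, copied to a remote temporary directory over the existing SSH ControlMaster connection, executed with the remote `/usr/bin/python3.12`, and the temporary directory is removed afterwards. As a rough illustration of what the transferred `AnsiballZ_ping.py` ultimately runs on the managed node, here is a minimal sketch of a ping-style module built on `AnsibleModule`. This is not the shipped `ansible.builtin.ping` source, only an approximation consistent with the `module_args` (`{"data": "pong"}`) and the `{"ping": "pong"}` stdout seen below.

```python
#!/usr/bin/python
# Minimal sketch of a ping-style module (illustrative, not the shipped ansible.builtin.ping).
# The managed node unpacks the AnsiballZ payload and runs it with its own interpreter;
# the module replies on stdout with a single JSON document, as seen in the
# {"ping": "pong", "invocation": {...}} stdout chunk later in this log.

from ansible.module_utils.basic import AnsibleModule


def main():
    module = AnsibleModule(
        argument_spec=dict(
            # `data` defaults to "pong"; the logged module_args show {"data": "pong"}.
            data=dict(type='str', default='pong'),
        ),
        supports_check_mode=True,
    )
    # A connectivity check changes nothing on the host; just echo the payload back.
    module.exit_json(ping=module.params['data'])


if __name__ == '__main__':
    main()
```
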
44071 1727204602.72587: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204602.72643: in run() - task 127b8e07-fff9-c964-7471-000000000221 44071 1727204602.72671: variable 'ansible_search_path' from source: unknown 44071 1727204602.72681: variable 'ansible_search_path' from source: unknown 44071 1727204602.72733: calling self._execute() 44071 1727204602.72873: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204602.72885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204602.72899: variable 'omit' from source: magic vars 44071 1727204602.73272: variable 'ansible_distribution_major_version' from source: facts 44071 1727204602.73287: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204602.73292: variable 'omit' from source: magic vars 44071 1727204602.73343: variable 'omit' from source: magic vars 44071 1727204602.73371: variable 'omit' from source: magic vars 44071 1727204602.73410: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204602.73445: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204602.73465: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204602.73483: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204602.73497: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204602.73523: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204602.73527: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204602.73529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204602.73610: Set connection var ansible_connection to ssh 44071 1727204602.73614: Set connection var ansible_timeout to 10 44071 1727204602.73619: Set connection var ansible_pipelining to False 44071 1727204602.73626: Set connection var ansible_shell_type to sh 44071 1727204602.73632: Set connection var ansible_shell_executable to /bin/sh 44071 1727204602.73641: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204602.73660: variable 'ansible_shell_executable' from source: unknown 44071 1727204602.73663: variable 'ansible_connection' from source: unknown 44071 1727204602.73668: variable 'ansible_module_compression' from source: unknown 44071 1727204602.73670: variable 'ansible_shell_type' from source: unknown 44071 1727204602.73673: variable 'ansible_shell_executable' from source: unknown 44071 1727204602.73675: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204602.73680: variable 'ansible_pipelining' from source: unknown 44071 1727204602.73682: variable 'ansible_timeout' from source: unknown 44071 1727204602.73690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204602.73856: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204602.73866: variable 'omit' from source: magic vars 44071 
1727204602.73872: starting attempt loop 44071 1727204602.73875: running the handler 44071 1727204602.73888: _low_level_execute_command(): starting 44071 1727204602.73894: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204602.74464: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204602.74472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44071 1727204602.74476: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204602.74528: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204602.74531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204602.74534: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204602.74617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204602.76368: stdout chunk (state=3): >>>/root <<< 44071 1727204602.76476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204602.76543: stderr chunk (state=3): >>><<< 44071 1727204602.76546: stdout chunk (state=3): >>><<< 44071 1727204602.76568: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204602.76579: _low_level_execute_command(): starting 44071 1727204602.76585: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204602.7656658-44745-53536410835334 
`" && echo ansible-tmp-1727204602.7656658-44745-53536410835334="` echo /root/.ansible/tmp/ansible-tmp-1727204602.7656658-44745-53536410835334 `" ) && sleep 0' 44071 1727204602.77101: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204602.77105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204602.77109: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204602.77118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204602.77164: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204602.77179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204602.77182: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204602.77252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204602.79231: stdout chunk (state=3): >>>ansible-tmp-1727204602.7656658-44745-53536410835334=/root/.ansible/tmp/ansible-tmp-1727204602.7656658-44745-53536410835334 <<< 44071 1727204602.79350: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204602.79415: stderr chunk (state=3): >>><<< 44071 1727204602.79419: stdout chunk (state=3): >>><<< 44071 1727204602.79436: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204602.7656658-44745-53536410835334=/root/.ansible/tmp/ansible-tmp-1727204602.7656658-44745-53536410835334 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204602.79489: variable 'ansible_module_compression' from source: 
unknown 44071 1727204602.79525: ANSIBALLZ: Using lock for ping 44071 1727204602.79528: ANSIBALLZ: Acquiring lock 44071 1727204602.79533: ANSIBALLZ: Lock acquired: 140077507844416 44071 1727204602.79535: ANSIBALLZ: Creating module 44071 1727204602.88330: ANSIBALLZ: Writing module into payload 44071 1727204602.88384: ANSIBALLZ: Writing module 44071 1727204602.88403: ANSIBALLZ: Renaming module 44071 1727204602.88409: ANSIBALLZ: Done creating module 44071 1727204602.88427: variable 'ansible_facts' from source: unknown 44071 1727204602.88482: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204602.7656658-44745-53536410835334/AnsiballZ_ping.py 44071 1727204602.88601: Sending initial data 44071 1727204602.88605: Sent initial data (152 bytes) 44071 1727204602.89114: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204602.89120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204602.89123: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204602.89125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204602.89172: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204602.89178: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204602.89196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204602.89260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204602.90904: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204602.90975: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204602.91040: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpe3dzaaij /root/.ansible/tmp/ansible-tmp-1727204602.7656658-44745-53536410835334/AnsiballZ_ping.py <<< 44071 1727204602.91050: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204602.7656658-44745-53536410835334/AnsiballZ_ping.py" <<< 44071 1727204602.91113: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpe3dzaaij" to remote "/root/.ansible/tmp/ansible-tmp-1727204602.7656658-44745-53536410835334/AnsiballZ_ping.py" <<< 44071 1727204602.91116: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204602.7656658-44745-53536410835334/AnsiballZ_ping.py" <<< 44071 1727204602.91780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204602.91860: stderr chunk (state=3): >>><<< 44071 1727204602.91864: stdout chunk (state=3): >>><<< 44071 1727204602.91884: done transferring module to remote 44071 1727204602.91895: _low_level_execute_command(): starting 44071 1727204602.91900: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204602.7656658-44745-53536410835334/ /root/.ansible/tmp/ansible-tmp-1727204602.7656658-44745-53536410835334/AnsiballZ_ping.py && sleep 0' 44071 1727204602.92615: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204602.92634: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204602.92692: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204602.92718: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204602.92772: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204602.92974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204602.94780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204602.94840: stderr chunk (state=3): >>><<< 44071 1727204602.94844: stdout chunk (state=3): >>><<< 44071 1727204602.94860: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204602.94864: _low_level_execute_command(): starting 44071 1727204602.94870: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204602.7656658-44745-53536410835334/AnsiballZ_ping.py && sleep 0' 44071 1727204602.95607: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204602.95692: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204602.96016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204603.12572: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 44071 1727204603.13857: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204603.13918: stderr chunk (state=3): >>><<< 44071 1727204603.13922: stdout chunk (state=3): >>><<< 44071 1727204603.13940: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204603.13963: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204602.7656658-44745-53536410835334/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204603.13978: _low_level_execute_command(): starting 44071 1727204603.13982: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204602.7656658-44745-53536410835334/ > /dev/null 2>&1 && sleep 0' 44071 1727204603.14679: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204603.14723: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204603.14773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204603.14836: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204603.14853: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204603.14881: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204603.14982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204603.16900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204603.16957: stderr chunk (state=3): >>><<< 44071 1727204603.16960: stdout chunk (state=3): >>><<< 44071 1727204603.16977: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204603.16986: handler run complete 44071 1727204603.17000: attempt loop complete, returning result 44071 1727204603.17008: _execute() done 44071 1727204603.17010: dumping result to json 44071 1727204603.17016: done dumping result, returning 44071 1727204603.17028: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-c964-7471-000000000221] 44071 1727204603.17031: sending task result for task 127b8e07-fff9-c964-7471-000000000221 44071 1727204603.17145: done sending task result for task 127b8e07-fff9-c964-7471-000000000221 44071 1727204603.17148: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 44071 1727204603.17219: no more pending results, returning what we have 44071 1727204603.17224: results queue empty 44071 1727204603.17225: checking for any_errors_fatal 44071 1727204603.17233: done checking for any_errors_fatal 44071 1727204603.17233: checking for max_fail_percentage 44071 1727204603.17235: done checking for max_fail_percentage 44071 1727204603.17236: checking to see if all hosts have failed and the running result is not ok 44071 1727204603.17237: done checking to see if all hosts have failed 44071 1727204603.17237: getting the remaining hosts for this loop 44071 1727204603.17239: done getting the remaining hosts for this loop 44071 1727204603.17244: getting the next task for host managed-node2 44071 1727204603.17254: done getting next task for host managed-node2 44071 1727204603.17257: ^ task is: TASK: meta (role_complete) 44071 1727204603.17262: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204603.17275: getting variables 44071 1727204603.17277: in VariableManager get_vars() 44071 1727204603.17319: Calling all_inventory to load vars for managed-node2 44071 1727204603.17322: Calling groups_inventory to load vars for managed-node2 44071 1727204603.17324: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204603.17335: Calling all_plugins_play to load vars for managed-node2 44071 1727204603.17338: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204603.17341: Calling groups_plugins_play to load vars for managed-node2 44071 1727204603.19324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204603.20614: done with get_vars() 44071 1727204603.20643: done getting variables 44071 1727204603.20713: done queuing things up, now waiting for results queue to drain 44071 1727204603.20715: results queue empty 44071 1727204603.20715: checking for any_errors_fatal 44071 1727204603.20718: done checking for any_errors_fatal 44071 1727204603.20718: checking for max_fail_percentage 44071 1727204603.20719: done checking for max_fail_percentage 44071 1727204603.20720: checking to see if all hosts have failed and the running result is not ok 44071 1727204603.20720: done checking to see if all hosts have failed 44071 1727204603.20721: getting the remaining hosts for this loop 44071 1727204603.20721: done getting the remaining hosts for this loop 44071 1727204603.20723: getting the next task for host managed-node2 44071 1727204603.20728: done getting next task for host managed-node2 44071 1727204603.20729: ^ task is: TASK: Show result 44071 1727204603.20731: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 44071 1727204603.20733: getting variables 44071 1727204603.20733: in VariableManager get_vars() 44071 1727204603.20744: Calling all_inventory to load vars for managed-node2 44071 1727204603.20746: Calling groups_inventory to load vars for managed-node2 44071 1727204603.20748: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204603.20752: Calling all_plugins_play to load vars for managed-node2 44071 1727204603.20754: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204603.20755: Calling groups_plugins_play to load vars for managed-node2 44071 1727204603.21799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204603.23373: done with get_vars() 44071 1727204603.23401: done getting variables 44071 1727204603.23442: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Tuesday 24 September 2024 15:03:23 -0400 (0:00:00.517) 0:00:15.551 ***** 44071 1727204603.23468: entering _queue_task() for managed-node2/debug 44071 1727204603.23758: worker is 1 (out of 1 available) 44071 1727204603.23775: exiting _queue_task() for managed-node2/debug 44071 1727204603.23789: done queuing things up, now waiting for results queue to drain 44071 1727204603.23791: waiting for pending results... 
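
The "Show result" task just queued is another `debug` action; it re-prints the same `__network_connections_result` fact that the role already dumped above ("Show debug messages for the network_connections"). The structure of that fact is plain enough to check programmatically. The snippet below is a hypothetical helper, not part of this test suite, that validates only the fields visible in the logged result (the `statebr` bridge connection, the `changed`/`failed` flags, and the provider trace in `stderr_lines`).

```python
# Hypothetical helper, not part of the test suite: sanity-check the shape of the
# __network_connections_result fact as it appears in the log output above.
from typing import Any, Dict


def check_connections_result(result: Dict[str, Any], profile: str = "statebr") -> None:
    # The role reports a change because the bridge profile was newly created.
    assert result["changed"] is True
    assert result["failed"] is False

    # module_args echoes back the requested connection list ("statebr", type "bridge").
    connections = result["_invocation"]["module_args"]["connections"]
    assert any(c["name"] == profile and c["type"] == "bridge" for c in connections)

    # stderr_lines carries the provider's trace, including the new connection's UUID.
    assert any(profile in line for line in result["stderr_lines"])


if __name__ == "__main__":
    # Example shaped like the result printed in this log.
    example = {
        "changed": True,
        "failed": False,
        "_invocation": {"module_args": {"connections": [
            {"name": "statebr", "type": "bridge", "persistent_state": "present",
             "ip": {"auto6": False, "dhcp4": False}},
        ]}},
        "stderr_lines": [
            "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, "
            "f7d54dee-54f0-42d3-8296-dcee7d3104de"
        ],
    }
    check_connections_result(example)
    print("result structure looks as expected")
```
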
44071 1727204603.23984: running TaskExecutor() for managed-node2/TASK: Show result 44071 1727204603.24076: in run() - task 127b8e07-fff9-c964-7471-00000000018f 44071 1727204603.24089: variable 'ansible_search_path' from source: unknown 44071 1727204603.24094: variable 'ansible_search_path' from source: unknown 44071 1727204603.24130: calling self._execute() 44071 1727204603.24201: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204603.24208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204603.24217: variable 'omit' from source: magic vars 44071 1727204603.24524: variable 'ansible_distribution_major_version' from source: facts 44071 1727204603.24535: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204603.24543: variable 'omit' from source: magic vars 44071 1727204603.24585: variable 'omit' from source: magic vars 44071 1727204603.24613: variable 'omit' from source: magic vars 44071 1727204603.24650: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204603.24684: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204603.24701: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204603.24718: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204603.24729: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204603.24755: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204603.24759: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204603.24762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204603.24844: Set connection var ansible_connection to ssh 44071 1727204603.24847: Set connection var ansible_timeout to 10 44071 1727204603.24854: Set connection var ansible_pipelining to False 44071 1727204603.24859: Set connection var ansible_shell_type to sh 44071 1727204603.24864: Set connection var ansible_shell_executable to /bin/sh 44071 1727204603.24873: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204603.24894: variable 'ansible_shell_executable' from source: unknown 44071 1727204603.24899: variable 'ansible_connection' from source: unknown 44071 1727204603.24902: variable 'ansible_module_compression' from source: unknown 44071 1727204603.24905: variable 'ansible_shell_type' from source: unknown 44071 1727204603.24908: variable 'ansible_shell_executable' from source: unknown 44071 1727204603.24910: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204603.24913: variable 'ansible_pipelining' from source: unknown 44071 1727204603.24915: variable 'ansible_timeout' from source: unknown 44071 1727204603.24921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204603.25037: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204603.25047: variable 'omit' from source: magic vars 44071 1727204603.25053: 
starting attempt loop 44071 1727204603.25056: running the handler 44071 1727204603.25097: variable '__network_connections_result' from source: set_fact 44071 1727204603.25166: variable '__network_connections_result' from source: set_fact 44071 1727204603.25259: handler run complete 44071 1727204603.25281: attempt loop complete, returning result 44071 1727204603.25285: _execute() done 44071 1727204603.25288: dumping result to json 44071 1727204603.25293: done dumping result, returning 44071 1727204603.25301: done running TaskExecutor() for managed-node2/TASK: Show result [127b8e07-fff9-c964-7471-00000000018f] 44071 1727204603.25304: sending task result for task 127b8e07-fff9-c964-7471-00000000018f 44071 1727204603.25405: done sending task result for task 127b8e07-fff9-c964-7471-00000000018f 44071 1727204603.25408: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, f7d54dee-54f0-42d3-8296-dcee7d3104de\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, f7d54dee-54f0-42d3-8296-dcee7d3104de" ] } } 44071 1727204603.25508: no more pending results, returning what we have 44071 1727204603.25511: results queue empty 44071 1727204603.25512: checking for any_errors_fatal 44071 1727204603.25513: done checking for any_errors_fatal 44071 1727204603.25514: checking for max_fail_percentage 44071 1727204603.25516: done checking for max_fail_percentage 44071 1727204603.25518: checking to see if all hosts have failed and the running result is not ok 44071 1727204603.25519: done checking to see if all hosts have failed 44071 1727204603.25520: getting the remaining hosts for this loop 44071 1727204603.25522: done getting the remaining hosts for this loop 44071 1727204603.25527: getting the next task for host managed-node2 44071 1727204603.25536: done getting next task for host managed-node2 44071 1727204603.25542: ^ task is: TASK: Asserts 44071 1727204603.25544: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204603.25549: getting variables 44071 1727204603.25550: in VariableManager get_vars() 44071 1727204603.25581: Calling all_inventory to load vars for managed-node2 44071 1727204603.25583: Calling groups_inventory to load vars for managed-node2 44071 1727204603.25586: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204603.25597: Calling all_plugins_play to load vars for managed-node2 44071 1727204603.25599: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204603.25602: Calling groups_plugins_play to load vars for managed-node2 44071 1727204603.30709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204603.31886: done with get_vars() 44071 1727204603.31911: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Tuesday 24 September 2024 15:03:23 -0400 (0:00:00.085) 0:00:15.636 ***** 44071 1727204603.31982: entering _queue_task() for managed-node2/include_tasks 44071 1727204603.32271: worker is 1 (out of 1 available) 44071 1727204603.32287: exiting _queue_task() for managed-node2/include_tasks 44071 1727204603.32299: done queuing things up, now waiting for results queue to drain 44071 1727204603.32302: waiting for pending results... 44071 1727204603.32510: running TaskExecutor() for managed-node2/TASK: Asserts 44071 1727204603.32613: in run() - task 127b8e07-fff9-c964-7471-000000000096 44071 1727204603.32624: variable 'ansible_search_path' from source: unknown 44071 1727204603.32627: variable 'ansible_search_path' from source: unknown 44071 1727204603.32677: variable 'lsr_assert' from source: include params 44071 1727204603.32875: variable 'lsr_assert' from source: include params 44071 1727204603.32930: variable 'omit' from source: magic vars 44071 1727204603.33040: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204603.33051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204603.33061: variable 'omit' from source: magic vars 44071 1727204603.33262: variable 'ansible_distribution_major_version' from source: facts 44071 1727204603.33273: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204603.33279: variable 'item' from source: unknown 44071 1727204603.33334: variable 'item' from source: unknown 44071 1727204603.33361: variable 'item' from source: unknown 44071 1727204603.33414: variable 'item' from source: unknown 44071 1727204603.33563: dumping result to json 44071 1727204603.33567: done dumping result, returning 44071 1727204603.33570: done running TaskExecutor() for managed-node2/TASK: Asserts [127b8e07-fff9-c964-7471-000000000096] 44071 1727204603.33572: sending task result for task 127b8e07-fff9-c964-7471-000000000096 44071 1727204603.33613: done sending task result for task 127b8e07-fff9-c964-7471-000000000096 44071 1727204603.33615: WORKER PROCESS EXITING 44071 1727204603.33641: no more pending results, returning what we have 44071 1727204603.33645: in VariableManager get_vars() 44071 1727204603.33684: Calling all_inventory to load vars for managed-node2 44071 1727204603.33687: Calling groups_inventory to load vars for managed-node2 44071 1727204603.33690: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204603.33705: Calling all_plugins_play to load vars for 
managed-node2 44071 1727204603.33707: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204603.33710: Calling groups_plugins_play to load vars for managed-node2 44071 1727204603.34814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204603.36014: done with get_vars() 44071 1727204603.36041: variable 'ansible_search_path' from source: unknown 44071 1727204603.36042: variable 'ansible_search_path' from source: unknown 44071 1727204603.36079: we have included files to process 44071 1727204603.36080: generating all_blocks data 44071 1727204603.36083: done generating all_blocks data 44071 1727204603.36087: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 44071 1727204603.36087: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 44071 1727204603.36089: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 44071 1727204603.36245: in VariableManager get_vars() 44071 1727204603.36260: done with get_vars() 44071 1727204603.36452: done processing included file 44071 1727204603.36454: iterating over new_blocks loaded from include file 44071 1727204603.36455: in VariableManager get_vars() 44071 1727204603.36467: done with get_vars() 44071 1727204603.36468: filtering new block on tags 44071 1727204603.36504: done filtering new block on tags 44071 1727204603.36505: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node2 => (item=tasks/assert_profile_present.yml) 44071 1727204603.36510: extending task lists for all hosts with included blocks 44071 1727204603.37235: done extending task lists 44071 1727204603.37237: done processing included files 44071 1727204603.37237: results queue empty 44071 1727204603.37238: checking for any_errors_fatal 44071 1727204603.37244: done checking for any_errors_fatal 44071 1727204603.37244: checking for max_fail_percentage 44071 1727204603.37245: done checking for max_fail_percentage 44071 1727204603.37245: checking to see if all hosts have failed and the running result is not ok 44071 1727204603.37246: done checking to see if all hosts have failed 44071 1727204603.37247: getting the remaining hosts for this loop 44071 1727204603.37248: done getting the remaining hosts for this loop 44071 1727204603.37249: getting the next task for host managed-node2 44071 1727204603.37253: done getting next task for host managed-node2 44071 1727204603.37254: ^ task is: TASK: Include the task 'get_profile_stat.yml' 44071 1727204603.37256: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204603.37258: getting variables 44071 1727204603.37259: in VariableManager get_vars() 44071 1727204603.37269: Calling all_inventory to load vars for managed-node2 44071 1727204603.37271: Calling groups_inventory to load vars for managed-node2 44071 1727204603.37273: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204603.37279: Calling all_plugins_play to load vars for managed-node2 44071 1727204603.37281: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204603.37282: Calling groups_plugins_play to load vars for managed-node2 44071 1727204603.38134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204603.39426: done with get_vars() 44071 1727204603.39448: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 15:03:23 -0400 (0:00:00.075) 0:00:15.711 ***** 44071 1727204603.39518: entering _queue_task() for managed-node2/include_tasks 44071 1727204603.39813: worker is 1 (out of 1 available) 44071 1727204603.39829: exiting _queue_task() for managed-node2/include_tasks 44071 1727204603.39842: done queuing things up, now waiting for results queue to drain 44071 1727204603.39844: waiting for pending results... 44071 1727204603.40040: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 44071 1727204603.40129: in run() - task 127b8e07-fff9-c964-7471-000000000383 44071 1727204603.40148: variable 'ansible_search_path' from source: unknown 44071 1727204603.40152: variable 'ansible_search_path' from source: unknown 44071 1727204603.40191: calling self._execute() 44071 1727204603.40274: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204603.40280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204603.40292: variable 'omit' from source: magic vars 44071 1727204603.40628: variable 'ansible_distribution_major_version' from source: facts 44071 1727204603.40639: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204603.40648: _execute() done 44071 1727204603.40651: dumping result to json 44071 1727204603.40654: done dumping result, returning 44071 1727204603.40661: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [127b8e07-fff9-c964-7471-000000000383] 44071 1727204603.40669: sending task result for task 127b8e07-fff9-c964-7471-000000000383 44071 1727204603.40774: done sending task result for task 127b8e07-fff9-c964-7471-000000000383 44071 1727204603.40777: WORKER PROCESS EXITING 44071 1727204603.40808: no more pending results, returning what we have 44071 1727204603.40814: in VariableManager get_vars() 44071 1727204603.40854: Calling all_inventory to load vars for managed-node2 44071 1727204603.40857: Calling groups_inventory to load vars for managed-node2 44071 1727204603.40861: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204603.40878: Calling all_plugins_play to load vars for managed-node2 44071 1727204603.40881: Calling 
groups_plugins_inventory to load vars for managed-node2 44071 1727204603.40884: Calling groups_plugins_play to load vars for managed-node2 44071 1727204603.41979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204603.43186: done with get_vars() 44071 1727204603.43209: variable 'ansible_search_path' from source: unknown 44071 1727204603.43210: variable 'ansible_search_path' from source: unknown 44071 1727204603.43221: variable 'item' from source: include params 44071 1727204603.43336: variable 'item' from source: include params 44071 1727204603.43377: we have included files to process 44071 1727204603.43379: generating all_blocks data 44071 1727204603.43380: done generating all_blocks data 44071 1727204603.43382: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 44071 1727204603.43384: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 44071 1727204603.43386: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 44071 1727204603.44526: done processing included file 44071 1727204603.44529: iterating over new_blocks loaded from include file 44071 1727204603.44530: in VariableManager get_vars() 44071 1727204603.44550: done with get_vars() 44071 1727204603.44551: filtering new block on tags 44071 1727204603.44718: done filtering new block on tags 44071 1727204603.44722: in VariableManager get_vars() 44071 1727204603.44738: done with get_vars() 44071 1727204603.44740: filtering new block on tags 44071 1727204603.44783: done filtering new block on tags 44071 1727204603.44785: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 44071 1727204603.44789: extending task lists for all hosts with included blocks 44071 1727204603.44976: done extending task lists 44071 1727204603.44977: done processing included files 44071 1727204603.44978: results queue empty 44071 1727204603.44978: checking for any_errors_fatal 44071 1727204603.44981: done checking for any_errors_fatal 44071 1727204603.44982: checking for max_fail_percentage 44071 1727204603.44982: done checking for max_fail_percentage 44071 1727204603.44983: checking to see if all hosts have failed and the running result is not ok 44071 1727204603.44983: done checking to see if all hosts have failed 44071 1727204603.44984: getting the remaining hosts for this loop 44071 1727204603.44985: done getting the remaining hosts for this loop 44071 1727204603.44987: getting the next task for host managed-node2 44071 1727204603.44990: done getting next task for host managed-node2 44071 1727204603.44992: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 44071 1727204603.44994: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204603.44996: getting variables 44071 1727204603.44996: in VariableManager get_vars() 44071 1727204603.45003: Calling all_inventory to load vars for managed-node2 44071 1727204603.45005: Calling groups_inventory to load vars for managed-node2 44071 1727204603.45006: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204603.45011: Calling all_plugins_play to load vars for managed-node2 44071 1727204603.45012: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204603.45014: Calling groups_plugins_play to load vars for managed-node2 44071 1727204603.45895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204603.47390: done with get_vars() 44071 1727204603.47427: done getting variables 44071 1727204603.47481: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 15:03:23 -0400 (0:00:00.079) 0:00:15.791 ***** 44071 1727204603.47512: entering _queue_task() for managed-node2/set_fact 44071 1727204603.47898: worker is 1 (out of 1 available) 44071 1727204603.47912: exiting _queue_task() for managed-node2/set_fact 44071 1727204603.47926: done queuing things up, now waiting for results queue to drain 44071 1727204603.47928: waiting for pending results... 
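The set_fact task queued here seeds the flags that the later profile assertions read. Judging from the ansible_facts it returns a few lines below (lsr_net_profile_exists, lsr_net_profile_ansible_managed and lsr_net_profile_fingerprint, all false), the task in get_profile_stat.yml is probably a plain initialization along the lines of this sketch; the exact YAML is not reproduced in the log, so treat it as an assumption:

- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false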
44071 1727204603.48297: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 44071 1727204603.48349: in run() - task 127b8e07-fff9-c964-7471-0000000003fe 44071 1727204603.48502: variable 'ansible_search_path' from source: unknown 44071 1727204603.48506: variable 'ansible_search_path' from source: unknown 44071 1727204603.48510: calling self._execute() 44071 1727204603.48540: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204603.48553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204603.48570: variable 'omit' from source: magic vars 44071 1727204603.48997: variable 'ansible_distribution_major_version' from source: facts 44071 1727204603.49017: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204603.49027: variable 'omit' from source: magic vars 44071 1727204603.49096: variable 'omit' from source: magic vars 44071 1727204603.49139: variable 'omit' from source: magic vars 44071 1727204603.49194: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204603.49238: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204603.49272: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204603.49296: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204603.49314: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204603.49352: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204603.49361: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204603.49377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204603.49493: Set connection var ansible_connection to ssh 44071 1727204603.49505: Set connection var ansible_timeout to 10 44071 1727204603.49570: Set connection var ansible_pipelining to False 44071 1727204603.49573: Set connection var ansible_shell_type to sh 44071 1727204603.49576: Set connection var ansible_shell_executable to /bin/sh 44071 1727204603.49578: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204603.49580: variable 'ansible_shell_executable' from source: unknown 44071 1727204603.49582: variable 'ansible_connection' from source: unknown 44071 1727204603.49590: variable 'ansible_module_compression' from source: unknown 44071 1727204603.49595: variable 'ansible_shell_type' from source: unknown 44071 1727204603.49604: variable 'ansible_shell_executable' from source: unknown 44071 1727204603.49610: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204603.49617: variable 'ansible_pipelining' from source: unknown 44071 1727204603.49625: variable 'ansible_timeout' from source: unknown 44071 1727204603.49633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204603.49795: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204603.49817: variable 
'omit' from source: magic vars 44071 1727204603.49827: starting attempt loop 44071 1727204603.49917: running the handler 44071 1727204603.49921: handler run complete 44071 1727204603.49924: attempt loop complete, returning result 44071 1727204603.49926: _execute() done 44071 1727204603.49929: dumping result to json 44071 1727204603.49931: done dumping result, returning 44071 1727204603.49934: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [127b8e07-fff9-c964-7471-0000000003fe] 44071 1727204603.49936: sending task result for task 127b8e07-fff9-c964-7471-0000000003fe 44071 1727204603.50016: done sending task result for task 127b8e07-fff9-c964-7471-0000000003fe 44071 1727204603.50069: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 44071 1727204603.50137: no more pending results, returning what we have 44071 1727204603.50142: results queue empty 44071 1727204603.50144: checking for any_errors_fatal 44071 1727204603.50146: done checking for any_errors_fatal 44071 1727204603.50147: checking for max_fail_percentage 44071 1727204603.50148: done checking for max_fail_percentage 44071 1727204603.50149: checking to see if all hosts have failed and the running result is not ok 44071 1727204603.50150: done checking to see if all hosts have failed 44071 1727204603.50150: getting the remaining hosts for this loop 44071 1727204603.50152: done getting the remaining hosts for this loop 44071 1727204603.50157: getting the next task for host managed-node2 44071 1727204603.50169: done getting next task for host managed-node2 44071 1727204603.50172: ^ task is: TASK: Stat profile file 44071 1727204603.50178: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204603.50183: getting variables 44071 1727204603.50185: in VariableManager get_vars() 44071 1727204603.50220: Calling all_inventory to load vars for managed-node2 44071 1727204603.50223: Calling groups_inventory to load vars for managed-node2 44071 1727204603.50226: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204603.50241: Calling all_plugins_play to load vars for managed-node2 44071 1727204603.50244: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204603.50247: Calling groups_plugins_play to load vars for managed-node2 44071 1727204603.52338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204603.54470: done with get_vars() 44071 1727204603.54507: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 15:03:23 -0400 (0:00:00.070) 0:00:15.862 ***** 44071 1727204603.54615: entering _queue_task() for managed-node2/stat 44071 1727204603.54993: worker is 1 (out of 1 available) 44071 1727204603.55008: exiting _queue_task() for managed-node2/stat 44071 1727204603.55022: done queuing things up, now waiting for results queue to drain 44071 1727204603.55024: waiting for pending results... 44071 1727204603.55330: running TaskExecutor() for managed-node2/TASK: Stat profile file 44071 1727204603.55472: in run() - task 127b8e07-fff9-c964-7471-0000000003ff 44071 1727204603.55503: variable 'ansible_search_path' from source: unknown 44071 1727204603.55511: variable 'ansible_search_path' from source: unknown 44071 1727204603.55560: calling self._execute() 44071 1727204603.55669: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204603.55682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204603.55699: variable 'omit' from source: magic vars 44071 1727204603.56099: variable 'ansible_distribution_major_version' from source: facts 44071 1727204603.56118: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204603.56129: variable 'omit' from source: magic vars 44071 1727204603.56202: variable 'omit' from source: magic vars 44071 1727204603.56315: variable 'profile' from source: play vars 44071 1727204603.56325: variable 'interface' from source: play vars 44071 1727204603.56408: variable 'interface' from source: play vars 44071 1727204603.56570: variable 'omit' from source: magic vars 44071 1727204603.56573: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204603.56576: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204603.56578: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204603.56580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204603.56582: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204603.56617: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204603.56626: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204603.56634: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204603.56748: Set connection var ansible_connection to ssh 44071 1727204603.56760: Set connection var ansible_timeout to 10 44071 1727204603.56772: Set connection var ansible_pipelining to False 44071 1727204603.56781: Set connection var ansible_shell_type to sh 44071 1727204603.56790: Set connection var ansible_shell_executable to /bin/sh 44071 1727204603.56806: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204603.56835: variable 'ansible_shell_executable' from source: unknown 44071 1727204603.56843: variable 'ansible_connection' from source: unknown 44071 1727204603.56851: variable 'ansible_module_compression' from source: unknown 44071 1727204603.56858: variable 'ansible_shell_type' from source: unknown 44071 1727204603.56864: variable 'ansible_shell_executable' from source: unknown 44071 1727204603.56873: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204603.56880: variable 'ansible_pipelining' from source: unknown 44071 1727204603.56887: variable 'ansible_timeout' from source: unknown 44071 1727204603.56895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204603.57133: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204603.57241: variable 'omit' from source: magic vars 44071 1727204603.57244: starting attempt loop 44071 1727204603.57247: running the handler 44071 1727204603.57249: _low_level_execute_command(): starting 44071 1727204603.57252: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204603.57964: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204603.57982: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204603.58007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204603.58084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204603.58129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204603.58146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204603.58175: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204603.58283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204603.60092: stdout chunk (state=3): >>>/root <<< 44071 1727204603.60295: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 44071 1727204603.60404: stderr chunk (state=3): >>><<< 44071 1727204603.60408: stdout chunk (state=3): >>><<< 44071 1727204603.60439: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204603.60553: _low_level_execute_command(): starting 44071 1727204603.60558: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204603.6044037-44776-101607428388569 `" && echo ansible-tmp-1727204603.6044037-44776-101607428388569="` echo /root/.ansible/tmp/ansible-tmp-1727204603.6044037-44776-101607428388569 `" ) && sleep 0' 44071 1727204603.61183: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204603.61202: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204603.61222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204603.61245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204603.61272: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204603.61343: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204603.61384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204603.61402: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204603.61427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204603.61544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204603.63648: stdout chunk (state=3): 
>>>ansible-tmp-1727204603.6044037-44776-101607428388569=/root/.ansible/tmp/ansible-tmp-1727204603.6044037-44776-101607428388569 <<< 44071 1727204603.63864: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204603.63881: stderr chunk (state=3): >>><<< 44071 1727204603.63890: stdout chunk (state=3): >>><<< 44071 1727204603.64175: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204603.6044037-44776-101607428388569=/root/.ansible/tmp/ansible-tmp-1727204603.6044037-44776-101607428388569 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204603.64179: variable 'ansible_module_compression' from source: unknown 44071 1727204603.64181: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 44071 1727204603.64320: variable 'ansible_facts' from source: unknown 44071 1727204603.64492: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204603.6044037-44776-101607428388569/AnsiballZ_stat.py 44071 1727204603.64797: Sending initial data 44071 1727204603.64808: Sent initial data (153 bytes) 44071 1727204603.65633: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204603.65683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204603.65715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204603.65727: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204603.65752: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 44071 1727204603.65852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204603.67624: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204603.67687: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204603.67789: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpfsbsmxza /root/.ansible/tmp/ansible-tmp-1727204603.6044037-44776-101607428388569/AnsiballZ_stat.py <<< 44071 1727204603.67797: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204603.6044037-44776-101607428388569/AnsiballZ_stat.py" <<< 44071 1727204603.67894: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpfsbsmxza" to remote "/root/.ansible/tmp/ansible-tmp-1727204603.6044037-44776-101607428388569/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204603.6044037-44776-101607428388569/AnsiballZ_stat.py" <<< 44071 1727204603.69548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204603.69552: stderr chunk (state=3): >>><<< 44071 1727204603.69554: stdout chunk (state=3): >>><<< 44071 1727204603.69556: done transferring module to remote 44071 1727204603.69559: _low_level_execute_command(): starting 44071 1727204603.69561: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204603.6044037-44776-101607428388569/ /root/.ansible/tmp/ansible-tmp-1727204603.6044037-44776-101607428388569/AnsiballZ_stat.py && sleep 0' 44071 1727204603.70940: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204603.70946: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204603.70948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204603.70951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204603.70953: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204603.70956: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204603.70958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204603.70965: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204603.70970: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204603.70972: stderr chunk (state=3): >>>debug1: re-parsing 
configuration <<< 44071 1727204603.70974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204603.70984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204603.70988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204603.70990: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204603.70992: stderr chunk (state=3): >>>debug2: match found <<< 44071 1727204603.70995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204603.70997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204603.70999: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204603.71271: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204603.71376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204603.73236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204603.73771: stderr chunk (state=3): >>><<< 44071 1727204603.73777: stdout chunk (state=3): >>><<< 44071 1727204603.73780: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204603.73782: _low_level_execute_command(): starting 44071 1727204603.73785: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204603.6044037-44776-101607428388569/AnsiballZ_stat.py && sleep 0' 44071 1727204603.74750: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204603.74756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204603.74984: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204603.75091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204603.75235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204603.91805: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 44071 1727204603.93214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204603.93223: stdout chunk (state=3): >>><<< 44071 1727204603.93226: stderr chunk (state=3): >>><<< 44071 1727204603.93250: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
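The stat call above reports exists: false for /etc/sysconfig/network-scripts/ifcfg-statebr. Reconstructed from the module_args echoed in that result, and from the profile_stat.stat.exists condition evaluated two tasks later, the underlying task is likely close to the following sketch; templating the path with the profile variable (rather than the literal "statebr") is an assumption:

- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat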
44071 1727204603.93372: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204603.6044037-44776-101607428388569/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204603.93376: _low_level_execute_command(): starting 44071 1727204603.93378: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204603.6044037-44776-101607428388569/ > /dev/null 2>&1 && sleep 0' 44071 1727204603.93980: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204603.93998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204603.94012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204603.94033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204603.94058: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204603.94079: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204603.94181: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204603.94197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204603.94308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204603.96431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204603.96436: stdout chunk (state=3): >>><<< 44071 1727204603.96442: stderr chunk (state=3): >>><<< 44071 1727204603.96563: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204603.96569: handler run complete 44071 1727204603.96572: attempt loop complete, returning result 44071 1727204603.96575: _execute() done 44071 1727204603.96577: dumping result to json 44071 1727204603.96579: done dumping result, returning 44071 1727204603.96581: done running TaskExecutor() for managed-node2/TASK: Stat profile file [127b8e07-fff9-c964-7471-0000000003ff] 44071 1727204603.96583: sending task result for task 127b8e07-fff9-c964-7471-0000000003ff 44071 1727204603.96875: done sending task result for task 127b8e07-fff9-c964-7471-0000000003ff 44071 1727204603.96879: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 44071 1727204603.96955: no more pending results, returning what we have 44071 1727204603.96960: results queue empty 44071 1727204603.96960: checking for any_errors_fatal 44071 1727204603.96970: done checking for any_errors_fatal 44071 1727204603.96972: checking for max_fail_percentage 44071 1727204603.96973: done checking for max_fail_percentage 44071 1727204603.96974: checking to see if all hosts have failed and the running result is not ok 44071 1727204603.96975: done checking to see if all hosts have failed 44071 1727204603.96976: getting the remaining hosts for this loop 44071 1727204603.96977: done getting the remaining hosts for this loop 44071 1727204603.96983: getting the next task for host managed-node2 44071 1727204603.96993: done getting next task for host managed-node2 44071 1727204603.96997: ^ task is: TASK: Set NM profile exist flag based on the profile files 44071 1727204603.97002: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204603.97008: getting variables 44071 1727204603.97009: in VariableManager get_vars() 44071 1727204603.97049: Calling all_inventory to load vars for managed-node2 44071 1727204603.97052: Calling groups_inventory to load vars for managed-node2 44071 1727204603.97057: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204603.97273: Calling all_plugins_play to load vars for managed-node2 44071 1727204603.97277: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204603.97282: Calling groups_plugins_play to load vars for managed-node2 44071 1727204603.99022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204604.02149: done with get_vars() 44071 1727204604.02395: done getting variables 44071 1727204604.02469: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:03:24 -0400 (0:00:00.478) 0:00:16.341 ***** 44071 1727204604.02506: entering _queue_task() for managed-node2/set_fact 44071 1727204604.03180: worker is 1 (out of 1 available) 44071 1727204604.03196: exiting _queue_task() for managed-node2/set_fact 44071 1727204604.03211: done queuing things up, now waiting for results queue to drain 44071 1727204604.03213: waiting for pending results... 
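This set_fact is skipped just below because profile_stat.stat.exists evaluates to false. Given the skip reason and the flag initialized earlier, a plausible reading of the task is a conditional flip of the existence flag, roughly as sketched here (an assumption, not taken verbatim from the playbook):

- name: Set NM profile exist flag based on the profile files
  set_fact:
    lsr_net_profile_exists: true
  when: profile_stat.stat.exists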
44071 1727204604.03722: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 44071 1727204604.04012: in run() - task 127b8e07-fff9-c964-7471-000000000400 44071 1727204604.04095: variable 'ansible_search_path' from source: unknown 44071 1727204604.04112: variable 'ansible_search_path' from source: unknown 44071 1727204604.04162: calling self._execute() 44071 1727204604.04573: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204604.04577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204604.04582: variable 'omit' from source: magic vars 44071 1727204604.05378: variable 'ansible_distribution_major_version' from source: facts 44071 1727204604.05383: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204604.05743: variable 'profile_stat' from source: set_fact 44071 1727204604.05834: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204604.05847: when evaluation is False, skipping this task 44071 1727204604.05857: _execute() done 44071 1727204604.05929: dumping result to json 44071 1727204604.05940: done dumping result, returning 44071 1727204604.05954: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [127b8e07-fff9-c964-7471-000000000400] 44071 1727204604.05968: sending task result for task 127b8e07-fff9-c964-7471-000000000400 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204604.06174: no more pending results, returning what we have 44071 1727204604.06179: results queue empty 44071 1727204604.06180: checking for any_errors_fatal 44071 1727204604.06192: done checking for any_errors_fatal 44071 1727204604.06193: checking for max_fail_percentage 44071 1727204604.06195: done checking for max_fail_percentage 44071 1727204604.06196: checking to see if all hosts have failed and the running result is not ok 44071 1727204604.06196: done checking to see if all hosts have failed 44071 1727204604.06197: getting the remaining hosts for this loop 44071 1727204604.06199: done getting the remaining hosts for this loop 44071 1727204604.06205: getting the next task for host managed-node2 44071 1727204604.06213: done getting next task for host managed-node2 44071 1727204604.06216: ^ task is: TASK: Get NM profile info 44071 1727204604.06221: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 44071 1727204604.06227: getting variables 44071 1727204604.06229: in VariableManager get_vars() 44071 1727204604.06560: Calling all_inventory to load vars for managed-node2 44071 1727204604.06564: Calling groups_inventory to load vars for managed-node2 44071 1727204604.06572: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204604.06580: done sending task result for task 127b8e07-fff9-c964-7471-000000000400 44071 1727204604.06584: WORKER PROCESS EXITING 44071 1727204604.06600: Calling all_plugins_play to load vars for managed-node2 44071 1727204604.06603: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204604.06607: Calling groups_plugins_play to load vars for managed-node2 44071 1727204604.09035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204604.11899: done with get_vars() 44071 1727204604.11941: done getting variables 44071 1727204604.12050: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:03:24 -0400 (0:00:00.095) 0:00:16.437 ***** 44071 1727204604.12090: entering _queue_task() for managed-node2/shell 44071 1727204604.12092: Creating lock for shell 44071 1727204604.12493: worker is 1 (out of 1 available) 44071 1727204604.12508: exiting _queue_task() for managed-node2/shell 44071 1727204604.12523: done queuing things up, now waiting for results queue to drain 44071 1727204604.12524: waiting for pending results... 
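The shell task queued next asks NetworkManager about the profile. Its actual command line is not visible in this excerpt, so the sketch below is only an illustration of that kind of lookup, listing connections with nmcli and filtering on the profile name; the command, the register name and the error handling are all assumptions:

- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show | grep "{{ profile }}"
  register: nm_profile_exists
  ignore_errors: true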
44071 1727204604.12832: running TaskExecutor() for managed-node2/TASK: Get NM profile info 44071 1727204604.13074: in run() - task 127b8e07-fff9-c964-7471-000000000401 44071 1727204604.13082: variable 'ansible_search_path' from source: unknown 44071 1727204604.13085: variable 'ansible_search_path' from source: unknown 44071 1727204604.13087: calling self._execute() 44071 1727204604.13131: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204604.13136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204604.13148: variable 'omit' from source: magic vars 44071 1727204604.13579: variable 'ansible_distribution_major_version' from source: facts 44071 1727204604.13592: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204604.13599: variable 'omit' from source: magic vars 44071 1727204604.13749: variable 'omit' from source: magic vars 44071 1727204604.13791: variable 'profile' from source: play vars 44071 1727204604.13795: variable 'interface' from source: play vars 44071 1727204604.13879: variable 'interface' from source: play vars 44071 1727204604.13900: variable 'omit' from source: magic vars 44071 1727204604.13952: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204604.13998: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204604.14021: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204604.14040: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204604.14056: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204604.14094: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204604.14098: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204604.14101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204604.14271: Set connection var ansible_connection to ssh 44071 1727204604.14274: Set connection var ansible_timeout to 10 44071 1727204604.14277: Set connection var ansible_pipelining to False 44071 1727204604.14279: Set connection var ansible_shell_type to sh 44071 1727204604.14282: Set connection var ansible_shell_executable to /bin/sh 44071 1727204604.14286: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204604.14290: variable 'ansible_shell_executable' from source: unknown 44071 1727204604.14294: variable 'ansible_connection' from source: unknown 44071 1727204604.14297: variable 'ansible_module_compression' from source: unknown 44071 1727204604.14300: variable 'ansible_shell_type' from source: unknown 44071 1727204604.14304: variable 'ansible_shell_executable' from source: unknown 44071 1727204604.14307: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204604.14372: variable 'ansible_pipelining' from source: unknown 44071 1727204604.14376: variable 'ansible_timeout' from source: unknown 44071 1727204604.14379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204604.14493: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204604.14506: variable 'omit' from source: magic vars 44071 1727204604.14532: starting attempt loop 44071 1727204604.14535: running the handler 44071 1727204604.14539: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204604.14552: _low_level_execute_command(): starting 44071 1727204604.14571: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204604.15350: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204604.15369: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204604.15385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204604.15405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204604.15422: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204604.15438: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204604.15482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204604.15551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204604.15573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204604.15595: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204604.15679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204604.17409: stdout chunk (state=3): >>>/root <<< 44071 1727204604.17673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204604.17677: stdout chunk (state=3): >>><<< 44071 1727204604.17680: stderr chunk (state=3): >>><<< 44071 1727204604.17685: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204604.17687: _low_level_execute_command(): starting 44071 1727204604.17689: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204604.176214-44806-74141951518688 `" && echo ansible-tmp-1727204604.176214-44806-74141951518688="` echo /root/.ansible/tmp/ansible-tmp-1727204604.176214-44806-74141951518688 `" ) && sleep 0' 44071 1727204604.18494: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204604.18512: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204604.18615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204604.18662: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204604.18694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204604.18798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204604.20839: stdout chunk (state=3): >>>ansible-tmp-1727204604.176214-44806-74141951518688=/root/.ansible/tmp/ansible-tmp-1727204604.176214-44806-74141951518688 <<< 44071 1727204604.21049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204604.21053: stdout chunk (state=3): >>><<< 44071 1727204604.21056: stderr chunk (state=3): >>><<< 44071 1727204604.21076: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204604.176214-44806-74141951518688=/root/.ansible/tmp/ansible-tmp-1727204604.176214-44806-74141951518688 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204604.21272: variable 'ansible_module_compression' from source: unknown 44071 1727204604.21275: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44071 1727204604.21278: variable 'ansible_facts' from source: unknown 44071 1727204604.21317: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204604.176214-44806-74141951518688/AnsiballZ_command.py 44071 1727204604.21523: Sending initial data 44071 1727204604.21532: Sent initial data (154 bytes) 44071 1727204604.22187: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204604.22203: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204604.22217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204604.22235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204604.22278: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204604.22290: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204604.22380: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204604.22400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204604.22507: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204604.24192: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports 
extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204604.24253: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204604.24360: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpprvem92q /root/.ansible/tmp/ansible-tmp-1727204604.176214-44806-74141951518688/AnsiballZ_command.py <<< 44071 1727204604.24364: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204604.176214-44806-74141951518688/AnsiballZ_command.py" <<< 44071 1727204604.24427: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpprvem92q" to remote "/root/.ansible/tmp/ansible-tmp-1727204604.176214-44806-74141951518688/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204604.176214-44806-74141951518688/AnsiballZ_command.py" <<< 44071 1727204604.25333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204604.25374: stderr chunk (state=3): >>><<< 44071 1727204604.25383: stdout chunk (state=3): >>><<< 44071 1727204604.25414: done transferring module to remote 44071 1727204604.25431: _low_level_execute_command(): starting 44071 1727204604.25475: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204604.176214-44806-74141951518688/ /root/.ansible/tmp/ansible-tmp-1727204604.176214-44806-74141951518688/AnsiballZ_command.py && sleep 0' 44071 1727204604.26122: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204604.26139: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204604.26155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204604.26238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204604.26246: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204604.26294: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204604.26312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204604.26341: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204604.26451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204604.28439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204604.28461: stdout chunk (state=3): >>><<< 44071 1727204604.28479: stderr chunk (state=3): >>><<< 44071 1727204604.28586: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204604.28590: _low_level_execute_command(): starting 44071 1727204604.28593: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204604.176214-44806-74141951518688/AnsiballZ_command.py && sleep 0' 44071 1727204604.29221: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204604.29284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204604.29351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204604.29375: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204604.29400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204604.29517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204604.48910: stdout chunk (state=3): >>> {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-24 15:03:24.468141", "end": "2024-09-24 15:03:24.487520", "delta": "0:00:00.019379", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44071 1727204604.50710: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204604.50779: stderr chunk (state=3): >>><<< 44071 1727204604.50783: stdout chunk (state=3): >>><<< 44071 1727204604.50800: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-24 15:03:24.468141", "end": "2024-09-24 15:03:24.487520", "delta": "0:00:00.019379", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
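The module run that just finished follows the non-pipelined execution path spelled out in the trace: create a remote temp directory, transfer AnsiballZ_command.py over sftp, mark it executable, run it with /usr/bin/python3.12 (the temp directory is removed a few lines below). That is expected, since the connection vars earlier show ansible_pipelining set to False. For comparison, a minimal sketch of enabling pipelining for these hosts, assuming a group_vars/all.yml file; the file name and placement are assumptions, not something taken from this run:

# group_vars/all.yml (assumed location) -- collapses the mkdir/sftp/chmod/exec/rm
# round trips into a single remote python invocation where become settings allow it
ansible_pipelining: true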
44071 1727204604.50833: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204604.176214-44806-74141951518688/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204604.50845: _low_level_execute_command(): starting 44071 1727204604.50848: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204604.176214-44806-74141951518688/ > /dev/null 2>&1 && sleep 0' 44071 1727204604.51347: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204604.51352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204604.51354: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204604.51364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204604.51424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204604.51428: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204604.51440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204604.51523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204604.53572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204604.53625: stderr chunk (state=3): >>><<< 44071 1727204604.53629: stdout chunk (state=3): >>><<< 44071 1727204604.53645: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204604.53654: handler run complete 44071 1727204604.53678: Evaluated conditional (False): False 44071 1727204604.53690: attempt loop complete, returning result 44071 1727204604.53693: _execute() done 44071 1727204604.53696: dumping result to json 44071 1727204604.53701: done dumping result, returning 44071 1727204604.53709: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [127b8e07-fff9-c964-7471-000000000401] 44071 1727204604.53713: sending task result for task 127b8e07-fff9-c964-7471-000000000401 44071 1727204604.53820: done sending task result for task 127b8e07-fff9-c964-7471-000000000401 44071 1727204604.53823: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.019379", "end": "2024-09-24 15:03:24.487520", "rc": 0, "start": "2024-09-24 15:03:24.468141" } STDOUT: statebr /etc/NetworkManager/system-connections/statebr.nmconnection 44071 1727204604.53900: no more pending results, returning what we have 44071 1727204604.53904: results queue empty 44071 1727204604.53904: checking for any_errors_fatal 44071 1727204604.53914: done checking for any_errors_fatal 44071 1727204604.53915: checking for max_fail_percentage 44071 1727204604.53916: done checking for max_fail_percentage 44071 1727204604.53917: checking to see if all hosts have failed and the running result is not ok 44071 1727204604.53917: done checking to see if all hosts have failed 44071 1727204604.53918: getting the remaining hosts for this loop 44071 1727204604.53920: done getting the remaining hosts for this loop 44071 1727204604.53924: getting the next task for host managed-node2 44071 1727204604.53932: done getting next task for host managed-node2 44071 1727204604.53935: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 44071 1727204604.53942: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204604.53946: getting variables 44071 1727204604.53948: in VariableManager get_vars() 44071 1727204604.53983: Calling all_inventory to load vars for managed-node2 44071 1727204604.53986: Calling groups_inventory to load vars for managed-node2 44071 1727204604.53990: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204604.54001: Calling all_plugins_play to load vars for managed-node2 44071 1727204604.54004: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204604.54006: Calling groups_plugins_play to load vars for managed-node2 44071 1727204604.55184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204604.56392: done with get_vars() 44071 1727204604.56422: done getting variables 44071 1727204604.56480: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 15:03:24 -0400 (0:00:00.444) 0:00:16.881 ***** 44071 1727204604.56506: entering _queue_task() for managed-node2/set_fact 44071 1727204604.56802: worker is 1 (out of 1 available) 44071 1727204604.56819: exiting _queue_task() for managed-node2/set_fact 44071 1727204604.56835: done queuing things up, now waiting for results queue to drain 44071 1727204604.56837: waiting for pending results... 
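The "Get NM profile info" execution above reduces to a registered shell task: the command string and the register name (nm_profile_exists, referenced by the next task's conditional) are both visible in the trace, while the exact YAML at get_profile_stat.yml:25 is not, so the layout below is a reconstruction rather than the file itself:

- name: Get NM profile info
  shell: "nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc"
  register: nm_profile_exists
  # in this run {{ profile }} expands to "statebr"; any error handling on the task is
  # not visible in the trace and is omitted here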
44071 1727204604.57037: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 44071 1727204604.57131: in run() - task 127b8e07-fff9-c964-7471-000000000402 44071 1727204604.57146: variable 'ansible_search_path' from source: unknown 44071 1727204604.57150: variable 'ansible_search_path' from source: unknown 44071 1727204604.57186: calling self._execute() 44071 1727204604.57260: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204604.57269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204604.57279: variable 'omit' from source: magic vars 44071 1727204604.57586: variable 'ansible_distribution_major_version' from source: facts 44071 1727204604.57597: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204604.57708: variable 'nm_profile_exists' from source: set_fact 44071 1727204604.57724: Evaluated conditional (nm_profile_exists.rc == 0): True 44071 1727204604.57728: variable 'omit' from source: magic vars 44071 1727204604.57774: variable 'omit' from source: magic vars 44071 1727204604.57800: variable 'omit' from source: magic vars 44071 1727204604.57839: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204604.57873: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204604.57891: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204604.57906: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204604.57917: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204604.57947: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204604.57952: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204604.57954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204604.58034: Set connection var ansible_connection to ssh 44071 1727204604.58038: Set connection var ansible_timeout to 10 44071 1727204604.58050: Set connection var ansible_pipelining to False 44071 1727204604.58054: Set connection var ansible_shell_type to sh 44071 1727204604.58059: Set connection var ansible_shell_executable to /bin/sh 44071 1727204604.58068: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204604.58089: variable 'ansible_shell_executable' from source: unknown 44071 1727204604.58092: variable 'ansible_connection' from source: unknown 44071 1727204604.58095: variable 'ansible_module_compression' from source: unknown 44071 1727204604.58098: variable 'ansible_shell_type' from source: unknown 44071 1727204604.58100: variable 'ansible_shell_executable' from source: unknown 44071 1727204604.58102: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204604.58107: variable 'ansible_pipelining' from source: unknown 44071 1727204604.58109: variable 'ansible_timeout' from source: unknown 44071 1727204604.58114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204604.58232: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204604.58243: variable 'omit' from source: magic vars 44071 1727204604.58249: starting attempt loop 44071 1727204604.58252: running the handler 44071 1727204604.58268: handler run complete 44071 1727204604.58277: attempt loop complete, returning result 44071 1727204604.58279: _execute() done 44071 1727204604.58282: dumping result to json 44071 1727204604.58290: done dumping result, returning 44071 1727204604.58294: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [127b8e07-fff9-c964-7471-000000000402] 44071 1727204604.58299: sending task result for task 127b8e07-fff9-c964-7471-000000000402 44071 1727204604.58387: done sending task result for task 127b8e07-fff9-c964-7471-000000000402 44071 1727204604.58390: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 44071 1727204604.58449: no more pending results, returning what we have 44071 1727204604.58452: results queue empty 44071 1727204604.58453: checking for any_errors_fatal 44071 1727204604.58463: done checking for any_errors_fatal 44071 1727204604.58464: checking for max_fail_percentage 44071 1727204604.58467: done checking for max_fail_percentage 44071 1727204604.58468: checking to see if all hosts have failed and the running result is not ok 44071 1727204604.58469: done checking to see if all hosts have failed 44071 1727204604.58469: getting the remaining hosts for this loop 44071 1727204604.58472: done getting the remaining hosts for this loop 44071 1727204604.58477: getting the next task for host managed-node2 44071 1727204604.58488: done getting next task for host managed-node2 44071 1727204604.58491: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 44071 1727204604.58497: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204604.58502: getting variables 44071 1727204604.58503: in VariableManager get_vars() 44071 1727204604.58536: Calling all_inventory to load vars for managed-node2 44071 1727204604.58539: Calling groups_inventory to load vars for managed-node2 44071 1727204604.58542: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204604.58554: Calling all_plugins_play to load vars for managed-node2 44071 1727204604.58556: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204604.58559: Calling groups_plugins_play to load vars for managed-node2 44071 1727204604.59619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204604.61206: done with get_vars() 44071 1727204604.61252: done getting variables 44071 1727204604.61322: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204604.61459: variable 'profile' from source: play vars 44071 1727204604.61464: variable 'interface' from source: play vars 44071 1727204604.61535: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 15:03:24 -0400 (0:00:00.050) 0:00:16.932 ***** 44071 1727204604.61574: entering _queue_task() for managed-node2/command 44071 1727204604.61964: worker is 1 (out of 1 available) 44071 1727204604.61982: exiting _queue_task() for managed-node2/command 44071 1727204604.61998: done queuing things up, now waiting for results queue to drain 44071 1727204604.62000: waiting for pending results... 
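The set_fact result a little further up (the ok: block with the three lsr_net_profile_* facts) pins down both the condition and the facts involved, so that task can be sketched fairly closely; only the YAML layout is assumed:

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0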
44071 1727204604.62206: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-statebr 44071 1727204604.62311: in run() - task 127b8e07-fff9-c964-7471-000000000404 44071 1727204604.62324: variable 'ansible_search_path' from source: unknown 44071 1727204604.62329: variable 'ansible_search_path' from source: unknown 44071 1727204604.62367: calling self._execute() 44071 1727204604.62450: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204604.62455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204604.62459: variable 'omit' from source: magic vars 44071 1727204604.62753: variable 'ansible_distribution_major_version' from source: facts 44071 1727204604.62765: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204604.62860: variable 'profile_stat' from source: set_fact 44071 1727204604.62871: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204604.62875: when evaluation is False, skipping this task 44071 1727204604.62879: _execute() done 44071 1727204604.62882: dumping result to json 44071 1727204604.62885: done dumping result, returning 44071 1727204604.62898: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-statebr [127b8e07-fff9-c964-7471-000000000404] 44071 1727204604.62901: sending task result for task 127b8e07-fff9-c964-7471-000000000404 44071 1727204604.62997: done sending task result for task 127b8e07-fff9-c964-7471-000000000404 44071 1727204604.63000: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204604.63061: no more pending results, returning what we have 44071 1727204604.63064: results queue empty 44071 1727204604.63067: checking for any_errors_fatal 44071 1727204604.63077: done checking for any_errors_fatal 44071 1727204604.63077: checking for max_fail_percentage 44071 1727204604.63079: done checking for max_fail_percentage 44071 1727204604.63080: checking to see if all hosts have failed and the running result is not ok 44071 1727204604.63080: done checking to see if all hosts have failed 44071 1727204604.63081: getting the remaining hosts for this loop 44071 1727204604.63083: done getting the remaining hosts for this loop 44071 1727204604.63087: getting the next task for host managed-node2 44071 1727204604.63215: done getting next task for host managed-node2 44071 1727204604.63218: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 44071 1727204604.63224: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204604.63228: getting variables 44071 1727204604.63229: in VariableManager get_vars() 44071 1727204604.63261: Calling all_inventory to load vars for managed-node2 44071 1727204604.63263: Calling groups_inventory to load vars for managed-node2 44071 1727204604.63268: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204604.63279: Calling all_plugins_play to load vars for managed-node2 44071 1727204604.63281: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204604.63283: Calling groups_plugins_play to load vars for managed-node2 44071 1727204604.67362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204604.69727: done with get_vars() 44071 1727204604.69768: done getting variables 44071 1727204604.69834: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204604.70272: variable 'profile' from source: play vars 44071 1727204604.70277: variable 'interface' from source: play vars 44071 1727204604.70342: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 15:03:24 -0400 (0:00:00.090) 0:00:17.022 ***** 44071 1727204604.70583: entering _queue_task() for managed-node2/set_fact 44071 1727204604.71383: worker is 1 (out of 1 available) 44071 1727204604.71395: exiting _queue_task() for managed-node2/set_fact 44071 1727204604.71408: done queuing things up, now waiting for results queue to drain 44071 1727204604.71410: waiting for pending results... 
44071 1727204604.71890: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-statebr 44071 1727204604.72083: in run() - task 127b8e07-fff9-c964-7471-000000000405 44071 1727204604.72117: variable 'ansible_search_path' from source: unknown 44071 1727204604.72126: variable 'ansible_search_path' from source: unknown 44071 1727204604.72182: calling self._execute() 44071 1727204604.72296: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204604.72425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204604.72429: variable 'omit' from source: magic vars 44071 1727204604.72825: variable 'ansible_distribution_major_version' from source: facts 44071 1727204604.72856: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204604.73049: variable 'profile_stat' from source: set_fact 44071 1727204604.73072: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204604.73085: when evaluation is False, skipping this task 44071 1727204604.73092: _execute() done 44071 1727204604.73100: dumping result to json 44071 1727204604.73107: done dumping result, returning 44071 1727204604.73117: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [127b8e07-fff9-c964-7471-000000000405] 44071 1727204604.73126: sending task result for task 127b8e07-fff9-c964-7471-000000000405 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204604.73317: no more pending results, returning what we have 44071 1727204604.73321: results queue empty 44071 1727204604.73322: checking for any_errors_fatal 44071 1727204604.73329: done checking for any_errors_fatal 44071 1727204604.73330: checking for max_fail_percentage 44071 1727204604.73332: done checking for max_fail_percentage 44071 1727204604.73333: checking to see if all hosts have failed and the running result is not ok 44071 1727204604.73334: done checking to see if all hosts have failed 44071 1727204604.73334: getting the remaining hosts for this loop 44071 1727204604.73336: done getting the remaining hosts for this loop 44071 1727204604.73341: getting the next task for host managed-node2 44071 1727204604.73350: done getting next task for host managed-node2 44071 1727204604.73352: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 44071 1727204604.73357: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204604.73362: getting variables 44071 1727204604.73364: in VariableManager get_vars() 44071 1727204604.73401: Calling all_inventory to load vars for managed-node2 44071 1727204604.73404: Calling groups_inventory to load vars for managed-node2 44071 1727204604.73408: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204604.73415: done sending task result for task 127b8e07-fff9-c964-7471-000000000405 44071 1727204604.73419: WORKER PROCESS EXITING 44071 1727204604.73482: Calling all_plugins_play to load vars for managed-node2 44071 1727204604.73486: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204604.73489: Calling groups_plugins_play to load vars for managed-node2 44071 1727204604.77180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204604.80183: done with get_vars() 44071 1727204604.80227: done getting variables 44071 1727204604.80300: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204604.80423: variable 'profile' from source: play vars 44071 1727204604.80428: variable 'interface' from source: play vars 44071 1727204604.80493: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 15:03:24 -0400 (0:00:00.099) 0:00:17.121 ***** 44071 1727204604.80529: entering _queue_task() for managed-node2/command 44071 1727204604.81013: worker is 1 (out of 1 available) 44071 1727204604.81028: exiting _queue_task() for managed-node2/command 44071 1727204604.81041: done queuing things up, now waiting for results queue to drain 44071 1727204604.81043: waiting for pending results... 
44071 1727204604.81271: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-statebr 44071 1727204604.81574: in run() - task 127b8e07-fff9-c964-7471-000000000406 44071 1727204604.81580: variable 'ansible_search_path' from source: unknown 44071 1727204604.81584: variable 'ansible_search_path' from source: unknown 44071 1727204604.81587: calling self._execute() 44071 1727204604.81619: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204604.81633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204604.81648: variable 'omit' from source: magic vars 44071 1727204604.82063: variable 'ansible_distribution_major_version' from source: facts 44071 1727204604.82085: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204604.82225: variable 'profile_stat' from source: set_fact 44071 1727204604.82248: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204604.82256: when evaluation is False, skipping this task 44071 1727204604.82262: _execute() done 44071 1727204604.82272: dumping result to json 44071 1727204604.82279: done dumping result, returning 44071 1727204604.82290: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-statebr [127b8e07-fff9-c964-7471-000000000406] 44071 1727204604.82298: sending task result for task 127b8e07-fff9-c964-7471-000000000406 44071 1727204604.82530: done sending task result for task 127b8e07-fff9-c964-7471-000000000406 44071 1727204604.82533: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204604.82596: no more pending results, returning what we have 44071 1727204604.82601: results queue empty 44071 1727204604.82601: checking for any_errors_fatal 44071 1727204604.82611: done checking for any_errors_fatal 44071 1727204604.82612: checking for max_fail_percentage 44071 1727204604.82613: done checking for max_fail_percentage 44071 1727204604.82614: checking to see if all hosts have failed and the running result is not ok 44071 1727204604.82614: done checking to see if all hosts have failed 44071 1727204604.82615: getting the remaining hosts for this loop 44071 1727204604.82617: done getting the remaining hosts for this loop 44071 1727204604.82622: getting the next task for host managed-node2 44071 1727204604.82632: done getting next task for host managed-node2 44071 1727204604.82634: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 44071 1727204604.82640: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204604.82643: getting variables 44071 1727204604.82645: in VariableManager get_vars() 44071 1727204604.82683: Calling all_inventory to load vars for managed-node2 44071 1727204604.82686: Calling groups_inventory to load vars for managed-node2 44071 1727204604.82691: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204604.82707: Calling all_plugins_play to load vars for managed-node2 44071 1727204604.82710: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204604.82713: Calling groups_plugins_play to load vars for managed-node2 44071 1727204604.85411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204604.87636: done with get_vars() 44071 1727204604.87677: done getting variables 44071 1727204604.87737: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204604.87857: variable 'profile' from source: play vars 44071 1727204604.87861: variable 'interface' from source: play vars 44071 1727204604.88142: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 15:03:24 -0400 (0:00:00.076) 0:00:17.198 ***** 44071 1727204604.88183: entering _queue_task() for managed-node2/set_fact 44071 1727204604.88881: worker is 1 (out of 1 available) 44071 1727204604.88897: exiting _queue_task() for managed-node2/set_fact 44071 1727204604.88913: done queuing things up, now waiting for results queue to drain 44071 1727204604.88914: waiting for pending results... 
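The ifcfg comment checks in this stretch of get_profile_stat.yml (the ansible_managed pair above and the fingerprint pair, the second of which is queued just above) all skip, or are about to skip, for the same reason: they are gated on profile_stat.stat.exists, and the statebr profile lives under /etc/NetworkManager/system-connections rather than as an ifcfg file. A minimal sketch of that shared gate; the grep pattern and file path are illustrative guesses, not taken from the playbook:

- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep "^# Ansible managed" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}    # command is an illustrative guess
  when: profile_stat.stat.exists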
44071 1727204604.89791: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-statebr 44071 1727204604.90075: in run() - task 127b8e07-fff9-c964-7471-000000000407 44071 1727204604.90080: variable 'ansible_search_path' from source: unknown 44071 1727204604.90083: variable 'ansible_search_path' from source: unknown 44071 1727204604.90087: calling self._execute() 44071 1727204604.90350: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204604.90354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204604.90366: variable 'omit' from source: magic vars 44071 1727204604.91568: variable 'ansible_distribution_major_version' from source: facts 44071 1727204604.91582: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204604.91995: variable 'profile_stat' from source: set_fact 44071 1727204604.91999: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204604.92002: when evaluation is False, skipping this task 44071 1727204604.92005: _execute() done 44071 1727204604.92007: dumping result to json 44071 1727204604.92010: done dumping result, returning 44071 1727204604.92013: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-statebr [127b8e07-fff9-c964-7471-000000000407] 44071 1727204604.92015: sending task result for task 127b8e07-fff9-c964-7471-000000000407 44071 1727204604.92102: done sending task result for task 127b8e07-fff9-c964-7471-000000000407 44071 1727204604.92106: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204604.92167: no more pending results, returning what we have 44071 1727204604.92172: results queue empty 44071 1727204604.92173: checking for any_errors_fatal 44071 1727204604.92180: done checking for any_errors_fatal 44071 1727204604.92181: checking for max_fail_percentage 44071 1727204604.92183: done checking for max_fail_percentage 44071 1727204604.92183: checking to see if all hosts have failed and the running result is not ok 44071 1727204604.92184: done checking to see if all hosts have failed 44071 1727204604.92185: getting the remaining hosts for this loop 44071 1727204604.92187: done getting the remaining hosts for this loop 44071 1727204604.92193: getting the next task for host managed-node2 44071 1727204604.92205: done getting next task for host managed-node2 44071 1727204604.92208: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 44071 1727204604.92213: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204604.92219: getting variables 44071 1727204604.92221: in VariableManager get_vars() 44071 1727204604.92257: Calling all_inventory to load vars for managed-node2 44071 1727204604.92260: Calling groups_inventory to load vars for managed-node2 44071 1727204604.92264: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204604.92486: Calling all_plugins_play to load vars for managed-node2 44071 1727204604.92490: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204604.92493: Calling groups_plugins_play to load vars for managed-node2 44071 1727204604.94791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204604.97059: done with get_vars() 44071 1727204604.97097: done getting variables 44071 1727204604.97166: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204604.97299: variable 'profile' from source: play vars 44071 1727204604.97304: variable 'interface' from source: play vars 44071 1727204604.97372: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'statebr'] ************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 15:03:24 -0400 (0:00:00.092) 0:00:17.290 ***** 44071 1727204604.97411: entering _queue_task() for managed-node2/assert 44071 1727204604.98205: worker is 1 (out of 1 available) 44071 1727204604.98218: exiting _queue_task() for managed-node2/assert 44071 1727204604.98233: done queuing things up, now waiting for results queue to drain 44071 1727204604.98234: waiting for pending results... 
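
The run now walks through tests/network/playbooks/tasks/assert_profile_present.yml. Based only on the task names, the task path line numbers (5, 10 and 15) and the conditions the log shows being evaluated (lsr_net_profile_exists, lsr_net_profile_ansible_managed and lsr_net_profile_fingerprint), the file plausibly amounts to three assert tasks along these lines; this is a reconstruction from the log, not the verbatim file:

# Reconstructed from the task names and evaluated conditions visible in this log.
- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists

- name: Assert that the ansible managed comment is present in '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_ansible_managed

- name: Assert that the fingerprint comment is present in {{ profile }}
  assert:
    that:
      - lsr_net_profile_fingerprint
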
44071 1727204604.99090: running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'statebr' 44071 1727204604.99214: in run() - task 127b8e07-fff9-c964-7471-000000000384 44071 1727204604.99221: variable 'ansible_search_path' from source: unknown 44071 1727204604.99225: variable 'ansible_search_path' from source: unknown 44071 1727204604.99228: calling self._execute() 44071 1727204604.99452: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204604.99468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204604.99649: variable 'omit' from source: magic vars 44071 1727204605.00355: variable 'ansible_distribution_major_version' from source: facts 44071 1727204605.00430: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204605.00481: variable 'omit' from source: magic vars 44071 1727204605.00603: variable 'omit' from source: magic vars 44071 1727204605.01063: variable 'profile' from source: play vars 44071 1727204605.01069: variable 'interface' from source: play vars 44071 1727204605.01071: variable 'interface' from source: play vars 44071 1727204605.01185: variable 'omit' from source: magic vars 44071 1727204605.01241: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204605.01471: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204605.01475: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204605.01477: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204605.01480: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204605.01511: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204605.01577: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204605.01585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204605.01706: Set connection var ansible_connection to ssh 44071 1727204605.01836: Set connection var ansible_timeout to 10 44071 1727204605.01847: Set connection var ansible_pipelining to False 44071 1727204605.01857: Set connection var ansible_shell_type to sh 44071 1727204605.01868: Set connection var ansible_shell_executable to /bin/sh 44071 1727204605.01945: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204605.01980: variable 'ansible_shell_executable' from source: unknown 44071 1727204605.01987: variable 'ansible_connection' from source: unknown 44071 1727204605.01994: variable 'ansible_module_compression' from source: unknown 44071 1727204605.02000: variable 'ansible_shell_type' from source: unknown 44071 1727204605.02006: variable 'ansible_shell_executable' from source: unknown 44071 1727204605.02047: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204605.02055: variable 'ansible_pipelining' from source: unknown 44071 1727204605.02062: variable 'ansible_timeout' from source: unknown 44071 1727204605.02072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204605.02473: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204605.02477: variable 'omit' from source: magic vars 44071 1727204605.02480: starting attempt loop 44071 1727204605.02485: running the handler 44071 1727204605.02753: variable 'lsr_net_profile_exists' from source: set_fact 44071 1727204605.02767: Evaluated conditional (lsr_net_profile_exists): True 44071 1727204605.02806: handler run complete 44071 1727204605.02829: attempt loop complete, returning result 44071 1727204605.02874: _execute() done 44071 1727204605.02882: dumping result to json 44071 1727204605.02890: done dumping result, returning 44071 1727204605.02903: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'statebr' [127b8e07-fff9-c964-7471-000000000384] 44071 1727204605.02971: sending task result for task 127b8e07-fff9-c964-7471-000000000384 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 44071 1727204605.03264: no more pending results, returning what we have 44071 1727204605.03271: results queue empty 44071 1727204605.03271: checking for any_errors_fatal 44071 1727204605.03279: done checking for any_errors_fatal 44071 1727204605.03280: checking for max_fail_percentage 44071 1727204605.03281: done checking for max_fail_percentage 44071 1727204605.03282: checking to see if all hosts have failed and the running result is not ok 44071 1727204605.03283: done checking to see if all hosts have failed 44071 1727204605.03284: getting the remaining hosts for this loop 44071 1727204605.03286: done getting the remaining hosts for this loop 44071 1727204605.03291: getting the next task for host managed-node2 44071 1727204605.03300: done getting next task for host managed-node2 44071 1727204605.03302: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 44071 1727204605.03307: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204605.03312: getting variables 44071 1727204605.03313: in VariableManager get_vars() 44071 1727204605.03349: Calling all_inventory to load vars for managed-node2 44071 1727204605.03352: Calling groups_inventory to load vars for managed-node2 44071 1727204605.03357: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204605.03575: Calling all_plugins_play to load vars for managed-node2 44071 1727204605.03580: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204605.03586: Calling groups_plugins_play to load vars for managed-node2 44071 1727204605.04287: done sending task result for task 127b8e07-fff9-c964-7471-000000000384 44071 1727204605.04291: WORKER PROCESS EXITING 44071 1727204605.07813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204605.11291: done with get_vars() 44071 1727204605.11448: done getting variables 44071 1727204605.11517: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204605.11912: variable 'profile' from source: play vars 44071 1727204605.11918: variable 'interface' from source: play vars 44071 1727204605.12087: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'statebr'] ********* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 15:03:25 -0400 (0:00:00.147) 0:00:17.437 ***** 44071 1727204605.12137: entering _queue_task() for managed-node2/assert 44071 1727204605.12692: worker is 1 (out of 1 available) 44071 1727204605.12704: exiting _queue_task() for managed-node2/assert 44071 1727204605.12717: done queuing things up, now waiting for results queue to drain 44071 1727204605.12719: waiting for pending results... 
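
Each assert execution starts by resolving ansible_host and ansible_ssh_extra_args from the host vars for 'managed-node2' and then pinning the connection variables (connection ssh, timeout 10, pipelining off, /bin/sh). Those host vars would normally be supplied by the inventory; a hypothetical inventory entry consistent with this log could look like the following, where the variable names come from the log, the address is the one that shows up later in the ssh debug output, and the extra-args value is purely illustrative:

# Hypothetical inventory snippet; values marked illustrative are not present in the log.
all:
  hosts:
    managed-node2:
      ansible_host: 10.31.47.73                               # address seen in the ssh debug output below
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"   # illustrative value only
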
44071 1727204605.12951: running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'statebr' 44071 1727204605.13084: in run() - task 127b8e07-fff9-c964-7471-000000000385 44071 1727204605.13174: variable 'ansible_search_path' from source: unknown 44071 1727204605.13177: variable 'ansible_search_path' from source: unknown 44071 1727204605.13181: calling self._execute() 44071 1727204605.13270: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204605.13524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204605.13528: variable 'omit' from source: magic vars 44071 1727204605.13872: variable 'ansible_distribution_major_version' from source: facts 44071 1727204605.13878: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204605.13881: variable 'omit' from source: magic vars 44071 1727204605.13885: variable 'omit' from source: magic vars 44071 1727204605.13948: variable 'profile' from source: play vars 44071 1727204605.13952: variable 'interface' from source: play vars 44071 1727204605.14037: variable 'interface' from source: play vars 44071 1727204605.14060: variable 'omit' from source: magic vars 44071 1727204605.14113: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204605.14158: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204605.14227: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204605.14251: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204605.14444: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204605.14480: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204605.14484: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204605.14486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204605.14833: Set connection var ansible_connection to ssh 44071 1727204605.14837: Set connection var ansible_timeout to 10 44071 1727204605.14839: Set connection var ansible_pipelining to False 44071 1727204605.14841: Set connection var ansible_shell_type to sh 44071 1727204605.14844: Set connection var ansible_shell_executable to /bin/sh 44071 1727204605.14847: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204605.15093: variable 'ansible_shell_executable' from source: unknown 44071 1727204605.15096: variable 'ansible_connection' from source: unknown 44071 1727204605.15099: variable 'ansible_module_compression' from source: unknown 44071 1727204605.15102: variable 'ansible_shell_type' from source: unknown 44071 1727204605.15105: variable 'ansible_shell_executable' from source: unknown 44071 1727204605.15109: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204605.15111: variable 'ansible_pipelining' from source: unknown 44071 1727204605.15114: variable 'ansible_timeout' from source: unknown 44071 1727204605.15117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204605.15501: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204605.15507: variable 'omit' from source: magic vars 44071 1727204605.15510: starting attempt loop 44071 1727204605.15514: running the handler 44071 1727204605.15972: variable 'lsr_net_profile_ansible_managed' from source: set_fact 44071 1727204605.15976: Evaluated conditional (lsr_net_profile_ansible_managed): True 44071 1727204605.15979: handler run complete 44071 1727204605.15982: attempt loop complete, returning result 44071 1727204605.15984: _execute() done 44071 1727204605.15987: dumping result to json 44071 1727204605.15989: done dumping result, returning 44071 1727204605.15991: done running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'statebr' [127b8e07-fff9-c964-7471-000000000385] 44071 1727204605.15993: sending task result for task 127b8e07-fff9-c964-7471-000000000385 44071 1727204605.16101: done sending task result for task 127b8e07-fff9-c964-7471-000000000385 44071 1727204605.16106: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 44071 1727204605.16179: no more pending results, returning what we have 44071 1727204605.16184: results queue empty 44071 1727204605.16185: checking for any_errors_fatal 44071 1727204605.16193: done checking for any_errors_fatal 44071 1727204605.16194: checking for max_fail_percentage 44071 1727204605.16196: done checking for max_fail_percentage 44071 1727204605.16197: checking to see if all hosts have failed and the running result is not ok 44071 1727204605.16197: done checking to see if all hosts have failed 44071 1727204605.16198: getting the remaining hosts for this loop 44071 1727204605.16200: done getting the remaining hosts for this loop 44071 1727204605.16206: getting the next task for host managed-node2 44071 1727204605.16216: done getting next task for host managed-node2 44071 1727204605.16219: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 44071 1727204605.16224: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204605.16229: getting variables 44071 1727204605.16231: in VariableManager get_vars() 44071 1727204605.16579: Calling all_inventory to load vars for managed-node2 44071 1727204605.16584: Calling groups_inventory to load vars for managed-node2 44071 1727204605.16589: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204605.16605: Calling all_plugins_play to load vars for managed-node2 44071 1727204605.16609: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204605.16613: Calling groups_plugins_play to load vars for managed-node2 44071 1727204605.19350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204605.23432: done with get_vars() 44071 1727204605.23479: done getting variables 44071 1727204605.23560: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204605.23703: variable 'profile' from source: play vars 44071 1727204605.23708: variable 'interface' from source: play vars 44071 1727204605.23799: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in statebr] *************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 15:03:25 -0400 (0:00:00.116) 0:00:17.554 ***** 44071 1727204605.23840: entering _queue_task() for managed-node2/assert 44071 1727204605.24293: worker is 1 (out of 1 available) 44071 1727204605.24309: exiting _queue_task() for managed-node2/assert 44071 1727204605.24324: done queuing things up, now waiting for results queue to drain 44071 1727204605.24326: waiting for pending results... 
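
Right before each task banner the templar resolves 'profile' from play vars and then 'interface' from play vars twice, and every rendered task name uses statebr for both, which suggests the play defines profile in terms of interface. The play itself is not part of this excerpt, so the following is only a hedged reading of that variable trace:

# Hedged reading of the variable resolution order above; the real play vars are not shown.
vars:
  interface: statebr          # the rendered task names and paths all use statebr
  profile: "{{ interface }}"  # would explain profile resolving straight into interface
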
44071 1727204605.24686: running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in statebr 44071 1727204605.24726: in run() - task 127b8e07-fff9-c964-7471-000000000386 44071 1727204605.24751: variable 'ansible_search_path' from source: unknown 44071 1727204605.24759: variable 'ansible_search_path' from source: unknown 44071 1727204605.24805: calling self._execute() 44071 1727204605.24906: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204605.24919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204605.24938: variable 'omit' from source: magic vars 44071 1727204605.25364: variable 'ansible_distribution_major_version' from source: facts 44071 1727204605.25673: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204605.25676: variable 'omit' from source: magic vars 44071 1727204605.25679: variable 'omit' from source: magic vars 44071 1727204605.25914: variable 'profile' from source: play vars 44071 1727204605.25926: variable 'interface' from source: play vars 44071 1727204605.26171: variable 'interface' from source: play vars 44071 1727204605.26175: variable 'omit' from source: magic vars 44071 1727204605.26371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204605.26375: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204605.26377: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204605.26380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204605.26382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204605.26672: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204605.26675: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204605.26678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204605.26725: Set connection var ansible_connection to ssh 44071 1727204605.26737: Set connection var ansible_timeout to 10 44071 1727204605.26747: Set connection var ansible_pipelining to False 44071 1727204605.26757: Set connection var ansible_shell_type to sh 44071 1727204605.26768: Set connection var ansible_shell_executable to /bin/sh 44071 1727204605.26810: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204605.26903: variable 'ansible_shell_executable' from source: unknown 44071 1727204605.27026: variable 'ansible_connection' from source: unknown 44071 1727204605.27172: variable 'ansible_module_compression' from source: unknown 44071 1727204605.27175: variable 'ansible_shell_type' from source: unknown 44071 1727204605.27178: variable 'ansible_shell_executable' from source: unknown 44071 1727204605.27180: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204605.27182: variable 'ansible_pipelining' from source: unknown 44071 1727204605.27185: variable 'ansible_timeout' from source: unknown 44071 1727204605.27187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204605.27321: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204605.27342: variable 'omit' from source: magic vars 44071 1727204605.27354: starting attempt loop 44071 1727204605.27361: running the handler 44071 1727204605.27495: variable 'lsr_net_profile_fingerprint' from source: set_fact 44071 1727204605.27505: Evaluated conditional (lsr_net_profile_fingerprint): True 44071 1727204605.27514: handler run complete 44071 1727204605.27533: attempt loop complete, returning result 44071 1727204605.27538: _execute() done 44071 1727204605.27545: dumping result to json 44071 1727204605.27551: done dumping result, returning 44071 1727204605.27564: done running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in statebr [127b8e07-fff9-c964-7471-000000000386] 44071 1727204605.27575: sending task result for task 127b8e07-fff9-c964-7471-000000000386 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 44071 1727204605.27728: no more pending results, returning what we have 44071 1727204605.27731: results queue empty 44071 1727204605.27732: checking for any_errors_fatal 44071 1727204605.27743: done checking for any_errors_fatal 44071 1727204605.27743: checking for max_fail_percentage 44071 1727204605.27745: done checking for max_fail_percentage 44071 1727204605.27745: checking to see if all hosts have failed and the running result is not ok 44071 1727204605.27746: done checking to see if all hosts have failed 44071 1727204605.27747: getting the remaining hosts for this loop 44071 1727204605.27748: done getting the remaining hosts for this loop 44071 1727204605.27754: getting the next task for host managed-node2 44071 1727204605.27764: done getting next task for host managed-node2 44071 1727204605.27770: ^ task is: TASK: Conditional asserts 44071 1727204605.27774: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204605.27779: getting variables 44071 1727204605.27780: in VariableManager get_vars() 44071 1727204605.27814: Calling all_inventory to load vars for managed-node2 44071 1727204605.27817: Calling groups_inventory to load vars for managed-node2 44071 1727204605.27820: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204605.27835: Calling all_plugins_play to load vars for managed-node2 44071 1727204605.27840: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204605.27844: Calling groups_plugins_play to load vars for managed-node2 44071 1727204605.28368: done sending task result for task 127b8e07-fff9-c964-7471-000000000386 44071 1727204605.28375: WORKER PROCESS EXITING 44071 1727204605.35366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204605.37642: done with get_vars() 44071 1727204605.37688: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Tuesday 24 September 2024 15:03:25 -0400 (0:00:00.139) 0:00:17.694 ***** 44071 1727204605.37787: entering _queue_task() for managed-node2/include_tasks 44071 1727204605.38378: worker is 1 (out of 1 available) 44071 1727204605.38391: exiting _queue_task() for managed-node2/include_tasks 44071 1727204605.38402: done queuing things up, now waiting for results queue to drain 44071 1727204605.38404: waiting for pending results... 44071 1727204605.38535: running TaskExecutor() for managed-node2/TASK: Conditional asserts 44071 1727204605.38740: in run() - task 127b8e07-fff9-c964-7471-000000000097 44071 1727204605.38745: variable 'ansible_search_path' from source: unknown 44071 1727204605.38748: variable 'ansible_search_path' from source: unknown 44071 1727204605.39024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204605.41716: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204605.41809: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204605.41874: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204605.41972: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204605.41975: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204605.42057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204605.42101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204605.42135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204605.42186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204605.42210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204605.42341: variable 'lsr_assert_when' from source: include params 44071 1727204605.42481: variable 'network_provider' from source: set_fact 44071 1727204605.42566: variable 'omit' from source: magic vars 44071 1727204605.42742: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204605.42745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204605.42748: variable 'omit' from source: magic vars 44071 1727204605.42982: variable 'ansible_distribution_major_version' from source: facts 44071 1727204605.42998: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204605.43130: variable 'item' from source: unknown 44071 1727204605.43143: Evaluated conditional (item['condition']): True 44071 1727204605.43237: variable 'item' from source: unknown 44071 1727204605.43285: variable 'item' from source: unknown 44071 1727204605.43391: variable 'item' from source: unknown 44071 1727204605.43712: dumping result to json 44071 1727204605.43716: done dumping result, returning 44071 1727204605.43718: done running TaskExecutor() for managed-node2/TASK: Conditional asserts [127b8e07-fff9-c964-7471-000000000097] 44071 1727204605.43721: sending task result for task 127b8e07-fff9-c964-7471-000000000097 44071 1727204605.43796: no more pending results, returning what we have 44071 1727204605.43802: in VariableManager get_vars() 44071 1727204605.43840: Calling all_inventory to load vars for managed-node2 44071 1727204605.43843: Calling groups_inventory to load vars for managed-node2 44071 1727204605.43847: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204605.43861: Calling all_plugins_play to load vars for managed-node2 44071 1727204605.43864: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204605.43869: Calling groups_plugins_play to load vars for managed-node2 44071 1727204605.44514: done sending task result for task 127b8e07-fff9-c964-7471-000000000097 44071 1727204605.44519: WORKER PROCESS EXITING 44071 1727204605.45975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204605.48594: done with get_vars() 44071 1727204605.48629: variable 'ansible_search_path' from source: unknown 44071 1727204605.48630: variable 'ansible_search_path' from source: unknown 44071 1727204605.48680: we have included files to process 44071 1727204605.48681: generating all_blocks data 44071 1727204605.48683: done generating all_blocks data 44071 1727204605.48688: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 44071 1727204605.48689: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 44071 1727204605.48692: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 44071 1727204605.48950: in VariableManager get_vars() 44071 1727204605.49011: done with get_vars() 44071 1727204605.49172: done processing included file 
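
The 'Conditional asserts' task at run_test.yml:42 is an include driven by include params: the log shows lsr_assert_when being read, item['condition'] evaluating to True, and assert_device_present.yml being included for item={'what': 'tasks/assert_device_present.yml', 'condition': True}. A minimal sketch consistent with that trace (the loop keyword and the shape of lsr_assert_when are assumptions) is:

# Sketch of the conditional include pattern; lsr_assert_when comes from include params
# and each item carries a task file name plus a guard condition.
- name: Conditional asserts
  include_tasks: "{{ item.what }}"
  loop: "{{ lsr_assert_when }}"
  when: item.condition
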
44071 1727204605.49175: iterating over new_blocks loaded from include file 44071 1727204605.49177: in VariableManager get_vars() 44071 1727204605.49193: done with get_vars() 44071 1727204605.49195: filtering new block on tags 44071 1727204605.49237: done filtering new block on tags 44071 1727204605.49240: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node2 => (item={'what': 'tasks/assert_device_present.yml', 'condition': True}) 44071 1727204605.49246: extending task lists for all hosts with included blocks 44071 1727204605.50604: done extending task lists 44071 1727204605.50607: done processing included files 44071 1727204605.50607: results queue empty 44071 1727204605.50608: checking for any_errors_fatal 44071 1727204605.50613: done checking for any_errors_fatal 44071 1727204605.50614: checking for max_fail_percentage 44071 1727204605.50615: done checking for max_fail_percentage 44071 1727204605.50616: checking to see if all hosts have failed and the running result is not ok 44071 1727204605.50616: done checking to see if all hosts have failed 44071 1727204605.50617: getting the remaining hosts for this loop 44071 1727204605.50619: done getting the remaining hosts for this loop 44071 1727204605.50622: getting the next task for host managed-node2 44071 1727204605.50627: done getting next task for host managed-node2 44071 1727204605.50629: ^ task is: TASK: Include the task 'get_interface_stat.yml' 44071 1727204605.50632: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204605.50641: getting variables 44071 1727204605.50642: in VariableManager get_vars() 44071 1727204605.50656: Calling all_inventory to load vars for managed-node2 44071 1727204605.50659: Calling groups_inventory to load vars for managed-node2 44071 1727204605.50662: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204605.50671: Calling all_plugins_play to load vars for managed-node2 44071 1727204605.50674: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204605.50677: Calling groups_plugins_play to load vars for managed-node2 44071 1727204605.53284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204605.55670: done with get_vars() 44071 1727204605.55701: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 15:03:25 -0400 (0:00:00.180) 0:00:17.874 ***** 44071 1727204605.55793: entering _queue_task() for managed-node2/include_tasks 44071 1727204605.56218: worker is 1 (out of 1 available) 44071 1727204605.56231: exiting _queue_task() for managed-node2/include_tasks 44071 1727204605.56247: done queuing things up, now waiting for results queue to drain 44071 1727204605.56249: waiting for pending results... 44071 1727204605.56702: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 44071 1727204605.56851: in run() - task 127b8e07-fff9-c964-7471-000000000452 44071 1727204605.56876: variable 'ansible_search_path' from source: unknown 44071 1727204605.56883: variable 'ansible_search_path' from source: unknown 44071 1727204605.56928: calling self._execute() 44071 1727204605.57055: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204605.57071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204605.57088: variable 'omit' from source: magic vars 44071 1727204605.57513: variable 'ansible_distribution_major_version' from source: facts 44071 1727204605.57534: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204605.57550: _execute() done 44071 1727204605.57775: dumping result to json 44071 1727204605.57779: done dumping result, returning 44071 1727204605.57782: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [127b8e07-fff9-c964-7471-000000000452] 44071 1727204605.57785: sending task result for task 127b8e07-fff9-c964-7471-000000000452 44071 1727204605.57864: done sending task result for task 127b8e07-fff9-c964-7471-000000000452 44071 1727204605.57870: WORKER PROCESS EXITING 44071 1727204605.57902: no more pending results, returning what we have 44071 1727204605.57908: in VariableManager get_vars() 44071 1727204605.57948: Calling all_inventory to load vars for managed-node2 44071 1727204605.57951: Calling groups_inventory to load vars for managed-node2 44071 1727204605.57956: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204605.57973: Calling all_plugins_play to load vars for managed-node2 44071 1727204605.57976: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204605.57979: Calling groups_plugins_play to load vars for managed-node2 44071 1727204605.59890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 44071 1727204605.62046: done with get_vars() 44071 1727204605.62082: variable 'ansible_search_path' from source: unknown 44071 1727204605.62083: variable 'ansible_search_path' from source: unknown 44071 1727204605.62238: variable 'item' from source: include params 44071 1727204605.62281: we have included files to process 44071 1727204605.62283: generating all_blocks data 44071 1727204605.62284: done generating all_blocks data 44071 1727204605.62286: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44071 1727204605.62287: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44071 1727204605.62290: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44071 1727204605.62491: done processing included file 44071 1727204605.62493: iterating over new_blocks loaded from include file 44071 1727204605.62495: in VariableManager get_vars() 44071 1727204605.62512: done with get_vars() 44071 1727204605.62514: filtering new block on tags 44071 1727204605.62542: done filtering new block on tags 44071 1727204605.62545: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 44071 1727204605.62550: extending task lists for all hosts with included blocks 44071 1727204605.62705: done extending task lists 44071 1727204605.62706: done processing included files 44071 1727204605.62707: results queue empty 44071 1727204605.62707: checking for any_errors_fatal 44071 1727204605.62711: done checking for any_errors_fatal 44071 1727204605.62712: checking for max_fail_percentage 44071 1727204605.62713: done checking for max_fail_percentage 44071 1727204605.62714: checking to see if all hosts have failed and the running result is not ok 44071 1727204605.62715: done checking to see if all hosts have failed 44071 1727204605.62715: getting the remaining hosts for this loop 44071 1727204605.62716: done getting the remaining hosts for this loop 44071 1727204605.62719: getting the next task for host managed-node2 44071 1727204605.62723: done getting next task for host managed-node2 44071 1727204605.62725: ^ task is: TASK: Get stat for interface {{ interface }} 44071 1727204605.62728: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204605.62730: getting variables 44071 1727204605.62731: in VariableManager get_vars() 44071 1727204605.62741: Calling all_inventory to load vars for managed-node2 44071 1727204605.62743: Calling groups_inventory to load vars for managed-node2 44071 1727204605.62746: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204605.62752: Calling all_plugins_play to load vars for managed-node2 44071 1727204605.62754: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204605.62756: Calling groups_plugins_play to load vars for managed-node2 44071 1727204605.65968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204605.70796: done with get_vars() 44071 1727204605.70841: done getting variables 44071 1727204605.70982: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:03:25 -0400 (0:00:00.152) 0:00:18.026 ***** 44071 1727204605.71016: entering _queue_task() for managed-node2/stat 44071 1727204605.71432: worker is 1 (out of 1 available) 44071 1727204605.71445: exiting _queue_task() for managed-node2/stat 44071 1727204605.71460: done queuing things up, now waiting for results queue to drain 44071 1727204605.71461: waiting for pending results... 44071 1727204605.71807: running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr 44071 1727204605.71955: in run() - task 127b8e07-fff9-c964-7471-0000000004e8 44071 1727204605.72072: variable 'ansible_search_path' from source: unknown 44071 1727204605.72075: variable 'ansible_search_path' from source: unknown 44071 1727204605.72078: calling self._execute() 44071 1727204605.72151: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204605.72164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204605.72181: variable 'omit' from source: magic vars 44071 1727204605.72613: variable 'ansible_distribution_major_version' from source: facts 44071 1727204605.72638: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204605.72650: variable 'omit' from source: magic vars 44071 1727204605.72716: variable 'omit' from source: magic vars 44071 1727204605.72834: variable 'interface' from source: play vars 44071 1727204605.72863: variable 'omit' from source: magic vars 44071 1727204605.72918: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204605.72967: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204605.72996: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204605.73068: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204605.73072: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204605.73076: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204605.73084: variable 'ansible_host' from 
source: host vars for 'managed-node2' 44071 1727204605.73092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204605.73206: Set connection var ansible_connection to ssh 44071 1727204605.73219: Set connection var ansible_timeout to 10 44071 1727204605.73230: Set connection var ansible_pipelining to False 44071 1727204605.73239: Set connection var ansible_shell_type to sh 44071 1727204605.73249: Set connection var ansible_shell_executable to /bin/sh 44071 1727204605.73261: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204605.73295: variable 'ansible_shell_executable' from source: unknown 44071 1727204605.73372: variable 'ansible_connection' from source: unknown 44071 1727204605.73375: variable 'ansible_module_compression' from source: unknown 44071 1727204605.73377: variable 'ansible_shell_type' from source: unknown 44071 1727204605.73380: variable 'ansible_shell_executable' from source: unknown 44071 1727204605.73383: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204605.73386: variable 'ansible_pipelining' from source: unknown 44071 1727204605.73389: variable 'ansible_timeout' from source: unknown 44071 1727204605.73391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204605.73616: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204605.73621: variable 'omit' from source: magic vars 44071 1727204605.73624: starting attempt loop 44071 1727204605.73627: running the handler 44071 1727204605.73637: _low_level_execute_command(): starting 44071 1727204605.73649: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204605.74426: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204605.74443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204605.74459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204605.74487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204605.74525: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204605.74542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204605.74582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204605.74644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204605.74675: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204605.74689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204605.74947: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204605.76649: stdout chunk (state=3): >>>/root <<< 44071 1727204605.76909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204605.77032: stderr chunk (state=3): >>><<< 44071 1727204605.77043: stdout chunk (state=3): >>><<< 44071 1727204605.77080: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204605.77126: _low_level_execute_command(): starting 44071 1727204605.77272: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204605.7711132-44858-120835558959361 `" && echo ansible-tmp-1727204605.7711132-44858-120835558959361="` echo /root/.ansible/tmp/ansible-tmp-1727204605.7711132-44858-120835558959361 `" ) && sleep 0' 44071 1727204605.78462: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204605.78795: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204605.78811: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204605.78973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204605.80919: stdout chunk (state=3): 
>>>ansible-tmp-1727204605.7711132-44858-120835558959361=/root/.ansible/tmp/ansible-tmp-1727204605.7711132-44858-120835558959361 <<< 44071 1727204605.81290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204605.81294: stdout chunk (state=3): >>><<< 44071 1727204605.81300: stderr chunk (state=3): >>><<< 44071 1727204605.81325: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204605.7711132-44858-120835558959361=/root/.ansible/tmp/ansible-tmp-1727204605.7711132-44858-120835558959361 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204605.81392: variable 'ansible_module_compression' from source: unknown 44071 1727204605.81471: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 44071 1727204605.81696: variable 'ansible_facts' from source: unknown 44071 1727204605.81769: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204605.7711132-44858-120835558959361/AnsiballZ_stat.py 44071 1727204605.82189: Sending initial data 44071 1727204605.82193: Sent initial data (153 bytes) 44071 1727204605.83492: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204605.83578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204605.83640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 
1727204605.85261: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 44071 1727204605.85268: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204605.85430: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204605.85434: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp0kr6y9rl /root/.ansible/tmp/ansible-tmp-1727204605.7711132-44858-120835558959361/AnsiballZ_stat.py <<< 44071 1727204605.85454: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204605.7711132-44858-120835558959361/AnsiballZ_stat.py" <<< 44071 1727204605.85551: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp0kr6y9rl" to remote "/root/.ansible/tmp/ansible-tmp-1727204605.7711132-44858-120835558959361/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204605.7711132-44858-120835558959361/AnsiballZ_stat.py" <<< 44071 1727204605.87110: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204605.87114: stderr chunk (state=3): >>><<< 44071 1727204605.87117: stdout chunk (state=3): >>><<< 44071 1727204605.87194: done transferring module to remote 44071 1727204605.87219: _low_level_execute_command(): starting 44071 1727204605.87223: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204605.7711132-44858-120835558959361/ /root/.ansible/tmp/ansible-tmp-1727204605.7711132-44858-120835558959361/AnsiballZ_stat.py && sleep 0' 44071 1727204605.88772: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204605.88778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204605.88796: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204605.88800: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204605.88858: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 
1727204605.88928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204605.88932: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204605.89081: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204605.89174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204605.91106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204605.91110: stderr chunk (state=3): >>><<< 44071 1727204605.91113: stdout chunk (state=3): >>><<< 44071 1727204605.91172: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204605.91181: _low_level_execute_command(): starting 44071 1727204605.91184: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204605.7711132-44858-120835558959361/AnsiballZ_stat.py && sleep 0' 44071 1727204605.92510: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204605.92516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204605.92519: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204605.92521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204605.92772: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204605.92792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 
1727204605.92902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204606.09691: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 37343, "dev": 23, "nlink": 1, "atime": 1727204602.3286533, "mtime": 1727204602.3286533, "ctime": 1727204602.3286533, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 44071 1727204606.11292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204606.11300: stderr chunk (state=3): >>><<< 44071 1727204606.11311: stdout chunk (state=3): >>><<< 44071 1727204606.11333: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 37343, "dev": 23, "nlink": 1, "atime": 1727204602.3286533, "mtime": 1727204602.3286533, "ctime": 1727204602.3286533, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 Shared connection to 10.31.47.73 closed. 44071 1727204606.11397: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204605.7711132-44858-120835558959361/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204606.11575: _low_level_execute_command(): starting 44071 1727204606.11579: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204605.7711132-44858-120835558959361/ > /dev/null 2>&1 && sleep 0' 44071 1727204606.12694: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204606.12894: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204606.13011: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204606.13077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204606.15207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204606.15231: stderr chunk (state=3): >>><<< 44071 1727204606.15235: stdout chunk (state=3): >>><<< 44071 1727204606.15258: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 
10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204606.15268: handler run complete 44071 1727204606.15509: attempt loop complete, returning result 44071 1727204606.15514: _execute() done 44071 1727204606.15524: dumping result to json 44071 1727204606.15532: done dumping result, returning 44071 1727204606.15543: done running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr [127b8e07-fff9-c964-7471-0000000004e8] 44071 1727204606.15546: sending task result for task 127b8e07-fff9-c964-7471-0000000004e8 44071 1727204606.15687: done sending task result for task 127b8e07-fff9-c964-7471-0000000004e8 44071 1727204606.15690: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204602.3286533, "block_size": 4096, "blocks": 0, "ctime": 1727204602.3286533, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 37343, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "mode": "0777", "mtime": 1727204602.3286533, "nlink": 1, "path": "/sys/class/net/statebr", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 44071 1727204606.15886: no more pending results, returning what we have 44071 1727204606.15890: results queue empty 44071 1727204606.15891: checking for any_errors_fatal 44071 1727204606.15893: done checking for any_errors_fatal 44071 1727204606.15894: checking for max_fail_percentage 44071 1727204606.15895: done checking for max_fail_percentage 44071 1727204606.15896: checking to see if all hosts have failed and the running result is not ok 44071 1727204606.15897: done checking to see if all hosts have failed 44071 1727204606.15897: getting the remaining hosts for this loop 44071 1727204606.15899: done getting the remaining hosts for this loop 44071 1727204606.15906: getting the next task for host managed-node2 44071 1727204606.15917: done getting next task for host managed-node2 44071 1727204606.15920: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 44071 1727204606.15924: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 44071 1727204606.15931: getting variables 44071 1727204606.15933: in VariableManager get_vars() 44071 1727204606.16372: Calling all_inventory to load vars for managed-node2 44071 1727204606.16377: Calling groups_inventory to load vars for managed-node2 44071 1727204606.16381: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204606.16394: Calling all_plugins_play to load vars for managed-node2 44071 1727204606.16397: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204606.16401: Calling groups_plugins_play to load vars for managed-node2 44071 1727204606.20187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204606.25325: done with get_vars() 44071 1727204606.25368: done getting variables 44071 1727204606.25438: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204606.25894: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'statebr'] ************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 15:03:26 -0400 (0:00:00.549) 0:00:18.575 ***** 44071 1727204606.25932: entering _queue_task() for managed-node2/assert 44071 1727204606.26657: worker is 1 (out of 1 available) 44071 1727204606.26879: exiting _queue_task() for managed-node2/assert 44071 1727204606.26895: done queuing things up, now waiting for results queue to drain 44071 1727204606.26896: waiting for pending results... 
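The stat call that just finished is the usual pattern for checking that a kernel network device exists: it stats /sys/class/net/statebr with attribute, checksum and MIME collection turned off, and the later lookup of 'interface_stat' from set_fact indicates the result was registered. The actual contents of assert_device_present.yml are not reproduced in this log, so the following is only a sketch reconstructed from the logged module_args:

    # Sketch reconstructed from the module_args shown above; the real
    # assert_device_present.yml may differ in wording and layout.
    - name: Get stat for interface {{ interface }}
      stat:
        get_attributes: false
        get_checksum: false
        get_mime: false
        path: "/sys/class/net/{{ interface }}"   # here interface == 'statebr'
      register: interface_stat
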
44071 1727204606.27668: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'statebr' 44071 1727204606.27675: in run() - task 127b8e07-fff9-c964-7471-000000000453 44071 1727204606.27679: variable 'ansible_search_path' from source: unknown 44071 1727204606.27682: variable 'ansible_search_path' from source: unknown 44071 1727204606.27742: calling self._execute() 44071 1727204606.28473: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204606.28478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204606.28481: variable 'omit' from source: magic vars 44071 1727204606.29526: variable 'ansible_distribution_major_version' from source: facts 44071 1727204606.29551: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204606.29562: variable 'omit' from source: magic vars 44071 1727204606.29615: variable 'omit' from source: magic vars 44071 1727204606.30174: variable 'interface' from source: play vars 44071 1727204606.30179: variable 'omit' from source: magic vars 44071 1727204606.30182: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204606.30184: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204606.30187: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204606.30190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204606.30573: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204606.30578: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204606.30581: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204606.30583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204606.30585: Set connection var ansible_connection to ssh 44071 1727204606.30588: Set connection var ansible_timeout to 10 44071 1727204606.30590: Set connection var ansible_pipelining to False 44071 1727204606.30592: Set connection var ansible_shell_type to sh 44071 1727204606.30780: Set connection var ansible_shell_executable to /bin/sh 44071 1727204606.30796: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204606.30830: variable 'ansible_shell_executable' from source: unknown 44071 1727204606.30841: variable 'ansible_connection' from source: unknown 44071 1727204606.30850: variable 'ansible_module_compression' from source: unknown 44071 1727204606.30858: variable 'ansible_shell_type' from source: unknown 44071 1727204606.30866: variable 'ansible_shell_executable' from source: unknown 44071 1727204606.30875: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204606.30883: variable 'ansible_pipelining' from source: unknown 44071 1727204606.30890: variable 'ansible_timeout' from source: unknown 44071 1727204606.30898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204606.31273: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 44071 1727204606.31297: variable 'omit' from source: magic vars 44071 1727204606.31309: starting attempt loop 44071 1727204606.31316: running the handler 44071 1727204606.31700: variable 'interface_stat' from source: set_fact 44071 1727204606.31733: Evaluated conditional (interface_stat.stat.exists): True 44071 1727204606.31750: handler run complete 44071 1727204606.32071: attempt loop complete, returning result 44071 1727204606.32075: _execute() done 44071 1727204606.32077: dumping result to json 44071 1727204606.32080: done dumping result, returning 44071 1727204606.32082: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'statebr' [127b8e07-fff9-c964-7471-000000000453] 44071 1727204606.32085: sending task result for task 127b8e07-fff9-c964-7471-000000000453 44071 1727204606.32174: done sending task result for task 127b8e07-fff9-c964-7471-000000000453 44071 1727204606.32178: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 44071 1727204606.32232: no more pending results, returning what we have 44071 1727204606.32236: results queue empty 44071 1727204606.32237: checking for any_errors_fatal 44071 1727204606.32249: done checking for any_errors_fatal 44071 1727204606.32250: checking for max_fail_percentage 44071 1727204606.32251: done checking for max_fail_percentage 44071 1727204606.32252: checking to see if all hosts have failed and the running result is not ok 44071 1727204606.32252: done checking to see if all hosts have failed 44071 1727204606.32253: getting the remaining hosts for this loop 44071 1727204606.32255: done getting the remaining hosts for this loop 44071 1727204606.32260: getting the next task for host managed-node2 44071 1727204606.32271: done getting next task for host managed-node2 44071 1727204606.32273: ^ task is: TASK: Success in test '{{ lsr_description }}' 44071 1727204606.32277: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204606.32281: getting variables 44071 1727204606.32283: in VariableManager get_vars() 44071 1727204606.32317: Calling all_inventory to load vars for managed-node2 44071 1727204606.32319: Calling groups_inventory to load vars for managed-node2 44071 1727204606.32323: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204606.32336: Calling all_plugins_play to load vars for managed-node2 44071 1727204606.32339: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204606.32342: Calling groups_plugins_play to load vars for managed-node2 44071 1727204606.35937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204606.41564: done with get_vars() 44071 1727204606.41811: done getting variables 44071 1727204606.41884: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204606.42214: variable 'lsr_description' from source: include params TASK [Success in test 'I can create a profile'] ******************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Tuesday 24 September 2024 15:03:26 -0400 (0:00:00.163) 0:00:18.738 ***** 44071 1727204606.42248: entering _queue_task() for managed-node2/debug 44071 1727204606.43051: worker is 1 (out of 1 available) 44071 1727204606.43070: exiting _queue_task() for managed-node2/debug 44071 1727204606.43085: done queuing things up, now waiting for results queue to drain 44071 1727204606.43087: waiting for pending results... 
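The assert that passed above and the "Success in test" debug task now being queued are both driven by values visible in the log: the conditional 'interface_stat.stat.exists' evaluated to True, and the debug message interpolates lsr_description ('I can create a profile'). A minimal sketch of what those two tasks likely look like, assuming the registered variable name and the message format shown in the output:

    # Reconstruction from the logged conditional and message; the file
    # contents of assert_device_present.yml / run_test.yml are not in this log.
    - name: Assert that the interface is present - '{{ interface }}'
      assert:
        that:
          - interface_stat.stat.exists

    - name: Success in test '{{ lsr_description }}'
      debug:
        msg: "+++++ Success in test '{{ lsr_description }}' +++++"
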
44071 1727204606.44026: running TaskExecutor() for managed-node2/TASK: Success in test 'I can create a profile' 44071 1727204606.44364: in run() - task 127b8e07-fff9-c964-7471-000000000098 44071 1727204606.44574: variable 'ansible_search_path' from source: unknown 44071 1727204606.44578: variable 'ansible_search_path' from source: unknown 44071 1727204606.44581: calling self._execute() 44071 1727204606.44873: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204606.44877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204606.44881: variable 'omit' from source: magic vars 44071 1727204606.45156: variable 'ansible_distribution_major_version' from source: facts 44071 1727204606.45171: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204606.45179: variable 'omit' from source: magic vars 44071 1727204606.45231: variable 'omit' from source: magic vars 44071 1727204606.45347: variable 'lsr_description' from source: include params 44071 1727204606.45369: variable 'omit' from source: magic vars 44071 1727204606.45413: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204606.45458: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204606.45482: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204606.45501: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204606.45515: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204606.45552: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204606.45556: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204606.45560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204606.45675: Set connection var ansible_connection to ssh 44071 1727204606.45683: Set connection var ansible_timeout to 10 44071 1727204606.45689: Set connection var ansible_pipelining to False 44071 1727204606.45695: Set connection var ansible_shell_type to sh 44071 1727204606.45702: Set connection var ansible_shell_executable to /bin/sh 44071 1727204606.45709: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204606.45735: variable 'ansible_shell_executable' from source: unknown 44071 1727204606.45739: variable 'ansible_connection' from source: unknown 44071 1727204606.45744: variable 'ansible_module_compression' from source: unknown 44071 1727204606.45747: variable 'ansible_shell_type' from source: unknown 44071 1727204606.45750: variable 'ansible_shell_executable' from source: unknown 44071 1727204606.45754: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204606.45756: variable 'ansible_pipelining' from source: unknown 44071 1727204606.45761: variable 'ansible_timeout' from source: unknown 44071 1727204606.45764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204606.45984: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 44071 1727204606.45989: variable 'omit' from source: magic vars 44071 1727204606.45991: starting attempt loop 44071 1727204606.45994: running the handler 44071 1727204606.45997: handler run complete 44071 1727204606.46009: attempt loop complete, returning result 44071 1727204606.46012: _execute() done 44071 1727204606.46014: dumping result to json 44071 1727204606.46019: done dumping result, returning 44071 1727204606.46028: done running TaskExecutor() for managed-node2/TASK: Success in test 'I can create a profile' [127b8e07-fff9-c964-7471-000000000098] 44071 1727204606.46033: sending task result for task 127b8e07-fff9-c964-7471-000000000098 44071 1727204606.46429: done sending task result for task 127b8e07-fff9-c964-7471-000000000098 44071 1727204606.46433: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: +++++ Success in test 'I can create a profile' +++++ 44071 1727204606.46481: no more pending results, returning what we have 44071 1727204606.46484: results queue empty 44071 1727204606.46485: checking for any_errors_fatal 44071 1727204606.46492: done checking for any_errors_fatal 44071 1727204606.46493: checking for max_fail_percentage 44071 1727204606.46495: done checking for max_fail_percentage 44071 1727204606.46496: checking to see if all hosts have failed and the running result is not ok 44071 1727204606.46496: done checking to see if all hosts have failed 44071 1727204606.46497: getting the remaining hosts for this loop 44071 1727204606.46498: done getting the remaining hosts for this loop 44071 1727204606.46502: getting the next task for host managed-node2 44071 1727204606.46510: done getting next task for host managed-node2 44071 1727204606.46513: ^ task is: TASK: Cleanup 44071 1727204606.46516: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204606.46521: getting variables 44071 1727204606.46522: in VariableManager get_vars() 44071 1727204606.46552: Calling all_inventory to load vars for managed-node2 44071 1727204606.46555: Calling groups_inventory to load vars for managed-node2 44071 1727204606.46558: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204606.46572: Calling all_plugins_play to load vars for managed-node2 44071 1727204606.46576: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204606.46579: Calling groups_plugins_play to load vars for managed-node2 44071 1727204606.50500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204606.54923: done with get_vars() 44071 1727204606.54969: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Tuesday 24 September 2024 15:03:26 -0400 (0:00:00.130) 0:00:18.869 ***** 44071 1727204606.55283: entering _queue_task() for managed-node2/include_tasks 44071 1727204606.56114: worker is 1 (out of 1 available) 44071 1727204606.56129: exiting _queue_task() for managed-node2/include_tasks 44071 1727204606.56142: done queuing things up, now waiting for results queue to drain 44071 1727204606.56144: waiting for pending results... 44071 1727204606.56643: running TaskExecutor() for managed-node2/TASK: Cleanup 44071 1727204606.56795: in run() - task 127b8e07-fff9-c964-7471-00000000009c 44071 1727204606.56976: variable 'ansible_search_path' from source: unknown 44071 1727204606.56985: variable 'ansible_search_path' from source: unknown 44071 1727204606.57087: variable 'lsr_cleanup' from source: include params 44071 1727204606.57547: variable 'lsr_cleanup' from source: include params 44071 1727204606.57857: variable 'omit' from source: magic vars 44071 1727204606.58060: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204606.58199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204606.58216: variable 'omit' from source: magic vars 44071 1727204606.59058: variable 'ansible_distribution_major_version' from source: facts 44071 1727204606.59062: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204606.59068: variable 'item' from source: unknown 44071 1727204606.59385: variable 'item' from source: unknown 44071 1727204606.59389: variable 'item' from source: unknown 44071 1727204606.59530: variable 'item' from source: unknown 44071 1727204606.60193: dumping result to json 44071 1727204606.60197: done dumping result, returning 44071 1727204606.60200: done running TaskExecutor() for managed-node2/TASK: Cleanup [127b8e07-fff9-c964-7471-00000000009c] 44071 1727204606.60203: sending task result for task 127b8e07-fff9-c964-7471-00000000009c 44071 1727204606.60267: done sending task result for task 127b8e07-fff9-c964-7471-00000000009c 44071 1727204606.60272: WORKER PROCESS EXITING 44071 1727204606.60302: no more pending results, returning what we have 44071 1727204606.60308: in VariableManager get_vars() 44071 1727204606.60348: Calling all_inventory to load vars for managed-node2 44071 1727204606.60351: Calling groups_inventory to load vars for managed-node2 44071 1727204606.60355: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204606.60474: Calling all_plugins_play to load vars for 
managed-node2 44071 1727204606.60479: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204606.60484: Calling groups_plugins_play to load vars for managed-node2 44071 1727204606.65282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204606.70197: done with get_vars() 44071 1727204606.70232: variable 'ansible_search_path' from source: unknown 44071 1727204606.70233: variable 'ansible_search_path' from source: unknown 44071 1727204606.70300: we have included files to process 44071 1727204606.70302: generating all_blocks data 44071 1727204606.70304: done generating all_blocks data 44071 1727204606.70310: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 44071 1727204606.70312: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 44071 1727204606.70314: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 44071 1727204606.70633: done processing included file 44071 1727204606.70635: iterating over new_blocks loaded from include file 44071 1727204606.70637: in VariableManager get_vars() 44071 1727204606.70656: done with get_vars() 44071 1727204606.70658: filtering new block on tags 44071 1727204606.70691: done filtering new block on tags 44071 1727204606.70694: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed-node2 => (item=tasks/cleanup_profile+device.yml) 44071 1727204606.70700: extending task lists for all hosts with included blocks 44071 1727204606.72760: done extending task lists 44071 1727204606.72762: done processing included files 44071 1727204606.72763: results queue empty 44071 1727204606.72764: checking for any_errors_fatal 44071 1727204606.72770: done checking for any_errors_fatal 44071 1727204606.72771: checking for max_fail_percentage 44071 1727204606.72773: done checking for max_fail_percentage 44071 1727204606.72773: checking to see if all hosts have failed and the running result is not ok 44071 1727204606.72774: done checking to see if all hosts have failed 44071 1727204606.72775: getting the remaining hosts for this loop 44071 1727204606.72776: done getting the remaining hosts for this loop 44071 1727204606.72779: getting the next task for host managed-node2 44071 1727204606.72785: done getting next task for host managed-node2 44071 1727204606.72787: ^ task is: TASK: Cleanup profile and device 44071 1727204606.72791: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204606.72799: getting variables 44071 1727204606.72800: in VariableManager get_vars() 44071 1727204606.72817: Calling all_inventory to load vars for managed-node2 44071 1727204606.72821: Calling groups_inventory to load vars for managed-node2 44071 1727204606.72824: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204606.72831: Calling all_plugins_play to load vars for managed-node2 44071 1727204606.72834: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204606.72836: Calling groups_plugins_play to load vars for managed-node2 44071 1727204606.74706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204606.77353: done with get_vars() 44071 1727204606.77429: done getting variables 44071 1727204606.77486: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Tuesday 24 September 2024 15:03:26 -0400 (0:00:00.223) 0:00:19.092 ***** 44071 1727204606.77633: entering _queue_task() for managed-node2/shell 44071 1727204606.78367: worker is 1 (out of 1 available) 44071 1727204606.78495: exiting _queue_task() for managed-node2/shell 44071 1727204606.78508: done queuing things up, now waiting for results queue to drain 44071 1727204606.78509: waiting for pending results... 
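The Cleanup step just processed is an include_tasks loop: the log shows 'lsr_cleanup' coming from include params and the single loop item tasks/cleanup_profile+device.yml being loaded for managed-node2. A hypothetical shape for that step in run_test.yml, inferred only from those log entries:

    # Assumed structure; inferred from the loop item and included file logged above.
    - name: Cleanup
      include_tasks: "{{ item }}"
      loop: "{{ lsr_cleanup }}"   # here: ['tasks/cleanup_profile+device.yml']
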
44071 1727204606.78833: running TaskExecutor() for managed-node2/TASK: Cleanup profile and device 44071 1727204606.78905: in run() - task 127b8e07-fff9-c964-7471-00000000050b 44071 1727204606.78953: variable 'ansible_search_path' from source: unknown 44071 1727204606.78956: variable 'ansible_search_path' from source: unknown 44071 1727204606.78989: calling self._execute() 44071 1727204606.79146: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204606.79151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204606.79154: variable 'omit' from source: magic vars 44071 1727204606.79569: variable 'ansible_distribution_major_version' from source: facts 44071 1727204606.79593: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204606.79604: variable 'omit' from source: magic vars 44071 1727204606.79660: variable 'omit' from source: magic vars 44071 1727204606.79850: variable 'interface' from source: play vars 44071 1727204606.79880: variable 'omit' from source: magic vars 44071 1727204606.79940: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204606.79986: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204606.80103: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204606.80106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204606.80109: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204606.80115: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204606.80127: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204606.80134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204606.80254: Set connection var ansible_connection to ssh 44071 1727204606.80266: Set connection var ansible_timeout to 10 44071 1727204606.80277: Set connection var ansible_pipelining to False 44071 1727204606.80288: Set connection var ansible_shell_type to sh 44071 1727204606.80299: Set connection var ansible_shell_executable to /bin/sh 44071 1727204606.80331: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204606.80356: variable 'ansible_shell_executable' from source: unknown 44071 1727204606.80368: variable 'ansible_connection' from source: unknown 44071 1727204606.80396: variable 'ansible_module_compression' from source: unknown 44071 1727204606.80399: variable 'ansible_shell_type' from source: unknown 44071 1727204606.80402: variable 'ansible_shell_executable' from source: unknown 44071 1727204606.80404: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204606.80442: variable 'ansible_pipelining' from source: unknown 44071 1727204606.80445: variable 'ansible_timeout' from source: unknown 44071 1727204606.80452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204606.80621: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 
1727204606.80657: variable 'omit' from source: magic vars 44071 1727204606.80660: starting attempt loop 44071 1727204606.80672: running the handler 44071 1727204606.80765: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204606.80769: _low_level_execute_command(): starting 44071 1727204606.80773: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204606.81577: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204606.81671: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204606.81722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204606.81743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204606.81768: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204606.81880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204606.83716: stdout chunk (state=3): >>>/root <<< 44071 1727204606.83845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204606.83849: stdout chunk (state=3): >>><<< 44071 1727204606.83869: stderr chunk (state=3): >>><<< 44071 1727204606.84035: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 44071 1727204606.84073: _low_level_execute_command(): starting 44071 1727204606.84088: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204606.8404825-44972-117171845999748 `" && echo ansible-tmp-1727204606.8404825-44972-117171845999748="` echo /root/.ansible/tmp/ansible-tmp-1727204606.8404825-44972-117171845999748 `" ) && sleep 0' 44071 1727204606.85212: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204606.85244: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204606.85510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204606.87489: stdout chunk (state=3): >>>ansible-tmp-1727204606.8404825-44972-117171845999748=/root/.ansible/tmp/ansible-tmp-1727204606.8404825-44972-117171845999748 <<< 44071 1727204606.87690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204606.87709: stdout chunk (state=3): >>><<< 44071 1727204606.87723: stderr chunk (state=3): >>><<< 44071 1727204606.87752: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204606.8404825-44972-117171845999748=/root/.ansible/tmp/ansible-tmp-1727204606.8404825-44972-117171845999748 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 
1727204606.87804: variable 'ansible_module_compression' from source: unknown 44071 1727204606.87875: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44071 1727204606.87933: variable 'ansible_facts' from source: unknown 44071 1727204606.88017: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204606.8404825-44972-117171845999748/AnsiballZ_command.py 44071 1727204606.88276: Sending initial data 44071 1727204606.88279: Sent initial data (156 bytes) 44071 1727204606.89049: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204606.89126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204606.89156: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204606.89193: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204606.89453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204606.90994: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 44071 1727204606.91021: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204606.91112: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204606.91213: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpa6y6korb /root/.ansible/tmp/ansible-tmp-1727204606.8404825-44972-117171845999748/AnsiballZ_command.py <<< 44071 1727204606.91225: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204606.8404825-44972-117171845999748/AnsiballZ_command.py" <<< 44071 1727204606.91296: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpa6y6korb" to remote "/root/.ansible/tmp/ansible-tmp-1727204606.8404825-44972-117171845999748/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204606.8404825-44972-117171845999748/AnsiballZ_command.py" <<< 44071 1727204606.98154: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204606.98270: stderr chunk (state=3): >>><<< 44071 1727204606.98274: stdout chunk (state=3): >>><<< 44071 1727204606.98277: done transferring module to remote 44071 1727204606.98339: _low_level_execute_command(): starting 44071 1727204606.98399: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204606.8404825-44972-117171845999748/ /root/.ansible/tmp/ansible-tmp-1727204606.8404825-44972-117171845999748/AnsiballZ_command.py && sleep 0' 44071 1727204607.00136: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204607.00416: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204607.00526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204607.00610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204607.02505: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204607.02729: stderr chunk (state=3): >>><<< 44071 1727204607.02733: stdout chunk (state=3): >>><<< 44071 1727204607.02759: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204607.02943: _low_level_execute_command(): starting 44071 1727204607.02947: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204606.8404825-44972-117171845999748/AnsiballZ_command.py && sleep 0' 44071 1727204607.04732: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204607.04816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204607.04871: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204607.04905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204607.04982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204607.28235: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (f7d54dee-54f0-42d3-8296-dcee7d3104de) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-24 15:03:27.213584", "end": "2024-09-24 15:03:27.275794", "delta": "0:00:00.062210", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44071 1727204607.29886: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204607.29890: stdout chunk (state=3): >>><<< 44071 1727204607.29893: stderr chunk (state=3): >>><<< 44071 1727204607.29896: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "Connection 'statebr' (f7d54dee-54f0-42d3-8296-dcee7d3104de) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-24 15:03:27.213584", "end": "2024-09-24 15:03:27.275794", "delta": "0:00:00.062210", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.47.73 closed. 
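The module arguments captured in the result above spell out the exact shell payload behind the "Cleanup profile and device" task: delete the statebr connection, try to reload and remove its ifcfg file, then drop the link. The actual tasks/cleanup_profile+device.yml file is not part of this log, so the YAML below is only a rough reconstruction inferred from _raw_params and from the "...ignoring" marker printed a little further down; the module spelling and the ignore_errors flag are assumptions.

    # Hypothetical reconstruction of tasks/cleanup_profile+device.yml, inferred
    # solely from the _raw_params and task name in this log (not the real file).
    - name: Cleanup profile and device
      ansible.builtin.shell: |
        nmcli con delete statebr
        nmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr
        rm -f /etc/sysconfig/network-scripts/ifcfg-statebr
        ip link del statebr
      ignore_errors: true   # assumed: the log prints "...ignoring" after the FAILED result

The rc=1 here is the expected outcome on an otherwise clean host: the connection delete succeeds, but the ifcfg file and the statebr device are already absent, which is exactly what the two stderr lines report and why the command chain exits non-zero.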
44071 1727204607.29899: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204606.8404825-44972-117171845999748/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204607.29903: _low_level_execute_command(): starting 44071 1727204607.29906: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204606.8404825-44972-117171845999748/ > /dev/null 2>&1 && sleep 0' 44071 1727204607.31326: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204607.31336: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204607.31489: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204607.31538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204607.33743: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204607.33747: stderr chunk (state=3): >>><<< 44071 1727204607.33750: stdout chunk (state=3): >>><<< 44071 1727204607.33870: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204607.33880: handler run complete 44071 1727204607.33903: Evaluated conditional (False): False 44071 1727204607.33913: attempt loop complete, returning result 44071 1727204607.33916: _execute() done 44071 1727204607.33920: dumping result to json 44071 1727204607.33926: done dumping result, returning 44071 1727204607.33936: done running TaskExecutor() for managed-node2/TASK: Cleanup profile and device [127b8e07-fff9-c964-7471-00000000050b] 44071 1727204607.33943: sending task result for task 127b8e07-fff9-c964-7471-00000000050b 44071 1727204607.34364: done sending task result for task 127b8e07-fff9-c964-7471-00000000050b 44071 1727204607.34370: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.062210", "end": "2024-09-24 15:03:27.275794", "rc": 1, "start": "2024-09-24 15:03:27.213584" } STDOUT: Connection 'statebr' (f7d54dee-54f0-42d3-8296-dcee7d3104de) successfully deleted. STDERR: Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' Cannot find device "statebr" MSG: non-zero return code ...ignoring 44071 1727204607.34453: no more pending results, returning what we have 44071 1727204607.34458: results queue empty 44071 1727204607.34459: checking for any_errors_fatal 44071 1727204607.34461: done checking for any_errors_fatal 44071 1727204607.34462: checking for max_fail_percentage 44071 1727204607.34463: done checking for max_fail_percentage 44071 1727204607.34464: checking to see if all hosts have failed and the running result is not ok 44071 1727204607.34465: done checking to see if all hosts have failed 44071 1727204607.34468: getting the remaining hosts for this loop 44071 1727204607.34470: done getting the remaining hosts for this loop 44071 1727204607.34476: getting the next task for host managed-node2 44071 1727204607.34488: done getting next task for host managed-node2 44071 1727204607.34491: ^ task is: TASK: Include the task 'run_test.yml' 44071 1727204607.34493: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204607.34497: getting variables 44071 1727204607.34499: in VariableManager get_vars() 44071 1727204607.34534: Calling all_inventory to load vars for managed-node2 44071 1727204607.34537: Calling groups_inventory to load vars for managed-node2 44071 1727204607.34543: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204607.34558: Calling all_plugins_play to load vars for managed-node2 44071 1727204607.34561: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204607.34564: Calling groups_plugins_play to load vars for managed-node2 44071 1727204607.39481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204607.44923: done with get_vars() 44071 1727204607.45245: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:45 Tuesday 24 September 2024 15:03:27 -0400 (0:00:00.678) 0:00:19.771 ***** 44071 1727204607.45523: entering _queue_task() for managed-node2/include_tasks 44071 1727204607.46247: worker is 1 (out of 1 available) 44071 1727204607.46260: exiting _queue_task() for managed-node2/include_tasks 44071 1727204607.46686: done queuing things up, now waiting for results queue to drain 44071 1727204607.46689: waiting for pending results... 44071 1727204607.47477: running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' 44071 1727204607.47485: in run() - task 127b8e07-fff9-c964-7471-00000000000f 44071 1727204607.47673: variable 'ansible_search_path' from source: unknown 44071 1727204607.47679: calling self._execute() 44071 1727204607.47682: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204607.47685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204607.47688: variable 'omit' from source: magic vars 44071 1727204607.48533: variable 'ansible_distribution_major_version' from source: facts 44071 1727204607.48551: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204607.48558: _execute() done 44071 1727204607.48568: dumping result to json 44071 1727204607.48572: done dumping result, returning 44071 1727204607.48575: done running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' [127b8e07-fff9-c964-7471-00000000000f] 44071 1727204607.48581: sending task result for task 127b8e07-fff9-c964-7471-00000000000f 44071 1727204607.48899: no more pending results, returning what we have 44071 1727204607.48905: in VariableManager get_vars() 44071 1727204607.48948: Calling all_inventory to load vars for managed-node2 44071 1727204607.48951: Calling groups_inventory to load vars for managed-node2 44071 1727204607.48955: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204607.48972: Calling all_plugins_play to load vars for managed-node2 44071 1727204607.48976: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204607.48979: Calling groups_plugins_play to load vars for managed-node2 44071 1727204607.49511: done sending task result for task 127b8e07-fff9-c964-7471-00000000000f 44071 1727204607.49514: WORKER PROCESS EXITING 44071 1727204607.53229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204607.57685: done with get_vars() 44071 1727204607.57726: variable 
'ansible_search_path' from source: unknown 44071 1727204607.57748: we have included files to process 44071 1727204607.57749: generating all_blocks data 44071 1727204607.57751: done generating all_blocks data 44071 1727204607.57970: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 44071 1727204607.57972: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 44071 1727204607.57977: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 44071 1727204607.58723: in VariableManager get_vars() 44071 1727204607.58747: done with get_vars() 44071 1727204607.59000: in VariableManager get_vars() 44071 1727204607.59020: done with get_vars() 44071 1727204607.59070: in VariableManager get_vars() 44071 1727204607.59088: done with get_vars() 44071 1727204607.59130: in VariableManager get_vars() 44071 1727204607.59150: done with get_vars() 44071 1727204607.59386: in VariableManager get_vars() 44071 1727204607.59403: done with get_vars() 44071 1727204607.60410: in VariableManager get_vars() 44071 1727204607.60430: done with get_vars() 44071 1727204607.60444: done processing included file 44071 1727204607.60446: iterating over new_blocks loaded from include file 44071 1727204607.60447: in VariableManager get_vars() 44071 1727204607.60458: done with get_vars() 44071 1727204607.60459: filtering new block on tags 44071 1727204607.60787: done filtering new block on tags 44071 1727204607.60791: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed-node2 44071 1727204607.60797: extending task lists for all hosts with included blocks 44071 1727204607.60840: done extending task lists 44071 1727204607.60842: done processing included files 44071 1727204607.60843: results queue empty 44071 1727204607.60843: checking for any_errors_fatal 44071 1727204607.60850: done checking for any_errors_fatal 44071 1727204607.60851: checking for max_fail_percentage 44071 1727204607.60852: done checking for max_fail_percentage 44071 1727204607.60853: checking to see if all hosts have failed and the running result is not ok 44071 1727204607.60853: done checking to see if all hosts have failed 44071 1727204607.60854: getting the remaining hosts for this loop 44071 1727204607.60855: done getting the remaining hosts for this loop 44071 1727204607.60858: getting the next task for host managed-node2 44071 1727204607.60862: done getting next task for host managed-node2 44071 1727204607.60864: ^ task is: TASK: TEST: {{ lsr_description }} 44071 1727204607.61071: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204607.61075: getting variables 44071 1727204607.61077: in VariableManager get_vars() 44071 1727204607.61089: Calling all_inventory to load vars for managed-node2 44071 1727204607.61091: Calling groups_inventory to load vars for managed-node2 44071 1727204607.61094: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204607.61101: Calling all_plugins_play to load vars for managed-node2 44071 1727204607.61103: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204607.61106: Calling groups_plugins_play to load vars for managed-node2 44071 1727204607.64373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204607.69257: done with get_vars() 44071 1727204607.69499: done getting variables 44071 1727204607.69558: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204607.69694: variable 'lsr_description' from source: include params TASK [TEST: I can create a profile without autoconnect] ************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Tuesday 24 September 2024 15:03:27 -0400 (0:00:00.242) 0:00:20.013 ***** 44071 1727204607.69727: entering _queue_task() for managed-node2/debug 44071 1727204607.70643: worker is 1 (out of 1 available) 44071 1727204607.70660: exiting _queue_task() for managed-node2/debug 44071 1727204607.70975: done queuing things up, now waiting for results queue to drain 44071 1727204607.70977: waiting for pending results... 
44071 1727204607.71387: running TaskExecutor() for managed-node2/TASK: TEST: I can create a profile without autoconnect 44071 1727204607.71578: in run() - task 127b8e07-fff9-c964-7471-0000000005b4 44071 1727204607.71692: variable 'ansible_search_path' from source: unknown 44071 1727204607.71701: variable 'ansible_search_path' from source: unknown 44071 1727204607.71755: calling self._execute() 44071 1727204607.72157: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204607.72162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204607.72167: variable 'omit' from source: magic vars 44071 1727204607.73021: variable 'ansible_distribution_major_version' from source: facts 44071 1727204607.73454: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204607.73458: variable 'omit' from source: magic vars 44071 1727204607.73462: variable 'omit' from source: magic vars 44071 1727204607.73513: variable 'lsr_description' from source: include params 44071 1727204607.73636: variable 'omit' from source: magic vars 44071 1727204607.73702: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204607.73814: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204607.73868: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204607.73973: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204607.73991: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204607.74032: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204607.74079: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204607.74087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204607.74399: Set connection var ansible_connection to ssh 44071 1727204607.74415: Set connection var ansible_timeout to 10 44071 1727204607.74426: Set connection var ansible_pipelining to False 44071 1727204607.74436: Set connection var ansible_shell_type to sh 44071 1727204607.74449: Set connection var ansible_shell_executable to /bin/sh 44071 1727204607.74461: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204607.74519: variable 'ansible_shell_executable' from source: unknown 44071 1727204607.74578: variable 'ansible_connection' from source: unknown 44071 1727204607.74586: variable 'ansible_module_compression' from source: unknown 44071 1727204607.74595: variable 'ansible_shell_type' from source: unknown 44071 1727204607.74606: variable 'ansible_shell_executable' from source: unknown 44071 1727204607.74613: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204607.74620: variable 'ansible_pipelining' from source: unknown 44071 1727204607.74627: variable 'ansible_timeout' from source: unknown 44071 1727204607.74634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204607.75149: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 44071 1727204607.75154: variable 'omit' from source: magic vars 44071 1727204607.75156: starting attempt loop 44071 1727204607.75159: running the handler 44071 1727204607.75161: handler run complete 44071 1727204607.75163: attempt loop complete, returning result 44071 1727204607.75253: _execute() done 44071 1727204607.75258: dumping result to json 44071 1727204607.75261: done dumping result, returning 44071 1727204607.75373: done running TaskExecutor() for managed-node2/TASK: TEST: I can create a profile without autoconnect [127b8e07-fff9-c964-7471-0000000005b4] 44071 1727204607.75376: sending task result for task 127b8e07-fff9-c964-7471-0000000005b4 44071 1727204607.75457: done sending task result for task 127b8e07-fff9-c964-7471-0000000005b4 44071 1727204607.75460: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: ########## I can create a profile without autoconnect ########## 44071 1727204607.75533: no more pending results, returning what we have 44071 1727204607.75541: results queue empty 44071 1727204607.75543: checking for any_errors_fatal 44071 1727204607.75546: done checking for any_errors_fatal 44071 1727204607.75546: checking for max_fail_percentage 44071 1727204607.75548: done checking for max_fail_percentage 44071 1727204607.75549: checking to see if all hosts have failed and the running result is not ok 44071 1727204607.75550: done checking to see if all hosts have failed 44071 1727204607.75550: getting the remaining hosts for this loop 44071 1727204607.75552: done getting the remaining hosts for this loop 44071 1727204607.75558: getting the next task for host managed-node2 44071 1727204607.75570: done getting next task for host managed-node2 44071 1727204607.75574: ^ task is: TASK: Show item 44071 1727204607.75588: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204607.75594: getting variables 44071 1727204607.75595: in VariableManager get_vars() 44071 1727204607.75632: Calling all_inventory to load vars for managed-node2 44071 1727204607.75635: Calling groups_inventory to load vars for managed-node2 44071 1727204607.75641: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204607.75656: Calling all_plugins_play to load vars for managed-node2 44071 1727204607.75659: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204607.75662: Calling groups_plugins_play to load vars for managed-node2 44071 1727204607.79534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204607.84112: done with get_vars() 44071 1727204607.84159: done getting variables 44071 1727204607.84232: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Tuesday 24 September 2024 15:03:27 -0400 (0:00:00.145) 0:00:20.159 ***** 44071 1727204607.84479: entering _queue_task() for managed-node2/debug 44071 1727204607.85445: worker is 1 (out of 1 available) 44071 1727204607.85460: exiting _queue_task() for managed-node2/debug 44071 1727204607.85477: done queuing things up, now waiting for results queue to drain 44071 1727204607.85479: waiting for pending results... 44071 1727204607.86290: running TaskExecutor() for managed-node2/TASK: Show item 44071 1727204607.86329: in run() - task 127b8e07-fff9-c964-7471-0000000005b5 44071 1727204607.86355: variable 'ansible_search_path' from source: unknown 44071 1727204607.86564: variable 'ansible_search_path' from source: unknown 44071 1727204607.86575: variable 'omit' from source: magic vars 44071 1727204607.87048: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204607.87052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204607.87055: variable 'omit' from source: magic vars 44071 1727204607.87855: variable 'ansible_distribution_major_version' from source: facts 44071 1727204607.87880: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204607.87921: variable 'omit' from source: magic vars 44071 1727204607.87976: variable 'omit' from source: magic vars 44071 1727204607.88375: variable 'item' from source: unknown 44071 1727204607.88429: variable 'item' from source: unknown 44071 1727204607.88458: variable 'omit' from source: magic vars 44071 1727204607.88514: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204607.88561: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204607.88725: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204607.88754: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204607.88778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 44071 1727204607.88823: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204607.88833: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204607.88877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204607.89177: Set connection var ansible_connection to ssh 44071 1727204607.89191: Set connection var ansible_timeout to 10 44071 1727204607.89203: Set connection var ansible_pipelining to False 44071 1727204607.89213: Set connection var ansible_shell_type to sh 44071 1727204607.89225: Set connection var ansible_shell_executable to /bin/sh 44071 1727204607.89241: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204607.89467: variable 'ansible_shell_executable' from source: unknown 44071 1727204607.89472: variable 'ansible_connection' from source: unknown 44071 1727204607.89476: variable 'ansible_module_compression' from source: unknown 44071 1727204607.89478: variable 'ansible_shell_type' from source: unknown 44071 1727204607.89481: variable 'ansible_shell_executable' from source: unknown 44071 1727204607.89483: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204607.89485: variable 'ansible_pipelining' from source: unknown 44071 1727204607.89487: variable 'ansible_timeout' from source: unknown 44071 1727204607.89489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204607.89716: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204607.89871: variable 'omit' from source: magic vars 44071 1727204607.89874: starting attempt loop 44071 1727204607.89877: running the handler 44071 1727204607.89879: variable 'lsr_description' from source: include params 44071 1727204607.90090: variable 'lsr_description' from source: include params 44071 1727204607.90108: handler run complete 44071 1727204607.90334: attempt loop complete, returning result 44071 1727204607.90338: variable 'item' from source: unknown 44071 1727204607.90357: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can create a profile without autoconnect" } 44071 1727204607.90773: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204607.90777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204607.90781: variable 'omit' from source: magic vars 44071 1727204607.91272: variable 'ansible_distribution_major_version' from source: facts 44071 1727204607.91276: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204607.91279: variable 'omit' from source: magic vars 44071 1727204607.91281: variable 'omit' from source: magic vars 44071 1727204607.91471: variable 'item' from source: unknown 44071 1727204607.91549: variable 'item' from source: unknown 44071 1727204607.91575: variable 'omit' from source: magic vars 44071 1727204607.91676: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204607.91693: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204607.91705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204607.91865: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204607.91871: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204607.91874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204607.91984: Set connection var ansible_connection to ssh 44071 1727204607.91996: Set connection var ansible_timeout to 10 44071 1727204607.92006: Set connection var ansible_pipelining to False 44071 1727204607.92018: Set connection var ansible_shell_type to sh 44071 1727204607.92130: Set connection var ansible_shell_executable to /bin/sh 44071 1727204607.92134: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204607.92143: variable 'ansible_shell_executable' from source: unknown 44071 1727204607.92152: variable 'ansible_connection' from source: unknown 44071 1727204607.92160: variable 'ansible_module_compression' from source: unknown 44071 1727204607.92169: variable 'ansible_shell_type' from source: unknown 44071 1727204607.92197: variable 'ansible_shell_executable' from source: unknown 44071 1727204607.92206: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204607.92215: variable 'ansible_pipelining' from source: unknown 44071 1727204607.92424: variable 'ansible_timeout' from source: unknown 44071 1727204607.92428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204607.92535: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204607.92564: variable 'omit' from source: magic vars 44071 1727204607.92628: starting attempt loop 44071 1727204607.92631: running the handler 44071 1727204607.92633: variable 'lsr_setup' from source: include params 44071 1727204607.92848: variable 'lsr_setup' from source: include params 44071 1727204607.93180: handler run complete 44071 1727204607.93183: attempt loop complete, returning result 44071 1727204607.93186: variable 'item' from source: unknown 44071 1727204607.93188: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ "tasks/delete_interface.yml", "tasks/assert_device_absent.yml" ] } 44071 1727204607.93616: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204607.93620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204607.93623: variable 'omit' from source: magic vars 44071 1727204607.94056: variable 'ansible_distribution_major_version' from source: facts 44071 1727204607.94060: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204607.94063: variable 'omit' from source: magic vars 44071 1727204607.94109: variable 'omit' from source: magic vars 44071 1727204607.94307: variable 'item' from source: unknown 44071 1727204607.94374: variable 'item' from source: unknown 44071 1727204607.94501: variable 'omit' from source: magic vars 44071 1727204607.94531: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204607.94549: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204607.94561: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204607.94583: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204607.94773: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204607.94776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204607.94841: Set connection var ansible_connection to ssh 44071 1727204607.95023: Set connection var ansible_timeout to 10 44071 1727204607.95028: Set connection var ansible_pipelining to False 44071 1727204607.95031: Set connection var ansible_shell_type to sh 44071 1727204607.95033: Set connection var ansible_shell_executable to /bin/sh 44071 1727204607.95036: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204607.95040: variable 'ansible_shell_executable' from source: unknown 44071 1727204607.95043: variable 'ansible_connection' from source: unknown 44071 1727204607.95045: variable 'ansible_module_compression' from source: unknown 44071 1727204607.95048: variable 'ansible_shell_type' from source: unknown 44071 1727204607.95050: variable 'ansible_shell_executable' from source: unknown 44071 1727204607.95052: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204607.95055: variable 'ansible_pipelining' from source: unknown 44071 1727204607.95057: variable 'ansible_timeout' from source: unknown 44071 1727204607.95059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204607.95371: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204607.95500: variable 'omit' from source: magic vars 44071 1727204607.95504: starting attempt loop 44071 1727204607.95507: running the handler 44071 1727204607.95509: variable 'lsr_test' from source: include params 44071 1727204607.95623: variable 'lsr_test' from source: include params 44071 1727204607.95651: handler run complete 44071 1727204607.95675: attempt loop complete, returning result 44071 1727204607.95699: variable 'item' from source: unknown 44071 1727204607.95949: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/create_bridge_profile_no_autoconnect.yml" ] } 44071 1727204607.96250: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204607.96272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204607.96287: variable 'omit' from source: magic vars 44071 1727204607.96654: variable 'ansible_distribution_major_version' from source: facts 44071 1727204607.96707: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204607.96716: variable 'omit' from source: magic vars 44071 1727204607.96915: variable 'omit' from source: magic vars 44071 1727204607.96918: variable 'item' from 
source: unknown 44071 1727204607.97006: variable 'item' from source: unknown 44071 1727204607.97243: variable 'omit' from source: magic vars 44071 1727204607.97247: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204607.97250: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204607.97253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204607.97255: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204607.97258: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204607.97260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204607.97414: Set connection var ansible_connection to ssh 44071 1727204607.97426: Set connection var ansible_timeout to 10 44071 1727204607.97435: Set connection var ansible_pipelining to False 44071 1727204607.97464: Set connection var ansible_shell_type to sh 44071 1727204607.97476: Set connection var ansible_shell_executable to /bin/sh 44071 1727204607.97673: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204607.97676: variable 'ansible_shell_executable' from source: unknown 44071 1727204607.97678: variable 'ansible_connection' from source: unknown 44071 1727204607.97681: variable 'ansible_module_compression' from source: unknown 44071 1727204607.97683: variable 'ansible_shell_type' from source: unknown 44071 1727204607.97685: variable 'ansible_shell_executable' from source: unknown 44071 1727204607.97687: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204607.97688: variable 'ansible_pipelining' from source: unknown 44071 1727204607.97690: variable 'ansible_timeout' from source: unknown 44071 1727204607.97692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204607.97846: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204607.97902: variable 'omit' from source: magic vars 44071 1727204607.98071: starting attempt loop 44071 1727204607.98074: running the handler 44071 1727204607.98076: variable 'lsr_assert' from source: include params 44071 1727204607.98194: variable 'lsr_assert' from source: include params 44071 1727204607.98226: handler run complete 44071 1727204607.98250: attempt loop complete, returning result 44071 1727204607.98348: variable 'item' from source: unknown 44071 1727204607.98544: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_device_absent.yml", "tasks/assert_profile_present.yml" ] } 44071 1727204607.98883: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204607.98887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204607.98890: variable 'omit' from source: magic vars 44071 1727204607.99369: variable 'ansible_distribution_major_version' from source: facts 44071 1727204607.99435: Evaluated conditional 
(ansible_distribution_major_version != '6'): True 44071 1727204607.99455: variable 'omit' from source: magic vars 44071 1727204607.99480: variable 'omit' from source: magic vars 44071 1727204607.99580: variable 'item' from source: unknown 44071 1727204607.99748: variable 'item' from source: unknown 44071 1727204607.99964: variable 'omit' from source: magic vars 44071 1727204607.99968: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204607.99972: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204607.99974: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204607.99975: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204607.99977: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204607.99979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204608.00233: Set connection var ansible_connection to ssh 44071 1727204608.00345: Set connection var ansible_timeout to 10 44071 1727204608.00348: Set connection var ansible_pipelining to False 44071 1727204608.00350: Set connection var ansible_shell_type to sh 44071 1727204608.00353: Set connection var ansible_shell_executable to /bin/sh 44071 1727204608.00355: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204608.00357: variable 'ansible_shell_executable' from source: unknown 44071 1727204608.00359: variable 'ansible_connection' from source: unknown 44071 1727204608.00361: variable 'ansible_module_compression' from source: unknown 44071 1727204608.00363: variable 'ansible_shell_type' from source: unknown 44071 1727204608.00368: variable 'ansible_shell_executable' from source: unknown 44071 1727204608.00370: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204608.00372: variable 'ansible_pipelining' from source: unknown 44071 1727204608.00374: variable 'ansible_timeout' from source: unknown 44071 1727204608.00375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204608.00587: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204608.00603: variable 'omit' from source: magic vars 44071 1727204608.00612: starting attempt loop 44071 1727204608.00619: running the handler 44071 1727204608.00870: handler run complete 44071 1727204608.01108: attempt loop complete, returning result 44071 1727204608.01112: variable 'item' from source: unknown 44071 1727204608.01182: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 44071 1727204608.01626: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204608.01629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204608.01632: variable 'omit' from source: magic vars 44071 1727204608.02062: variable 'ansible_distribution_major_version' from source: 
facts 44071 1727204608.02067: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204608.02070: variable 'omit' from source: magic vars 44071 1727204608.02072: variable 'omit' from source: magic vars 44071 1727204608.02075: variable 'item' from source: unknown 44071 1727204608.02243: variable 'item' from source: unknown 44071 1727204608.02303: variable 'omit' from source: magic vars 44071 1727204608.02330: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204608.02381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204608.02396: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204608.02416: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204608.02607: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204608.02611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204608.02613: Set connection var ansible_connection to ssh 44071 1727204608.02825: Set connection var ansible_timeout to 10 44071 1727204608.02829: Set connection var ansible_pipelining to False 44071 1727204608.02831: Set connection var ansible_shell_type to sh 44071 1727204608.02833: Set connection var ansible_shell_executable to /bin/sh 44071 1727204608.02835: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204608.02837: variable 'ansible_shell_executable' from source: unknown 44071 1727204608.02841: variable 'ansible_connection' from source: unknown 44071 1727204608.02843: variable 'ansible_module_compression' from source: unknown 44071 1727204608.02845: variable 'ansible_shell_type' from source: unknown 44071 1727204608.02847: variable 'ansible_shell_executable' from source: unknown 44071 1727204608.02849: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204608.02851: variable 'ansible_pipelining' from source: unknown 44071 1727204608.02854: variable 'ansible_timeout' from source: unknown 44071 1727204608.02856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204608.03045: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204608.03167: variable 'omit' from source: magic vars 44071 1727204608.03177: starting attempt loop 44071 1727204608.03184: running the handler 44071 1727204608.03209: variable 'lsr_fail_debug' from source: play vars 44071 1727204608.03572: variable 'lsr_fail_debug' from source: play vars 44071 1727204608.03576: handler run complete 44071 1727204608.03578: attempt loop complete, returning result 44071 1727204608.03580: variable 'item' from source: unknown 44071 1727204608.03663: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 44071 1727204608.03985: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204608.04000: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node2' 44071 1727204608.04122: variable 'omit' from source: magic vars 44071 1727204608.04557: variable 'ansible_distribution_major_version' from source: facts 44071 1727204608.04562: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204608.04564: variable 'omit' from source: magic vars 44071 1727204608.04568: variable 'omit' from source: magic vars 44071 1727204608.04578: variable 'item' from source: unknown 44071 1727204608.04661: variable 'item' from source: unknown 44071 1727204608.04791: variable 'omit' from source: magic vars 44071 1727204608.04818: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204608.04991: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204608.04995: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204608.05002: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204608.05005: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204608.05007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204608.05127: Set connection var ansible_connection to ssh 44071 1727204608.05318: Set connection var ansible_timeout to 10 44071 1727204608.05322: Set connection var ansible_pipelining to False 44071 1727204608.05324: Set connection var ansible_shell_type to sh 44071 1727204608.05327: Set connection var ansible_shell_executable to /bin/sh 44071 1727204608.05329: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204608.05331: variable 'ansible_shell_executable' from source: unknown 44071 1727204608.05333: variable 'ansible_connection' from source: unknown 44071 1727204608.05336: variable 'ansible_module_compression' from source: unknown 44071 1727204608.05338: variable 'ansible_shell_type' from source: unknown 44071 1727204608.05342: variable 'ansible_shell_executable' from source: unknown 44071 1727204608.05344: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204608.05347: variable 'ansible_pipelining' from source: unknown 44071 1727204608.05349: variable 'ansible_timeout' from source: unknown 44071 1727204608.05351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204608.05537: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204608.05756: variable 'omit' from source: magic vars 44071 1727204608.05760: starting attempt loop 44071 1727204608.05762: running the handler 44071 1727204608.05765: variable 'lsr_cleanup' from source: include params 44071 1727204608.05890: variable 'lsr_cleanup' from source: include params 44071 1727204608.05914: handler run complete 44071 1727204608.05933: attempt loop complete, returning result 44071 1727204608.05992: variable 'item' from source: unknown 44071 1727204608.06170: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] 
} 44071 1727204608.06558: dumping result to json 44071 1727204608.06562: done dumping result, returning 44071 1727204608.06564: done running TaskExecutor() for managed-node2/TASK: Show item [127b8e07-fff9-c964-7471-0000000005b5] 44071 1727204608.06570: sending task result for task 127b8e07-fff9-c964-7471-0000000005b5 44071 1727204608.06626: done sending task result for task 127b8e07-fff9-c964-7471-0000000005b5 44071 1727204608.06629: WORKER PROCESS EXITING 44071 1727204608.06696: no more pending results, returning what we have 44071 1727204608.06700: results queue empty 44071 1727204608.06701: checking for any_errors_fatal 44071 1727204608.06711: done checking for any_errors_fatal 44071 1727204608.06712: checking for max_fail_percentage 44071 1727204608.06713: done checking for max_fail_percentage 44071 1727204608.06714: checking to see if all hosts have failed and the running result is not ok 44071 1727204608.06715: done checking to see if all hosts have failed 44071 1727204608.06716: getting the remaining hosts for this loop 44071 1727204608.06717: done getting the remaining hosts for this loop 44071 1727204608.06725: getting the next task for host managed-node2 44071 1727204608.06733: done getting next task for host managed-node2 44071 1727204608.06736: ^ task is: TASK: Include the task 'show_interfaces.yml' 44071 1727204608.06743: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204608.06747: getting variables 44071 1727204608.06749: in VariableManager get_vars() 44071 1727204608.06786: Calling all_inventory to load vars for managed-node2 44071 1727204608.06789: Calling groups_inventory to load vars for managed-node2 44071 1727204608.06794: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204608.06808: Calling all_plugins_play to load vars for managed-node2 44071 1727204608.06812: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204608.06815: Calling groups_plugins_play to load vars for managed-node2 44071 1727204608.11026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204608.15724: done with get_vars() 44071 1727204608.15774: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Tuesday 24 September 2024 15:03:28 -0400 (0:00:00.316) 0:00:20.477 ***** 44071 1727204608.16094: entering _queue_task() for managed-node2/include_tasks 44071 1727204608.17109: worker is 1 (out of 1 available) 44071 1727204608.17122: exiting _queue_task() for managed-node2/include_tasks 44071 1727204608.17135: done queuing things up, now waiting for results queue to drain 44071 1727204608.17137: waiting for pending results... 
44071 1727204608.17589: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 44071 1727204608.17740: in run() - task 127b8e07-fff9-c964-7471-0000000005b6 44071 1727204608.17813: variable 'ansible_search_path' from source: unknown 44071 1727204608.17822: variable 'ansible_search_path' from source: unknown 44071 1727204608.17918: calling self._execute() 44071 1727204608.18340: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204608.18345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204608.18348: variable 'omit' from source: magic vars 44071 1727204608.19083: variable 'ansible_distribution_major_version' from source: facts 44071 1727204608.19111: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204608.19285: _execute() done 44071 1727204608.19394: dumping result to json 44071 1727204608.19398: done dumping result, returning 44071 1727204608.19400: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [127b8e07-fff9-c964-7471-0000000005b6] 44071 1727204608.19403: sending task result for task 127b8e07-fff9-c964-7471-0000000005b6 44071 1727204608.19498: done sending task result for task 127b8e07-fff9-c964-7471-0000000005b6 44071 1727204608.19534: no more pending results, returning what we have 44071 1727204608.19543: in VariableManager get_vars() 44071 1727204608.19589: Calling all_inventory to load vars for managed-node2 44071 1727204608.19592: Calling groups_inventory to load vars for managed-node2 44071 1727204608.19597: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204608.19615: Calling all_plugins_play to load vars for managed-node2 44071 1727204608.19618: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204608.19623: Calling groups_plugins_play to load vars for managed-node2 44071 1727204608.20576: WORKER PROCESS EXITING 44071 1727204608.24743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204608.37310: done with get_vars() 44071 1727204608.37405: variable 'ansible_search_path' from source: unknown 44071 1727204608.37407: variable 'ansible_search_path' from source: unknown 44071 1727204608.37449: we have included files to process 44071 1727204608.37451: generating all_blocks data 44071 1727204608.37452: done generating all_blocks data 44071 1727204608.37455: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44071 1727204608.37456: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44071 1727204608.37458: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44071 1727204608.37747: in VariableManager get_vars() 44071 1727204608.38039: done with get_vars() 44071 1727204608.38386: done processing included file 44071 1727204608.38388: iterating over new_blocks loaded from include file 44071 1727204608.38390: in VariableManager get_vars() 44071 1727204608.38407: done with get_vars() 44071 1727204608.38408: filtering new block on tags 44071 1727204608.38445: done filtering new block on tags 44071 1727204608.38448: done iterating over new_blocks loaded from include file included: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 44071 1727204608.38452: extending task lists for all hosts with included blocks 44071 1727204608.39904: done extending task lists 44071 1727204608.39906: done processing included files 44071 1727204608.39907: results queue empty 44071 1727204608.39908: checking for any_errors_fatal 44071 1727204608.39914: done checking for any_errors_fatal 44071 1727204608.39915: checking for max_fail_percentage 44071 1727204608.39935: done checking for max_fail_percentage 44071 1727204608.39937: checking to see if all hosts have failed and the running result is not ok 44071 1727204608.39938: done checking to see if all hosts have failed 44071 1727204608.39939: getting the remaining hosts for this loop 44071 1727204608.39941: done getting the remaining hosts for this loop 44071 1727204608.39944: getting the next task for host managed-node2 44071 1727204608.39949: done getting next task for host managed-node2 44071 1727204608.39951: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 44071 1727204608.39954: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204608.39957: getting variables 44071 1727204608.39962: in VariableManager get_vars() 44071 1727204608.39981: Calling all_inventory to load vars for managed-node2 44071 1727204608.39984: Calling groups_inventory to load vars for managed-node2 44071 1727204608.39987: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204608.39994: Calling all_plugins_play to load vars for managed-node2 44071 1727204608.39996: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204608.40000: Calling groups_plugins_play to load vars for managed-node2 44071 1727204608.42149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204608.47074: done with get_vars() 44071 1727204608.47133: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:03:28 -0400 (0:00:00.312) 0:00:20.790 ***** 44071 1727204608.47359: entering _queue_task() for managed-node2/include_tasks 44071 1727204608.48235: worker is 1 (out of 1 available) 44071 1727204608.48250: exiting _queue_task() for managed-node2/include_tasks 44071 1727204608.48265: done queuing things up, now waiting for results queue to drain 44071 1727204608.48268: waiting for pending results... 
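The include chain being traced here is run_test.yml:21 -> show_interfaces.yml -> get_current_interfaces.yml:3, with a "Show current_interfaces" debug task at show_interfaces.yml:5 running later in this log. Based only on those task names and task paths, a sketch of what show_interfaces.yml plausibly contains (a reconstruction, not the actual file) is:

    # Hypothetical reconstruction of show_interfaces.yml from the logged task paths
    - name: Include the task 'get_current_interfaces.yml'
      ansible.builtin.include_tasks: get_current_interfaces.yml

    - name: Show current_interfaces
      ansible.builtin.debug:
        msg: "current_interfaces: {{ current_interfaces }}"

The msg form matches the "current_interfaces: ['bonding_masters', 'eth0', 'lo']" output that appears further down in this log.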
44071 1727204608.48601: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 44071 1727204608.49130: in run() - task 127b8e07-fff9-c964-7471-0000000005dd 44071 1727204608.49135: variable 'ansible_search_path' from source: unknown 44071 1727204608.49141: variable 'ansible_search_path' from source: unknown 44071 1727204608.49145: calling self._execute() 44071 1727204608.49573: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204608.49578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204608.49583: variable 'omit' from source: magic vars 44071 1727204608.50382: variable 'ansible_distribution_major_version' from source: facts 44071 1727204608.50396: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204608.50403: _execute() done 44071 1727204608.50458: dumping result to json 44071 1727204608.50462: done dumping result, returning 44071 1727204608.50472: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [127b8e07-fff9-c964-7471-0000000005dd] 44071 1727204608.50489: sending task result for task 127b8e07-fff9-c964-7471-0000000005dd 44071 1727204608.50677: no more pending results, returning what we have 44071 1727204608.50683: in VariableManager get_vars() 44071 1727204608.50725: Calling all_inventory to load vars for managed-node2 44071 1727204608.50728: Calling groups_inventory to load vars for managed-node2 44071 1727204608.50733: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204608.50749: Calling all_plugins_play to load vars for managed-node2 44071 1727204608.50753: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204608.50757: Calling groups_plugins_play to load vars for managed-node2 44071 1727204608.51609: done sending task result for task 127b8e07-fff9-c964-7471-0000000005dd 44071 1727204608.51614: WORKER PROCESS EXITING 44071 1727204608.53174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204608.56196: done with get_vars() 44071 1727204608.56232: variable 'ansible_search_path' from source: unknown 44071 1727204608.56234: variable 'ansible_search_path' from source: unknown 44071 1727204608.56330: we have included files to process 44071 1727204608.56331: generating all_blocks data 44071 1727204608.56333: done generating all_blocks data 44071 1727204608.56334: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44071 1727204608.56336: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44071 1727204608.56340: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44071 1727204608.56876: done processing included file 44071 1727204608.56878: iterating over new_blocks loaded from include file 44071 1727204608.56880: in VariableManager get_vars() 44071 1727204608.56900: done with get_vars() 44071 1727204608.56903: filtering new block on tags 44071 1727204608.56952: done filtering new block on tags 44071 1727204608.56955: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed-node2 44071 1727204608.56961: extending task lists for all hosts with included blocks 44071 1727204608.57240: done extending task lists 44071 1727204608.57241: done processing included files 44071 1727204608.57242: results queue empty 44071 1727204608.57243: checking for any_errors_fatal 44071 1727204608.57247: done checking for any_errors_fatal 44071 1727204608.57248: checking for max_fail_percentage 44071 1727204608.57249: done checking for max_fail_percentage 44071 1727204608.57250: checking to see if all hosts have failed and the running result is not ok 44071 1727204608.57251: done checking to see if all hosts have failed 44071 1727204608.57252: getting the remaining hosts for this loop 44071 1727204608.57253: done getting the remaining hosts for this loop 44071 1727204608.57256: getting the next task for host managed-node2 44071 1727204608.57262: done getting next task for host managed-node2 44071 1727204608.57264: ^ task is: TASK: Gather current interface info 44071 1727204608.57280: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204608.57283: getting variables 44071 1727204608.57285: in VariableManager get_vars() 44071 1727204608.57296: Calling all_inventory to load vars for managed-node2 44071 1727204608.57299: Calling groups_inventory to load vars for managed-node2 44071 1727204608.57302: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204608.57309: Calling all_plugins_play to load vars for managed-node2 44071 1727204608.57311: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204608.57314: Calling groups_plugins_play to load vars for managed-node2 44071 1727204608.59349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204608.61612: done with get_vars() 44071 1727204608.61650: done getting variables 44071 1727204608.61704: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:03:28 -0400 (0:00:00.143) 0:00:20.933 ***** 44071 1727204608.61740: entering _queue_task() for managed-node2/command 44071 1727204608.62137: worker is 1 (out of 1 available) 44071 1727204608.62153: exiting _queue_task() for managed-node2/command 44071 1727204608.62381: done queuing things up, now waiting for results queue to drain 44071 1727204608.62383: waiting for pending results... 
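The "Gather current interface info" task that runs next is a plain command task. The module arguments recorded further down in this log (chdir=/sys/class/net, raw params "ls -1") and the _current_interfaces variable that the later set_fact step templates suggest a task along these lines (a sketch of get_current_interfaces.yml:3, not its actual contents):

    # Reconstructed from the logged module_args; register name inferred from the log
    - name: Gather current interface info
      ansible.builtin.command:
        cmd: ls -1
        chdir: /sys/class/net
      register: _current_interfaces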
44071 1727204608.62625: running TaskExecutor() for managed-node2/TASK: Gather current interface info 44071 1727204608.62840: in run() - task 127b8e07-fff9-c964-7471-000000000618 44071 1727204608.62847: variable 'ansible_search_path' from source: unknown 44071 1727204608.62850: variable 'ansible_search_path' from source: unknown 44071 1727204608.62920: calling self._execute() 44071 1727204608.63116: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204608.63146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204608.63176: variable 'omit' from source: magic vars 44071 1727204608.63843: variable 'ansible_distribution_major_version' from source: facts 44071 1727204608.63867: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204608.63933: variable 'omit' from source: magic vars 44071 1727204608.63995: variable 'omit' from source: magic vars 44071 1727204608.64054: variable 'omit' from source: magic vars 44071 1727204608.64107: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204608.64170: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204608.64216: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204608.64285: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204608.64295: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204608.64332: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204608.64341: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204608.64352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204608.64508: Set connection var ansible_connection to ssh 44071 1727204608.64512: Set connection var ansible_timeout to 10 44071 1727204608.64598: Set connection var ansible_pipelining to False 44071 1727204608.64601: Set connection var ansible_shell_type to sh 44071 1727204608.64604: Set connection var ansible_shell_executable to /bin/sh 44071 1727204608.64607: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204608.64609: variable 'ansible_shell_executable' from source: unknown 44071 1727204608.64613: variable 'ansible_connection' from source: unknown 44071 1727204608.64618: variable 'ansible_module_compression' from source: unknown 44071 1727204608.64634: variable 'ansible_shell_type' from source: unknown 44071 1727204608.64643: variable 'ansible_shell_executable' from source: unknown 44071 1727204608.64651: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204608.64662: variable 'ansible_pipelining' from source: unknown 44071 1727204608.64704: variable 'ansible_timeout' from source: unknown 44071 1727204608.64771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204608.64939: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204608.64972: variable 'omit' from source: magic vars 44071 
1727204608.64975: starting attempt loop 44071 1727204608.64977: running the handler 44071 1727204608.65104: _low_level_execute_command(): starting 44071 1727204608.65107: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204608.65992: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204608.66057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204608.66154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204608.66270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204608.68053: stdout chunk (state=3): >>>/root <<< 44071 1727204608.68191: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204608.68269: stderr chunk (state=3): >>><<< 44071 1727204608.68298: stdout chunk (state=3): >>><<< 44071 1727204608.68431: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204608.68436: _low_level_execute_command(): starting 44071 1727204608.68440: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204608.6832354-45076-180863303177350 `" && echo ansible-tmp-1727204608.6832354-45076-180863303177350="` echo /root/.ansible/tmp/ansible-tmp-1727204608.6832354-45076-180863303177350 `" ) && sleep 0' 
44071 1727204608.69069: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204608.69187: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204608.69233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204608.69355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204608.71347: stdout chunk (state=3): >>>ansible-tmp-1727204608.6832354-45076-180863303177350=/root/.ansible/tmp/ansible-tmp-1727204608.6832354-45076-180863303177350 <<< 44071 1727204608.71594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204608.71873: stderr chunk (state=3): >>><<< 44071 1727204608.71878: stdout chunk (state=3): >>><<< 44071 1727204608.71881: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204608.6832354-45076-180863303177350=/root/.ansible/tmp/ansible-tmp-1727204608.6832354-45076-180863303177350 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204608.71883: variable 'ansible_module_compression' from source: unknown 44071 1727204608.71886: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44071 1727204608.71888: variable 'ansible_facts' from source: unknown 44071 1727204608.71929: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204608.6832354-45076-180863303177350/AnsiballZ_command.py 44071 1727204608.72195: Sending initial data 44071 1727204608.72210: Sent initial data (156 bytes) 44071 1727204608.72791: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204608.72808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204608.72824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204608.72844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204608.72870: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204608.72883: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204608.72895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204608.72911: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204608.72925: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204608.72936: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44071 1727204608.72980: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204608.73035: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204608.73052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204608.73080: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204608.73249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204608.74811: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 44071 1727204608.74868: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204608.74964: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204608.75032: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpfo3igk0p /root/.ansible/tmp/ansible-tmp-1727204608.6832354-45076-180863303177350/AnsiballZ_command.py <<< 44071 1727204608.75036: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204608.6832354-45076-180863303177350/AnsiballZ_command.py" <<< 44071 1727204608.75118: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpfo3igk0p" to remote "/root/.ansible/tmp/ansible-tmp-1727204608.6832354-45076-180863303177350/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204608.6832354-45076-180863303177350/AnsiballZ_command.py" <<< 44071 1727204608.76360: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204608.76702: stderr chunk (state=3): >>><<< 44071 1727204608.76707: stdout chunk (state=3): >>><<< 44071 1727204608.76709: done transferring module to remote 44071 1727204608.76712: _low_level_execute_command(): starting 44071 1727204608.76722: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204608.6832354-45076-180863303177350/ /root/.ansible/tmp/ansible-tmp-1727204608.6832354-45076-180863303177350/AnsiballZ_command.py && sleep 0' 44071 1727204608.78362: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204608.78563: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204608.78633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204608.78707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204608.80671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204608.80753: stdout chunk (state=3): >>><<< 44071 1727204608.80757: stderr chunk (state=3): >>><<< 44071 1727204608.80920: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204608.80924: _low_level_execute_command(): starting 44071 1727204608.80928: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204608.6832354-45076-180863303177350/AnsiballZ_command.py && sleep 0' 44071 1727204608.81667: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204608.81686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204608.81700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204608.81772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204608.81787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204608.81870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204608.81890: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204608.81914: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204608.82065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204608.98754: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:28.982597", "end": "2024-09-24 15:03:28.986004", "delta": "0:00:00.003407", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44071 1727204609.00612: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204609.00617: stdout chunk (state=3): >>><<< 44071 1727204609.00620: stderr chunk (state=3): >>><<< 44071 1727204609.00622: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:28.982597", "end": "2024-09-24 15:03:28.986004", "delta": "0:00:00.003407", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
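The JSON payload returned by AnsiballZ_command.py above is what Ansible stores in the registered variable: rc, cmd, stdout, start/end/delta, plus derived fields such as stdout_lines. A quick way to inspect those fields from a follow-up task (a sketch; _current_interfaces is the register name that appears later in this log) would be:

    # Illustrative only: prints selected fields of the registered command result
    - name: Inspect the registered command result
      ansible.builtin.debug:
        msg: "rc={{ _current_interfaces.rc }}, interfaces={{ _current_interfaces.stdout_lines }}"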
44071 1727204609.00625: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204608.6832354-45076-180863303177350/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204609.00627: _low_level_execute_command(): starting 44071 1727204609.00630: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204608.6832354-45076-180863303177350/ > /dev/null 2>&1 && sleep 0' 44071 1727204609.02063: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204609.02226: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204609.02242: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204609.02351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204609.04326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204609.04355: stderr chunk (state=3): >>><<< 44071 1727204609.04429: stdout chunk (state=3): >>><<< 44071 1727204609.04449: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204609.04453: handler run complete 44071 1727204609.04672: Evaluated conditional (False): False 44071 1727204609.04676: attempt loop complete, returning result 44071 1727204609.04679: _execute() done 44071 1727204609.04681: dumping result to json 44071 1727204609.04683: done dumping result, returning 44071 1727204609.04685: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [127b8e07-fff9-c964-7471-000000000618] 44071 1727204609.04687: sending task result for task 127b8e07-fff9-c964-7471-000000000618 44071 1727204609.04768: done sending task result for task 127b8e07-fff9-c964-7471-000000000618 44071 1727204609.04772: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003407", "end": "2024-09-24 15:03:28.986004", "rc": 0, "start": "2024-09-24 15:03:28.982597" } STDOUT: bonding_masters eth0 lo 44071 1727204609.04859: no more pending results, returning what we have 44071 1727204609.04863: results queue empty 44071 1727204609.04864: checking for any_errors_fatal 44071 1727204609.04869: done checking for any_errors_fatal 44071 1727204609.04869: checking for max_fail_percentage 44071 1727204609.04871: done checking for max_fail_percentage 44071 1727204609.04871: checking to see if all hosts have failed and the running result is not ok 44071 1727204609.04872: done checking to see if all hosts have failed 44071 1727204609.04873: getting the remaining hosts for this loop 44071 1727204609.04874: done getting the remaining hosts for this loop 44071 1727204609.04879: getting the next task for host managed-node2 44071 1727204609.04888: done getting next task for host managed-node2 44071 1727204609.04891: ^ task is: TASK: Set current_interfaces 44071 1727204609.04897: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204609.04901: getting variables 44071 1727204609.04903: in VariableManager get_vars() 44071 1727204609.04936: Calling all_inventory to load vars for managed-node2 44071 1727204609.04941: Calling groups_inventory to load vars for managed-node2 44071 1727204609.04944: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204609.04959: Calling all_plugins_play to load vars for managed-node2 44071 1727204609.04962: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204609.05082: Calling groups_plugins_play to load vars for managed-node2 44071 1727204609.08744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204609.13939: done with get_vars() 44071 1727204609.14080: done getting variables 44071 1727204609.14139: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.525) 0:00:21.459 ***** 44071 1727204609.14310: entering _queue_task() for managed-node2/set_fact 44071 1727204609.15125: worker is 1 (out of 1 available) 44071 1727204609.15141: exiting _queue_task() for managed-node2/set_fact 44071 1727204609.15155: done queuing things up, now waiting for results queue to drain 44071 1727204609.15157: waiting for pending results... 
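The "Set current_interfaces" task queued here (get_current_interfaces.yml:9) turns the command output into the current_interfaces fact reported below. Given the registered stdout ("bonding_masters\neth0\nlo") and the resulting list ['bonding_masters', 'eth0', 'lo'], the task is most plausibly a one-line set_fact of this shape (a reconstruction, not a quote from the file):

    # Reconstructed from the logged result; stdout_lines is the standard
    # line-split form of the command module's stdout
    - name: Set current_interfaces
      ansible.builtin.set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"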
44071 1727204609.15787: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 44071 1727204609.16046: in run() - task 127b8e07-fff9-c964-7471-000000000619 44071 1727204609.16052: variable 'ansible_search_path' from source: unknown 44071 1727204609.16055: variable 'ansible_search_path' from source: unknown 44071 1727204609.16185: calling self._execute() 44071 1727204609.16401: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204609.16406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204609.16409: variable 'omit' from source: magic vars 44071 1727204609.17214: variable 'ansible_distribution_major_version' from source: facts 44071 1727204609.17218: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204609.17223: variable 'omit' from source: magic vars 44071 1727204609.17311: variable 'omit' from source: magic vars 44071 1727204609.17553: variable '_current_interfaces' from source: set_fact 44071 1727204609.17673: variable 'omit' from source: magic vars 44071 1727204609.17692: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204609.17730: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204609.17755: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204609.17786: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204609.17806: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204609.17893: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204609.17896: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204609.17899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204609.17974: Set connection var ansible_connection to ssh 44071 1727204609.17986: Set connection var ansible_timeout to 10 44071 1727204609.18002: Set connection var ansible_pipelining to False 44071 1727204609.18012: Set connection var ansible_shell_type to sh 44071 1727204609.18024: Set connection var ansible_shell_executable to /bin/sh 44071 1727204609.18036: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204609.18064: variable 'ansible_shell_executable' from source: unknown 44071 1727204609.18109: variable 'ansible_connection' from source: unknown 44071 1727204609.18112: variable 'ansible_module_compression' from source: unknown 44071 1727204609.18114: variable 'ansible_shell_type' from source: unknown 44071 1727204609.18116: variable 'ansible_shell_executable' from source: unknown 44071 1727204609.18118: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204609.18119: variable 'ansible_pipelining' from source: unknown 44071 1727204609.18121: variable 'ansible_timeout' from source: unknown 44071 1727204609.18123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204609.18274: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 44071 1727204609.18293: variable 'omit' from source: magic vars 44071 1727204609.18325: starting attempt loop 44071 1727204609.18328: running the handler 44071 1727204609.18331: handler run complete 44071 1727204609.18345: attempt loop complete, returning result 44071 1727204609.18352: _execute() done 44071 1727204609.18371: dumping result to json 44071 1727204609.18374: done dumping result, returning 44071 1727204609.18435: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [127b8e07-fff9-c964-7471-000000000619] 44071 1727204609.18438: sending task result for task 127b8e07-fff9-c964-7471-000000000619 44071 1727204609.18515: done sending task result for task 127b8e07-fff9-c964-7471-000000000619 44071 1727204609.18518: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 44071 1727204609.18613: no more pending results, returning what we have 44071 1727204609.18618: results queue empty 44071 1727204609.18619: checking for any_errors_fatal 44071 1727204609.18631: done checking for any_errors_fatal 44071 1727204609.18632: checking for max_fail_percentage 44071 1727204609.18633: done checking for max_fail_percentage 44071 1727204609.18634: checking to see if all hosts have failed and the running result is not ok 44071 1727204609.18635: done checking to see if all hosts have failed 44071 1727204609.18636: getting the remaining hosts for this loop 44071 1727204609.18638: done getting the remaining hosts for this loop 44071 1727204609.18647: getting the next task for host managed-node2 44071 1727204609.18664: done getting next task for host managed-node2 44071 1727204609.18672: ^ task is: TASK: Show current_interfaces 44071 1727204609.18678: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204609.18683: getting variables 44071 1727204609.18685: in VariableManager get_vars() 44071 1727204609.18720: Calling all_inventory to load vars for managed-node2 44071 1727204609.18722: Calling groups_inventory to load vars for managed-node2 44071 1727204609.18726: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204609.18740: Calling all_plugins_play to load vars for managed-node2 44071 1727204609.18744: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204609.18747: Calling groups_plugins_play to load vars for managed-node2 44071 1727204609.21194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204609.24027: done with get_vars() 44071 1727204609.24099: done getting variables 44071 1727204609.24189: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.099) 0:00:21.558 ***** 44071 1727204609.24230: entering _queue_task() for managed-node2/debug 44071 1727204609.24748: worker is 1 (out of 1 available) 44071 1727204609.24763: exiting _queue_task() for managed-node2/debug 44071 1727204609.24778: done queuing things up, now waiting for results queue to drain 44071 1727204609.24780: waiting for pending results... 44071 1727204609.25089: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 44071 1727204609.25140: in run() - task 127b8e07-fff9-c964-7471-0000000005de 44071 1727204609.25168: variable 'ansible_search_path' from source: unknown 44071 1727204609.25183: variable 'ansible_search_path' from source: unknown 44071 1727204609.25273: calling self._execute() 44071 1727204609.25341: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204609.25354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204609.25373: variable 'omit' from source: magic vars 44071 1727204609.25841: variable 'ansible_distribution_major_version' from source: facts 44071 1727204609.25863: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204609.25878: variable 'omit' from source: magic vars 44071 1727204609.26044: variable 'omit' from source: magic vars 44071 1727204609.26070: variable 'current_interfaces' from source: set_fact 44071 1727204609.26109: variable 'omit' from source: magic vars 44071 1727204609.26172: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204609.26219: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204609.26250: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204609.26286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204609.26308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204609.26347: 
variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204609.26356: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204609.26368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204609.26498: Set connection var ansible_connection to ssh 44071 1727204609.26511: Set connection var ansible_timeout to 10 44071 1727204609.26522: Set connection var ansible_pipelining to False 44071 1727204609.26531: Set connection var ansible_shell_type to sh 44071 1727204609.26540: Set connection var ansible_shell_executable to /bin/sh 44071 1727204609.26551: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204609.26586: variable 'ansible_shell_executable' from source: unknown 44071 1727204609.26600: variable 'ansible_connection' from source: unknown 44071 1727204609.26608: variable 'ansible_module_compression' from source: unknown 44071 1727204609.26671: variable 'ansible_shell_type' from source: unknown 44071 1727204609.26675: variable 'ansible_shell_executable' from source: unknown 44071 1727204609.26677: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204609.26679: variable 'ansible_pipelining' from source: unknown 44071 1727204609.26681: variable 'ansible_timeout' from source: unknown 44071 1727204609.26683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204609.26823: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204609.26843: variable 'omit' from source: magic vars 44071 1727204609.26854: starting attempt loop 44071 1727204609.26861: running the handler 44071 1727204609.26927: handler run complete 44071 1727204609.26947: attempt loop complete, returning result 44071 1727204609.26955: _execute() done 44071 1727204609.26963: dumping result to json 44071 1727204609.27017: done dumping result, returning 44071 1727204609.27021: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [127b8e07-fff9-c964-7471-0000000005de] 44071 1727204609.27026: sending task result for task 127b8e07-fff9-c964-7471-0000000005de 44071 1727204609.27399: done sending task result for task 127b8e07-fff9-c964-7471-0000000005de 44071 1727204609.27403: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 44071 1727204609.27449: no more pending results, returning what we have 44071 1727204609.27452: results queue empty 44071 1727204609.27453: checking for any_errors_fatal 44071 1727204609.27459: done checking for any_errors_fatal 44071 1727204609.27460: checking for max_fail_percentage 44071 1727204609.27462: done checking for max_fail_percentage 44071 1727204609.27463: checking to see if all hosts have failed and the running result is not ok 44071 1727204609.27463: done checking to see if all hosts have failed 44071 1727204609.27464: getting the remaining hosts for this loop 44071 1727204609.27468: done getting the remaining hosts for this loop 44071 1727204609.27472: getting the next task for host managed-node2 44071 1727204609.27480: done getting next task for host managed-node2 44071 1727204609.27483: ^ task is: TASK: Setup 44071 1727204609.27486: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204609.27490: getting variables 44071 1727204609.27491: in VariableManager get_vars() 44071 1727204609.27528: Calling all_inventory to load vars for managed-node2 44071 1727204609.27531: Calling groups_inventory to load vars for managed-node2 44071 1727204609.27534: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204609.27545: Calling all_plugins_play to load vars for managed-node2 44071 1727204609.27548: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204609.27551: Calling groups_plugins_play to load vars for managed-node2 44071 1727204609.30995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204609.36284: done with get_vars() 44071 1727204609.36327: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.124) 0:00:21.682 ***** 44071 1727204609.36716: entering _queue_task() for managed-node2/include_tasks 44071 1727204609.37495: worker is 1 (out of 1 available) 44071 1727204609.37512: exiting _queue_task() for managed-node2/include_tasks 44071 1727204609.37526: done queuing things up, now waiting for results queue to drain 44071 1727204609.37528: waiting for pending results... 
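
The current_interfaces value printed above (['bonding_masters', 'eth0', 'lo']) comes from a set_fact in an earlier show_interfaces.yml task and is, in effect, a listing of the kernel's /sys/class/net directory; the bonding_masters entry is the bonding driver's control file that lives alongside the interface names there. A minimal Python sketch of the same lookup is below, for readers who want to reproduce the value outside Ansible; the helper name list_current_interfaces is ours, not part of the role.

import os

def list_current_interfaces(sysfs_net="/sys/class/net"):
    """Return the names the kernel exposes under /sys/class/net, sorted.

    On a host with the bonding driver loaded the directory also contains
    the control file 'bonding_masters', which is why it appears in the
    debug output above next to eth0 and lo."""
    return sorted(os.listdir(sysfs_net))

if __name__ == "__main__":
    print("current_interfaces:", list_current_interfaces())

Run on the managed node, this should print the same three names seen in the debug result, assuming no interfaces were added or removed in between.
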
44071 1727204609.37883: running TaskExecutor() for managed-node2/TASK: Setup 44071 1727204609.38225: in run() - task 127b8e07-fff9-c964-7471-0000000005b7 44071 1727204609.38294: variable 'ansible_search_path' from source: unknown 44071 1727204609.38298: variable 'ansible_search_path' from source: unknown 44071 1727204609.38320: variable 'lsr_setup' from source: include params 44071 1727204609.39215: variable 'lsr_setup' from source: include params 44071 1727204609.39220: variable 'omit' from source: magic vars 44071 1727204609.39575: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204609.39658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204609.39774: variable 'omit' from source: magic vars 44071 1727204609.40430: variable 'ansible_distribution_major_version' from source: facts 44071 1727204609.40456: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204609.40525: variable 'item' from source: unknown 44071 1727204609.40843: variable 'item' from source: unknown 44071 1727204609.40847: variable 'item' from source: unknown 44071 1727204609.40984: variable 'item' from source: unknown 44071 1727204609.41512: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204609.41516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204609.41518: variable 'omit' from source: magic vars 44071 1727204609.41863: variable 'ansible_distribution_major_version' from source: facts 44071 1727204609.41868: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204609.41891: variable 'item' from source: unknown 44071 1727204609.42104: variable 'item' from source: unknown 44071 1727204609.42243: variable 'item' from source: unknown 44071 1727204609.42379: variable 'item' from source: unknown 44071 1727204609.42742: dumping result to json 44071 1727204609.42746: done dumping result, returning 44071 1727204609.42749: done running TaskExecutor() for managed-node2/TASK: Setup [127b8e07-fff9-c964-7471-0000000005b7] 44071 1727204609.42751: sending task result for task 127b8e07-fff9-c964-7471-0000000005b7 44071 1727204609.43248: done sending task result for task 127b8e07-fff9-c964-7471-0000000005b7 44071 1727204609.43252: WORKER PROCESS EXITING 44071 1727204609.43334: no more pending results, returning what we have 44071 1727204609.43343: in VariableManager get_vars() 44071 1727204609.43394: Calling all_inventory to load vars for managed-node2 44071 1727204609.43398: Calling groups_inventory to load vars for managed-node2 44071 1727204609.43402: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204609.43428: Calling all_plugins_play to load vars for managed-node2 44071 1727204609.43432: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204609.43436: Calling groups_plugins_play to load vars for managed-node2 44071 1727204609.49942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204609.56783: done with get_vars() 44071 1727204609.56819: variable 'ansible_search_path' from source: unknown 44071 1727204609.56822: variable 'ansible_search_path' from source: unknown 44071 1727204609.57077: variable 'ansible_search_path' from source: unknown 44071 1727204609.57078: variable 'ansible_search_path' from source: unknown 44071 1727204609.57117: we have included files to process 44071 1727204609.57118: generating all_blocks data 44071 
1727204609.57120: done generating all_blocks data 44071 1727204609.57124: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 44071 1727204609.57126: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 44071 1727204609.57128: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 44071 1727204609.57540: done processing included file 44071 1727204609.57543: iterating over new_blocks loaded from include file 44071 1727204609.57544: in VariableManager get_vars() 44071 1727204609.57564: done with get_vars() 44071 1727204609.57569: filtering new block on tags 44071 1727204609.57601: done filtering new block on tags 44071 1727204609.57604: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node2 => (item=tasks/delete_interface.yml) 44071 1727204609.57610: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 44071 1727204609.57611: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 44071 1727204609.57616: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 44071 1727204609.57821: in VariableManager get_vars() 44071 1727204609.57845: done with get_vars() 44071 1727204609.57952: done processing included file 44071 1727204609.57954: iterating over new_blocks loaded from include file 44071 1727204609.57955: in VariableManager get_vars() 44071 1727204609.58178: done with get_vars() 44071 1727204609.58180: filtering new block on tags 44071 1727204609.58219: done filtering new block on tags 44071 1727204609.58222: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node2 => (item=tasks/assert_device_absent.yml) 44071 1727204609.58227: extending task lists for all hosts with included blocks 44071 1727204609.59824: done extending task lists 44071 1727204609.59826: done processing included files 44071 1727204609.59826: results queue empty 44071 1727204609.59827: checking for any_errors_fatal 44071 1727204609.59832: done checking for any_errors_fatal 44071 1727204609.59833: checking for max_fail_percentage 44071 1727204609.59834: done checking for max_fail_percentage 44071 1727204609.59835: checking to see if all hosts have failed and the running result is not ok 44071 1727204609.59836: done checking to see if all hosts have failed 44071 1727204609.59836: getting the remaining hosts for this loop 44071 1727204609.59838: done getting the remaining hosts for this loop 44071 1727204609.59841: getting the next task for host managed-node2 44071 1727204609.59846: done getting next task for host managed-node2 44071 1727204609.59849: ^ task is: TASK: Remove test interface if necessary 44071 1727204609.59852: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204609.59855: getting variables 44071 1727204609.59856: in VariableManager get_vars() 44071 1727204609.59929: Calling all_inventory to load vars for managed-node2 44071 1727204609.59933: Calling groups_inventory to load vars for managed-node2 44071 1727204609.59936: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204609.59943: Calling all_plugins_play to load vars for managed-node2 44071 1727204609.59946: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204609.59949: Calling groups_plugins_play to load vars for managed-node2 44071 1727204609.63369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204609.67109: done with get_vars() 44071 1727204609.67157: done getting variables 44071 1727204609.67210: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.306) 0:00:21.988 ***** 44071 1727204609.67255: entering _queue_task() for managed-node2/command 44071 1727204609.68117: worker is 1 (out of 1 available) 44071 1727204609.68137: exiting _queue_task() for managed-node2/command 44071 1727204609.68153: done queuing things up, now waiting for results queue to drain 44071 1727204609.68155: waiting for pending results... 
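
The task queued above runs the command module with 'ip link del statebr' (the module arguments are visible further below) and, judging by the '...ignoring' in its eventual result, treats a missing device as acceptable. A hedged Python equivalent of that delete-if-present step is sketched here; the function name delete_interface and the exact error-string check are our choices, not taken from the playbook.

import subprocess

def delete_interface(name):
    """Best-effort removal of a test interface: run 'ip link del NAME'
    and treat 'Cannot find device' as an acceptable outcome, the way the
    'Remove test interface if necessary' task effectively does."""
    proc = subprocess.run(["ip", "link", "del", name],
                          capture_output=True, text=True)
    if proc.returncode == 0:
        return True                  # interface existed and was removed
    if "Cannot find device" in proc.stderr:
        return False                 # nothing to remove; not an error here
    raise RuntimeError("ip link del failed: %s" % proc.stderr.strip())

if __name__ == "__main__":
    print("removed:", delete_interface("statebr"))

On the log's managed-node2 the device does not exist, so this would return False, matching the rc=1 result that is recorded and then ignored below.
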
44071 1727204609.68621: running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary 44071 1727204609.69010: in run() - task 127b8e07-fff9-c964-7471-00000000063e 44071 1727204609.69015: variable 'ansible_search_path' from source: unknown 44071 1727204609.69018: variable 'ansible_search_path' from source: unknown 44071 1727204609.69200: calling self._execute() 44071 1727204609.69499: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204609.69526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204609.69531: variable 'omit' from source: magic vars 44071 1727204609.70363: variable 'ansible_distribution_major_version' from source: facts 44071 1727204609.70426: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204609.70435: variable 'omit' from source: magic vars 44071 1727204609.70474: variable 'omit' from source: magic vars 44071 1727204609.70918: variable 'interface' from source: play vars 44071 1727204609.70923: variable 'omit' from source: magic vars 44071 1727204609.70925: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204609.70928: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204609.70930: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204609.71057: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204609.71075: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204609.71318: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204609.71322: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204609.71325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204609.71444: Set connection var ansible_connection to ssh 44071 1727204609.71447: Set connection var ansible_timeout to 10 44071 1727204609.71516: Set connection var ansible_pipelining to False 44071 1727204609.71519: Set connection var ansible_shell_type to sh 44071 1727204609.71522: Set connection var ansible_shell_executable to /bin/sh 44071 1727204609.71595: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204609.71626: variable 'ansible_shell_executable' from source: unknown 44071 1727204609.71630: variable 'ansible_connection' from source: unknown 44071 1727204609.71633: variable 'ansible_module_compression' from source: unknown 44071 1727204609.71637: variable 'ansible_shell_type' from source: unknown 44071 1727204609.71642: variable 'ansible_shell_executable' from source: unknown 44071 1727204609.71645: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204609.71647: variable 'ansible_pipelining' from source: unknown 44071 1727204609.71650: variable 'ansible_timeout' from source: unknown 44071 1727204609.71652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204609.71995: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 44071 1727204609.72010: variable 'omit' from source: magic vars 44071 1727204609.72013: starting attempt loop 44071 1727204609.72016: running the handler 44071 1727204609.72151: _low_level_execute_command(): starting 44071 1727204609.72160: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204609.73298: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204609.73371: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204609.73427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204609.73522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204609.75334: stdout chunk (state=3): >>>/root <<< 44071 1727204609.75556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204609.75561: stdout chunk (state=3): >>><<< 44071 1727204609.75564: stderr chunk (state=3): >>><<< 44071 1727204609.75592: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204609.75624: _low_level_execute_command(): starting 44071 1727204609.75663: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204609.756001-45110-9399778514238 `" && echo ansible-tmp-1727204609.756001-45110-9399778514238="` echo 
/root/.ansible/tmp/ansible-tmp-1727204609.756001-45110-9399778514238 `" ) && sleep 0' 44071 1727204609.76401: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204609.76474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204609.76478: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204609.76506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204609.76585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204609.78755: stdout chunk (state=3): >>>ansible-tmp-1727204609.756001-45110-9399778514238=/root/.ansible/tmp/ansible-tmp-1727204609.756001-45110-9399778514238 <<< 44071 1727204609.78915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204609.79121: stderr chunk (state=3): >>><<< 44071 1727204609.79124: stdout chunk (state=3): >>><<< 44071 1727204609.79127: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204609.756001-45110-9399778514238=/root/.ansible/tmp/ansible-tmp-1727204609.756001-45110-9399778514238 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204609.79130: variable 'ansible_module_compression' from source: unknown 44071 1727204609.79203: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44071 1727204609.79235: variable 'ansible_facts' from source: unknown 44071 
1727204609.79322: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204609.756001-45110-9399778514238/AnsiballZ_command.py 44071 1727204609.79580: Sending initial data 44071 1727204609.79584: Sent initial data (153 bytes) 44071 1727204609.80226: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204609.80259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204609.80301: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204609.80337: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204609.80438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204609.82080: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204609.82193: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204609.82269: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp_drts_9a /root/.ansible/tmp/ansible-tmp-1727204609.756001-45110-9399778514238/AnsiballZ_command.py <<< 44071 1727204609.82297: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204609.756001-45110-9399778514238/AnsiballZ_command.py" <<< 44071 1727204609.82377: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp_drts_9a" to remote "/root/.ansible/tmp/ansible-tmp-1727204609.756001-45110-9399778514238/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204609.756001-45110-9399778514238/AnsiballZ_command.py" <<< 44071 1727204609.83241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204609.83361: stderr chunk (state=3): >>><<< 44071 1727204609.83367: stdout chunk (state=3): >>><<< 44071 1727204609.83370: done transferring module to remote 44071 1727204609.83372: _low_level_execute_command(): starting 44071 1727204609.83375: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204609.756001-45110-9399778514238/ /root/.ansible/tmp/ansible-tmp-1727204609.756001-45110-9399778514238/AnsiballZ_command.py && sleep 0' 44071 1727204609.84312: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204609.84423: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204609.84427: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204609.84541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204609.86513: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204609.86518: stdout chunk (state=3): >>><<< 44071 1727204609.86521: stderr chunk (state=3): >>><<< 44071 1727204609.86558: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204609.86619: _low_level_execute_command(): starting 44071 1727204609.86623: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204609.756001-45110-9399778514238/AnsiballZ_command.py && sleep 0' 44071 1727204609.87310: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204609.87432: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204609.87553: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204609.87765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204610.04671: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-24 15:03:30.038648", "end": "2024-09-24 15:03:30.045326", "delta": "0:00:00.006678", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44071 1727204610.06156: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204610.06243: stderr chunk (state=3): >>><<< 44071 1727204610.06247: stdout chunk (state=3): >>><<< 44071 1727204610.06275: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-24 15:03:30.038648", "end": "2024-09-24 15:03:30.045326", "delta": "0:00:00.006678", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.47.73 closed. 
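
The stdout captured above is the entire result of the command module: AnsiballZ_command.py prints a single JSON document and exits, and the controller parses that document rather than relying on the wrapper shell's exit status alone. A small sketch of reading such a payload and reducing it to the "fatal ... ...ignoring" verdict shown a little further below follows; the payload is a trimmed copy of the one in the log, and the helper name summarize_result is ours.

import json

# Trimmed copy of the module result captured in the log entry above.
RAW_RESULT = ('{"changed": true, "stdout": "", '
              '"stderr": "Cannot find device \\"statebr\\"", '
              '"rc": 1, "cmd": ["ip", "link", "del", "statebr"], '
              '"failed": true, "msg": "non-zero return code"}')

def summarize_result(raw, ignore_errors=True):
    """Parse a command-module JSON result and reduce it to the one-line
    verdict the play output prints (fatal, then '...ignoring')."""
    result = json.loads(raw)
    failed = result.get("failed", False) or result.get("rc", 0) != 0
    if not failed:
        return "ok"
    verdict = "fatal: rc=%s, stderr=%r" % (result["rc"], result["stderr"])
    return (verdict + " ...ignoring") if ignore_errors else verdict

if __name__ == "__main__":
    print(summarize_result(RAW_RESULT))
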
44071 1727204610.06311: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204609.756001-45110-9399778514238/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204610.06352: _low_level_execute_command(): starting 44071 1727204610.06357: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204609.756001-45110-9399778514238/ > /dev/null 2>&1 && sleep 0' 44071 1727204610.06958: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204610.06962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204610.06967: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204610.06973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204610.07021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204610.07024: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204610.07101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204610.09009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204610.09080: stderr chunk (state=3): >>><<< 44071 1727204610.09084: stdout chunk (state=3): >>><<< 44071 1727204610.09102: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204610.09110: handler run complete 44071 1727204610.09131: Evaluated conditional (False): False 44071 1727204610.09140: attempt loop complete, returning result 44071 1727204610.09152: _execute() done 44071 1727204610.09155: dumping result to json 44071 1727204610.09171: done dumping result, returning 44071 1727204610.09174: done running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary [127b8e07-fff9-c964-7471-00000000063e] 44071 1727204610.09177: sending task result for task 127b8e07-fff9-c964-7471-00000000063e 44071 1727204610.09291: done sending task result for task 127b8e07-fff9-c964-7471-00000000063e 44071 1727204610.09294: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": [ "ip", "link", "del", "statebr" ], "delta": "0:00:00.006678", "end": "2024-09-24 15:03:30.045326", "rc": 1, "start": "2024-09-24 15:03:30.038648" } STDERR: Cannot find device "statebr" MSG: non-zero return code ...ignoring 44071 1727204610.09387: no more pending results, returning what we have 44071 1727204610.09393: results queue empty 44071 1727204610.09394: checking for any_errors_fatal 44071 1727204610.09395: done checking for any_errors_fatal 44071 1727204610.09396: checking for max_fail_percentage 44071 1727204610.09397: done checking for max_fail_percentage 44071 1727204610.09398: checking to see if all hosts have failed and the running result is not ok 44071 1727204610.09399: done checking to see if all hosts have failed 44071 1727204610.09400: getting the remaining hosts for this loop 44071 1727204610.09401: done getting the remaining hosts for this loop 44071 1727204610.09409: getting the next task for host managed-node2 44071 1727204610.09421: done getting next task for host managed-node2 44071 1727204610.09423: ^ task is: TASK: Include the task 'get_interface_stat.yml' 44071 1727204610.09427: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204610.09432: getting variables 44071 1727204610.09433: in VariableManager get_vars() 44071 1727204610.09495: Calling all_inventory to load vars for managed-node2 44071 1727204610.09499: Calling groups_inventory to load vars for managed-node2 44071 1727204610.09504: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204610.09516: Calling all_plugins_play to load vars for managed-node2 44071 1727204610.09519: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204610.09522: Calling groups_plugins_play to load vars for managed-node2 44071 1727204610.11545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204610.13797: done with get_vars() 44071 1727204610.13839: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 15:03:30 -0400 (0:00:00.466) 0:00:22.455 ***** 44071 1727204610.13951: entering _queue_task() for managed-node2/include_tasks 44071 1727204610.14366: worker is 1 (out of 1 available) 44071 1727204610.14380: exiting _queue_task() for managed-node2/include_tasks 44071 1727204610.14395: done queuing things up, now waiting for results queue to drain 44071 1727204610.14397: waiting for pending results... 44071 1727204610.14793: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 44071 1727204610.14863: in run() - task 127b8e07-fff9-c964-7471-000000000642 44071 1727204610.14893: variable 'ansible_search_path' from source: unknown 44071 1727204610.14901: variable 'ansible_search_path' from source: unknown 44071 1727204610.14945: calling self._execute() 44071 1727204610.15048: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204610.15061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204610.15078: variable 'omit' from source: magic vars 44071 1727204610.15499: variable 'ansible_distribution_major_version' from source: facts 44071 1727204610.15535: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204610.15539: _execute() done 44071 1727204610.15542: dumping result to json 44071 1727204610.15644: done dumping result, returning 44071 1727204610.15648: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [127b8e07-fff9-c964-7471-000000000642] 44071 1727204610.15651: sending task result for task 127b8e07-fff9-c964-7471-000000000642 44071 1727204610.15735: done sending task result for task 127b8e07-fff9-c964-7471-000000000642 44071 1727204610.15738: WORKER PROCESS EXITING 44071 1727204610.15785: no more pending results, returning what we have 44071 1727204610.15792: in VariableManager get_vars() 44071 1727204610.15834: Calling all_inventory to load vars for managed-node2 44071 1727204610.15838: Calling groups_inventory to load vars for managed-node2 44071 1727204610.15841: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204610.15859: Calling all_plugins_play to load vars for managed-node2 44071 1727204610.15862: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204610.15867: Calling groups_plugins_play to load vars for managed-node2 44071 1727204610.18079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 44071 1727204610.20262: done with get_vars() 44071 1727204610.20299: variable 'ansible_search_path' from source: unknown 44071 1727204610.20300: variable 'ansible_search_path' from source: unknown 44071 1727204610.20312: variable 'item' from source: include params 44071 1727204610.20431: variable 'item' from source: include params 44071 1727204610.20472: we have included files to process 44071 1727204610.20473: generating all_blocks data 44071 1727204610.20475: done generating all_blocks data 44071 1727204610.20480: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44071 1727204610.20481: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44071 1727204610.20484: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44071 1727204610.20693: done processing included file 44071 1727204610.20696: iterating over new_blocks loaded from include file 44071 1727204610.20697: in VariableManager get_vars() 44071 1727204610.20716: done with get_vars() 44071 1727204610.20718: filtering new block on tags 44071 1727204610.20748: done filtering new block on tags 44071 1727204610.20750: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 44071 1727204610.20756: extending task lists for all hosts with included blocks 44071 1727204610.20941: done extending task lists 44071 1727204610.20943: done processing included files 44071 1727204610.20944: results queue empty 44071 1727204610.20945: checking for any_errors_fatal 44071 1727204610.20953: done checking for any_errors_fatal 44071 1727204610.20954: checking for max_fail_percentage 44071 1727204610.20956: done checking for max_fail_percentage 44071 1727204610.20956: checking to see if all hosts have failed and the running result is not ok 44071 1727204610.20957: done checking to see if all hosts have failed 44071 1727204610.20958: getting the remaining hosts for this loop 44071 1727204610.20959: done getting the remaining hosts for this loop 44071 1727204610.20962: getting the next task for host managed-node2 44071 1727204610.20969: done getting next task for host managed-node2 44071 1727204610.20971: ^ task is: TASK: Get stat for interface {{ interface }} 44071 1727204610.20975: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204610.20977: getting variables 44071 1727204610.20978: in VariableManager get_vars() 44071 1727204610.20989: Calling all_inventory to load vars for managed-node2 44071 1727204610.20992: Calling groups_inventory to load vars for managed-node2 44071 1727204610.20995: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204610.21002: Calling all_plugins_play to load vars for managed-node2 44071 1727204610.21004: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204610.21008: Calling groups_plugins_play to load vars for managed-node2 44071 1727204610.22619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204610.24812: done with get_vars() 44071 1727204610.24853: done getting variables 44071 1727204610.25175: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:03:30 -0400 (0:00:00.112) 0:00:22.568 ***** 44071 1727204610.25210: entering _queue_task() for managed-node2/stat 44071 1727204610.25698: worker is 1 (out of 1 available) 44071 1727204610.25712: exiting _queue_task() for managed-node2/stat 44071 1727204610.25726: done queuing things up, now waiting for results queue to drain 44071 1727204610.25728: waiting for pending results... 44071 1727204610.26388: running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr 44071 1727204610.26501: in run() - task 127b8e07-fff9-c964-7471-000000000691 44071 1727204610.26531: variable 'ansible_search_path' from source: unknown 44071 1727204610.26572: variable 'ansible_search_path' from source: unknown 44071 1727204610.26591: calling self._execute() 44071 1727204610.26700: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204610.26713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204610.26731: variable 'omit' from source: magic vars 44071 1727204610.27166: variable 'ansible_distribution_major_version' from source: facts 44071 1727204610.27371: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204610.27375: variable 'omit' from source: magic vars 44071 1727204610.27377: variable 'omit' from source: magic vars 44071 1727204610.27392: variable 'interface' from source: play vars 44071 1727204610.27417: variable 'omit' from source: magic vars 44071 1727204610.27469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204610.27516: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204610.27544: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204610.27572: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204610.27599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204610.27640: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 
1727204610.27650: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204610.27658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204610.27781: Set connection var ansible_connection to ssh 44071 1727204610.27794: Set connection var ansible_timeout to 10 44071 1727204610.27807: Set connection var ansible_pipelining to False 44071 1727204610.27825: Set connection var ansible_shell_type to sh 44071 1727204610.27835: Set connection var ansible_shell_executable to /bin/sh 44071 1727204610.27847: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204610.27879: variable 'ansible_shell_executable' from source: unknown 44071 1727204610.27889: variable 'ansible_connection' from source: unknown 44071 1727204610.27897: variable 'ansible_module_compression' from source: unknown 44071 1727204610.27905: variable 'ansible_shell_type' from source: unknown 44071 1727204610.27922: variable 'ansible_shell_executable' from source: unknown 44071 1727204610.27934: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204610.27946: variable 'ansible_pipelining' from source: unknown 44071 1727204610.27957: variable 'ansible_timeout' from source: unknown 44071 1727204610.28037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204610.28220: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204610.28240: variable 'omit' from source: magic vars 44071 1727204610.28255: starting attempt loop 44071 1727204610.28262: running the handler 44071 1727204610.28283: _low_level_execute_command(): starting 44071 1727204610.28296: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204610.29153: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204610.29208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204610.29360: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204610.29518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204610.31292: stdout chunk (state=3): >>>/root <<< 44071 1727204610.31473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204610.31490: stderr chunk (state=3): >>><<< 44071 1727204610.31499: stdout chunk 
(state=3): >>><<< 44071 1727204610.31654: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204610.31659: _low_level_execute_command(): starting 44071 1727204610.31663: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204610.3153534-45138-237863013884931 `" && echo ansible-tmp-1727204610.3153534-45138-237863013884931="` echo /root/.ansible/tmp/ansible-tmp-1727204610.3153534-45138-237863013884931 `" ) && sleep 0' 44071 1727204610.32277: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204610.32295: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204610.32335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204610.32348: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204610.32446: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204610.32473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204610.32496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204610.32600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204610.34630: stdout chunk (state=3): >>>ansible-tmp-1727204610.3153534-45138-237863013884931=/root/.ansible/tmp/ansible-tmp-1727204610.3153534-45138-237863013884931 <<< 44071 1727204610.34973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
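
Before AnsiballZ_stat.py can be copied over, the controller asks the remote shell to create a private per-task directory, which is what the umask 77 / mkdir one-liner and the resulting ansible-tmp-1727204610.3153534-45138-237863013884931 path above are about. The sketch below reproduces the same idea in Python; the function name and the width of the random suffix are guesses inferred from the paths in the log, not taken from Ansible's source.

import os
import random
import time

def make_task_tmpdir(base="~/.ansible/tmp"):
    """Create a private per-task temp directory: the parent is created
    with umask 077 so fresh directories end up owner-only, and the child
    gets a unique name in the ansible-tmp-TIMESTAMP-PID-RANDOM style
    seen in the log above."""
    base = os.path.expanduser(base)
    old_umask = os.umask(0o077)              # counterpart of 'umask 77'
    try:
        os.makedirs(base, exist_ok=True)     # counterpart of 'mkdir -p'
        name = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(),
                                         random.randint(0, 10**15))
        path = os.path.join(base, name)
        os.mkdir(path)                       # plain mkdir: fail if it exists
        return path
    finally:
        os.umask(old_umask)

if __name__ == "__main__":
    print(make_task_tmpdir())

The directory is removed again once the task finishes, which is the 'rm -f -r .../ansible-tmp-...' command visible earlier for the command task.
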
44071 1727204610.34980: stderr chunk (state=3): >>><<< 44071 1727204610.34983: stdout chunk (state=3): >>><<< 44071 1727204610.35089: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204610.3153534-45138-237863013884931=/root/.ansible/tmp/ansible-tmp-1727204610.3153534-45138-237863013884931 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204610.35348: variable 'ansible_module_compression' from source: unknown 44071 1727204610.35433: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 44071 1727204610.35849: variable 'ansible_facts' from source: unknown 44071 1727204610.36475: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204610.3153534-45138-237863013884931/AnsiballZ_stat.py 44071 1727204610.37155: Sending initial data 44071 1727204610.37159: Sent initial data (153 bytes) 44071 1727204610.38843: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204610.39283: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204610.39303: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204610.39709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204610.41380: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports 
extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204610.41448: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204610.41518: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpi1iy_xav /root/.ansible/tmp/ansible-tmp-1727204610.3153534-45138-237863013884931/AnsiballZ_stat.py <<< 44071 1727204610.41533: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204610.3153534-45138-237863013884931/AnsiballZ_stat.py" <<< 44071 1727204610.41598: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpi1iy_xav" to remote "/root/.ansible/tmp/ansible-tmp-1727204610.3153534-45138-237863013884931/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204610.3153534-45138-237863013884931/AnsiballZ_stat.py" <<< 44071 1727204610.43516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204610.43620: stderr chunk (state=3): >>><<< 44071 1727204610.43624: stdout chunk (state=3): >>><<< 44071 1727204610.43653: done transferring module to remote 44071 1727204610.43669: _low_level_execute_command(): starting 44071 1727204610.43673: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204610.3153534-45138-237863013884931/ /root/.ansible/tmp/ansible-tmp-1727204610.3153534-45138-237863013884931/AnsiballZ_stat.py && sleep 0' 44071 1727204610.45876: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204610.45881: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204610.46001: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204610.46022: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204610.46035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204610.46136: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 44071 1727204610.48424: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204610.48429: stdout chunk (state=3): >>><<< 44071 1727204610.48446: stderr chunk (state=3): >>><<< 44071 1727204610.48502: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204610.48515: _low_level_execute_command(): starting 44071 1727204610.48519: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204610.3153534-45138-237863013884931/AnsiballZ_stat.py && sleep 0' 44071 1727204610.49998: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204610.50004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204610.50489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204610.50601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204610.50695: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204610.67197: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 44071 1727204610.68584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared 
connection to 10.31.47.73 closed. <<< 44071 1727204610.68588: stdout chunk (state=3): >>><<< 44071 1727204610.68591: stderr chunk (state=3): >>><<< 44071 1727204610.68745: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204610.68749: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204610.3153534-45138-237863013884931/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204610.68752: _low_level_execute_command(): starting 44071 1727204610.68754: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204610.3153534-45138-237863013884931/ > /dev/null 2>&1 && sleep 0' 44071 1727204610.70212: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204610.70227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204610.70310: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204610.70440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204610.70510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204610.72515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204610.72575: stdout chunk (state=3): >>><<< 44071 1727204610.72579: stderr chunk (state=3): >>><<< 44071 1727204610.72692: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204610.72698: handler run complete 44071 1727204610.72727: attempt loop complete, returning result 44071 1727204610.72749: _execute() done 44071 1727204610.72753: dumping result to json 44071 1727204610.72756: done dumping result, returning 44071 1727204610.72758: done running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr [127b8e07-fff9-c964-7471-000000000691] 44071 1727204610.72760: sending task result for task 127b8e07-fff9-c964-7471-000000000691 44071 1727204610.72945: done sending task result for task 127b8e07-fff9-c964-7471-000000000691 44071 1727204610.72949: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 44071 1727204610.73049: no more pending results, returning what we have 44071 1727204610.73053: results queue empty 44071 1727204610.73054: checking for any_errors_fatal 44071 1727204610.73056: done checking for any_errors_fatal 44071 1727204610.73057: checking for max_fail_percentage 44071 1727204610.73058: done checking for max_fail_percentage 44071 1727204610.73059: checking to see if all hosts have failed and the running result is not ok 44071 1727204610.73060: done checking to see if all hosts have failed 44071 1727204610.73061: getting the remaining hosts for this loop 44071 1727204610.73063: done getting the remaining hosts for this loop 44071 1727204610.73070: getting the next task for host managed-node2 44071 1727204610.73079: done getting next task for host managed-node2 
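The 'Get stat for interface statebr' task above came back ok with stat.exists: false. Based on the module arguments in the '_execute_module (stat, ...)' entry, an equivalent task would look roughly like the sketch below; the actual task file in the collection may differ in wording and in how it registers the result.

```yaml
# Sketch reconstructed from the _execute_module (stat, {...}) arguments
# logged above; 'interface' is 'statebr' in this run, and the result is
# assumed to be registered as interface_stat, the variable read by the
# assert task that follows.
- name: Get stat for interface statebr
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat
```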
44071 1727204610.73082: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 44071 1727204610.73086: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204610.73091: getting variables 44071 1727204610.73093: in VariableManager get_vars() 44071 1727204610.73126: Calling all_inventory to load vars for managed-node2 44071 1727204610.73128: Calling groups_inventory to load vars for managed-node2 44071 1727204610.73131: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204610.73147: Calling all_plugins_play to load vars for managed-node2 44071 1727204610.73150: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204610.73152: Calling groups_plugins_play to load vars for managed-node2 44071 1727204610.75288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204610.77714: done with get_vars() 44071 1727204610.77764: done getting variables 44071 1727204610.77833: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204610.77984: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 15:03:30 -0400 (0:00:00.528) 0:00:23.096 ***** 44071 1727204610.78022: entering _queue_task() for managed-node2/assert 44071 1727204610.78509: worker is 1 (out of 1 available) 44071 1727204610.78635: exiting _queue_task() for managed-node2/assert 44071 1727204610.78651: done queuing things up, now waiting for results queue to drain 44071 1727204610.78653: waiting for pending results... 
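The task queued here comes from assert_device_absent.yml:5 and, as the 'Evaluated conditional (not interface_stat.stat.exists)' entry further down shows, it checks the registered stat result. A minimal sketch of such an assertion (the real task may phrase the failure message differently):

```yaml
# Minimal sketch of the assertion queued at assert_device_absent.yml:5;
# the condition is the one evaluated in the log below, the failure
# message is illustrative.
- name: Assert that the interface is absent - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - not interface_stat.stat.exists
    msg: "Interface {{ interface }} still exists"
```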
44071 1727204610.79092: running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'statebr' 44071 1727204610.79098: in run() - task 127b8e07-fff9-c964-7471-000000000643 44071 1727204610.79102: variable 'ansible_search_path' from source: unknown 44071 1727204610.79105: variable 'ansible_search_path' from source: unknown 44071 1727204610.79108: calling self._execute() 44071 1727204610.79172: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204610.79198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204610.79215: variable 'omit' from source: magic vars 44071 1727204610.79686: variable 'ansible_distribution_major_version' from source: facts 44071 1727204610.79707: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204610.79719: variable 'omit' from source: magic vars 44071 1727204610.79790: variable 'omit' from source: magic vars 44071 1727204610.79918: variable 'interface' from source: play vars 44071 1727204610.79954: variable 'omit' from source: magic vars 44071 1727204610.80007: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204610.80062: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204610.80096: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204610.80120: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204610.80141: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204610.80193: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204610.80274: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204610.80282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204610.80342: Set connection var ansible_connection to ssh 44071 1727204610.80357: Set connection var ansible_timeout to 10 44071 1727204610.80370: Set connection var ansible_pipelining to False 44071 1727204610.80392: Set connection var ansible_shell_type to sh 44071 1727204610.80404: Set connection var ansible_shell_executable to /bin/sh 44071 1727204610.80416: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204610.80449: variable 'ansible_shell_executable' from source: unknown 44071 1727204610.80457: variable 'ansible_connection' from source: unknown 44071 1727204610.80464: variable 'ansible_module_compression' from source: unknown 44071 1727204610.80474: variable 'ansible_shell_type' from source: unknown 44071 1727204610.80481: variable 'ansible_shell_executable' from source: unknown 44071 1727204610.80499: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204610.80571: variable 'ansible_pipelining' from source: unknown 44071 1727204610.80574: variable 'ansible_timeout' from source: unknown 44071 1727204610.80577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204610.80773: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 44071 1727204610.80776: variable 'omit' from source: magic vars 44071 1727204610.80779: starting attempt loop 44071 1727204610.80782: running the handler 44071 1727204610.80954: variable 'interface_stat' from source: set_fact 44071 1727204610.80976: Evaluated conditional (not interface_stat.stat.exists): True 44071 1727204610.80987: handler run complete 44071 1727204610.81007: attempt loop complete, returning result 44071 1727204610.81019: _execute() done 44071 1727204610.81034: dumping result to json 44071 1727204610.81046: done dumping result, returning 44071 1727204610.81128: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'statebr' [127b8e07-fff9-c964-7471-000000000643] 44071 1727204610.81132: sending task result for task 127b8e07-fff9-c964-7471-000000000643 44071 1727204610.81225: done sending task result for task 127b8e07-fff9-c964-7471-000000000643 44071 1727204610.81228: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 44071 1727204610.81296: no more pending results, returning what we have 44071 1727204610.81300: results queue empty 44071 1727204610.81301: checking for any_errors_fatal 44071 1727204610.81313: done checking for any_errors_fatal 44071 1727204610.81314: checking for max_fail_percentage 44071 1727204610.81316: done checking for max_fail_percentage 44071 1727204610.81317: checking to see if all hosts have failed and the running result is not ok 44071 1727204610.81318: done checking to see if all hosts have failed 44071 1727204610.81319: getting the remaining hosts for this loop 44071 1727204610.81320: done getting the remaining hosts for this loop 44071 1727204610.81326: getting the next task for host managed-node2 44071 1727204610.81335: done getting next task for host managed-node2 44071 1727204610.81338: ^ task is: TASK: Test 44071 1727204610.81345: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204610.81349: getting variables 44071 1727204610.81351: in VariableManager get_vars() 44071 1727204610.81392: Calling all_inventory to load vars for managed-node2 44071 1727204610.81395: Calling groups_inventory to load vars for managed-node2 44071 1727204610.81399: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204610.81413: Calling all_plugins_play to load vars for managed-node2 44071 1727204610.81417: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204610.81420: Calling groups_plugins_play to load vars for managed-node2 44071 1727204610.82915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204610.84096: done with get_vars() 44071 1727204610.84122: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Tuesday 24 September 2024 15:03:30 -0400 (0:00:00.061) 0:00:23.158 ***** 44071 1727204610.84206: entering _queue_task() for managed-node2/include_tasks 44071 1727204610.84559: worker is 1 (out of 1 available) 44071 1727204610.84580: exiting _queue_task() for managed-node2/include_tasks 44071 1727204610.84596: done queuing things up, now waiting for results queue to drain 44071 1727204610.84598: waiting for pending results... 44071 1727204610.84852: running TaskExecutor() for managed-node2/TASK: Test 44071 1727204610.84980: in run() - task 127b8e07-fff9-c964-7471-0000000005b8 44071 1727204610.85001: variable 'ansible_search_path' from source: unknown 44071 1727204610.85008: variable 'ansible_search_path' from source: unknown 44071 1727204610.85068: variable 'lsr_test' from source: include params 44071 1727204610.85299: variable 'lsr_test' from source: include params 44071 1727204610.85363: variable 'omit' from source: magic vars 44071 1727204610.85492: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204610.85496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204610.85532: variable 'omit' from source: magic vars 44071 1727204610.85845: variable 'ansible_distribution_major_version' from source: facts 44071 1727204610.85849: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204610.85851: variable 'item' from source: unknown 44071 1727204610.85854: variable 'item' from source: unknown 44071 1727204610.86075: variable 'item' from source: unknown 44071 1727204610.86078: variable 'item' from source: unknown 44071 1727204610.86185: dumping result to json 44071 1727204610.86188: done dumping result, returning 44071 1727204610.86190: done running TaskExecutor() for managed-node2/TASK: Test [127b8e07-fff9-c964-7471-0000000005b8] 44071 1727204610.86192: sending task result for task 127b8e07-fff9-c964-7471-0000000005b8 44071 1727204610.86234: done sending task result for task 127b8e07-fff9-c964-7471-0000000005b8 44071 1727204610.86237: WORKER PROCESS EXITING 44071 1727204610.86259: no more pending results, returning what we have 44071 1727204610.86263: in VariableManager get_vars() 44071 1727204610.86298: Calling all_inventory to load vars for managed-node2 44071 1727204610.86301: Calling groups_inventory to load vars for managed-node2 44071 1727204610.86304: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204610.86315: Calling all_plugins_play to load vars for managed-node2 
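The 'Test' task at run_test.yml:30 is an include_tasks driven by the lsr_test parameter; as the include result just below shows, it expands to a single item, tasks/create_bridge_profile_no_autoconnect.yml. A sketch of that pattern, assuming a simple loop over lsr_test (the real run_test.yml may structure it differently):

```yaml
# Sketch of the include pattern behind the 'Test' task: run_test.yml
# includes each file named in lsr_test, which in this run contains only
# tasks/create_bridge_profile_no_autoconnect.yml. Illustrative only.
- name: Test
  ansible.builtin.include_tasks: "{{ item }}"
  loop: "{{ lsr_test }}"
```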
44071 1727204610.86317: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204610.86320: Calling groups_plugins_play to load vars for managed-node2 44071 1727204610.87701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204610.88905: done with get_vars() 44071 1727204610.88931: variable 'ansible_search_path' from source: unknown 44071 1727204610.88932: variable 'ansible_search_path' from source: unknown 44071 1727204610.88970: we have included files to process 44071 1727204610.88972: generating all_blocks data 44071 1727204610.88974: done generating all_blocks data 44071 1727204610.88978: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml 44071 1727204610.88979: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml 44071 1727204610.88981: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml 44071 1727204610.89224: done processing included file 44071 1727204610.89226: iterating over new_blocks loaded from include file 44071 1727204610.89227: in VariableManager get_vars() 44071 1727204610.89239: done with get_vars() 44071 1727204610.89240: filtering new block on tags 44071 1727204610.89265: done filtering new block on tags 44071 1727204610.89268: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml for managed-node2 => (item=tasks/create_bridge_profile_no_autoconnect.yml) 44071 1727204610.89272: extending task lists for all hosts with included blocks 44071 1727204610.89890: done extending task lists 44071 1727204610.89892: done processing included files 44071 1727204610.89893: results queue empty 44071 1727204610.89894: checking for any_errors_fatal 44071 1727204610.89898: done checking for any_errors_fatal 44071 1727204610.89899: checking for max_fail_percentage 44071 1727204610.89900: done checking for max_fail_percentage 44071 1727204610.89901: checking to see if all hosts have failed and the running result is not ok 44071 1727204610.89902: done checking to see if all hosts have failed 44071 1727204610.89902: getting the remaining hosts for this loop 44071 1727204610.89904: done getting the remaining hosts for this loop 44071 1727204610.89906: getting the next task for host managed-node2 44071 1727204610.89911: done getting next task for host managed-node2 44071 1727204610.89913: ^ task is: TASK: Include network role 44071 1727204610.89916: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204610.89919: getting variables 44071 1727204610.89920: in VariableManager get_vars() 44071 1727204610.89933: Calling all_inventory to load vars for managed-node2 44071 1727204610.89936: Calling groups_inventory to load vars for managed-node2 44071 1727204610.89939: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204610.89946: Calling all_plugins_play to load vars for managed-node2 44071 1727204610.89949: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204610.89952: Calling groups_plugins_play to load vars for managed-node2 44071 1727204610.91432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204610.92647: done with get_vars() 44071 1727204610.92680: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml:3 Tuesday 24 September 2024 15:03:30 -0400 (0:00:00.085) 0:00:23.243 ***** 44071 1727204610.92748: entering _queue_task() for managed-node2/include_role 44071 1727204610.93049: worker is 1 (out of 1 available) 44071 1727204610.93066: exiting _queue_task() for managed-node2/include_role 44071 1727204610.93081: done queuing things up, now waiting for results queue to drain 44071 1727204610.93083: waiting for pending results... 44071 1727204610.93280: running TaskExecutor() for managed-node2/TASK: Include network role 44071 1727204610.93370: in run() - task 127b8e07-fff9-c964-7471-0000000006b1 44071 1727204610.93384: variable 'ansible_search_path' from source: unknown 44071 1727204610.93388: variable 'ansible_search_path' from source: unknown 44071 1727204610.93421: calling self._execute() 44071 1727204610.93499: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204610.93503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204610.93513: variable 'omit' from source: magic vars 44071 1727204610.93828: variable 'ansible_distribution_major_version' from source: facts 44071 1727204610.93841: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204610.93846: _execute() done 44071 1727204610.93849: dumping result to json 44071 1727204610.93853: done dumping result, returning 44071 1727204610.93863: done running TaskExecutor() for managed-node2/TASK: Include network role [127b8e07-fff9-c964-7471-0000000006b1] 44071 1727204610.93866: sending task result for task 127b8e07-fff9-c964-7471-0000000006b1 44071 1727204610.93995: done sending task result for task 127b8e07-fff9-c964-7471-0000000006b1 44071 1727204610.93998: WORKER PROCESS EXITING 44071 1727204610.94031: no more pending results, returning what we have 44071 1727204610.94037: in VariableManager get_vars() 44071 1727204610.94080: Calling all_inventory to load vars for managed-node2 44071 1727204610.94083: Calling groups_inventory to load vars for managed-node2 44071 1727204610.94086: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204610.94100: Calling all_plugins_play to load vars for managed-node2 44071 1727204610.94103: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204610.94106: Calling groups_plugins_play to load vars for 
managed-node2 44071 1727204611.01196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204611.02761: done with get_vars() 44071 1727204611.02788: variable 'ansible_search_path' from source: unknown 44071 1727204611.02790: variable 'ansible_search_path' from source: unknown 44071 1727204611.02958: variable 'omit' from source: magic vars 44071 1727204611.02989: variable 'omit' from source: magic vars 44071 1727204611.02999: variable 'omit' from source: magic vars 44071 1727204611.03002: we have included files to process 44071 1727204611.03003: generating all_blocks data 44071 1727204611.03004: done generating all_blocks data 44071 1727204611.03005: processing included file: fedora.linux_system_roles.network 44071 1727204611.03024: in VariableManager get_vars() 44071 1727204611.03043: done with get_vars() 44071 1727204611.03075: in VariableManager get_vars() 44071 1727204611.03091: done with get_vars() 44071 1727204611.03129: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 44071 1727204611.03252: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 44071 1727204611.03341: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 44071 1727204611.04128: in VariableManager get_vars() 44071 1727204611.04157: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204611.05603: iterating over new_blocks loaded from include file 44071 1727204611.05605: in VariableManager get_vars() 44071 1727204611.05623: done with get_vars() 44071 1727204611.05624: filtering new block on tags 44071 1727204611.05818: done filtering new block on tags 44071 1727204611.05821: in VariableManager get_vars() 44071 1727204611.05832: done with get_vars() 44071 1727204611.05833: filtering new block on tags 44071 1727204611.05847: done filtering new block on tags 44071 1727204611.05849: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 44071 1727204611.05854: extending task lists for all hosts with included blocks 44071 1727204611.05991: done extending task lists 44071 1727204611.05992: done processing included files 44071 1727204611.05993: results queue empty 44071 1727204611.05993: checking for any_errors_fatal 44071 1727204611.05996: done checking for any_errors_fatal 44071 1727204611.05996: checking for max_fail_percentage 44071 1727204611.05997: done checking for max_fail_percentage 44071 1727204611.05997: checking to see if all hosts have failed and the running result is not ok 44071 1727204611.05998: done checking to see if all hosts have failed 44071 1727204611.05999: getting the remaining hosts for this loop 44071 1727204611.06000: done getting the remaining hosts for this loop 44071 1727204611.06001: getting the next task for host managed-node2 44071 1727204611.06004: done getting next task for host managed-node2 44071 1727204611.06006: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204611.06008: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204611.06016: getting variables 44071 1727204611.06017: in VariableManager get_vars() 44071 1727204611.06028: Calling all_inventory to load vars for managed-node2 44071 1727204611.06029: Calling groups_inventory to load vars for managed-node2 44071 1727204611.06031: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204611.06035: Calling all_plugins_play to load vars for managed-node2 44071 1727204611.06036: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204611.06038: Calling groups_plugins_play to load vars for managed-node2 44071 1727204611.07000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204611.09141: done with get_vars() 44071 1727204611.09183: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:03:31 -0400 (0:00:00.165) 0:00:23.409 ***** 44071 1727204611.09282: entering _queue_task() for managed-node2/include_tasks 44071 1727204611.09695: worker is 1 (out of 1 available) 44071 1727204611.09711: exiting _queue_task() for managed-node2/include_tasks 44071 1727204611.09728: done queuing things up, now waiting for results queue to drain 44071 1727204611.09730: waiting for pending results... 
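At this point create_bridge_profile_no_autoconnect.yml:3 has pulled in the fedora.linux_system_roles.network role (note the 'redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf' entry while its meta and tasks were loaded), and the role's first task, 'Ensure ansible_facts used by role', is being queued. A minimal sketch of such a role include; any role variables the real task passes are not visible in this part of the log:

```yaml
# Minimal sketch of the 'Include network role' task resolved above;
# whatever network_* role variables the real task sets are not visible
# in this part of the log and are therefore omitted here.
- name: Include network role
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network
```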
44071 1727204611.10198: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204611.10280: in run() - task 127b8e07-fff9-c964-7471-00000000072f 44071 1727204611.10310: variable 'ansible_search_path' from source: unknown 44071 1727204611.10318: variable 'ansible_search_path' from source: unknown 44071 1727204611.10374: calling self._execute() 44071 1727204611.10508: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204611.10571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204611.10575: variable 'omit' from source: magic vars 44071 1727204611.11004: variable 'ansible_distribution_major_version' from source: facts 44071 1727204611.11025: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204611.11037: _execute() done 44071 1727204611.11050: dumping result to json 44071 1727204611.11059: done dumping result, returning 44071 1727204611.11074: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-c964-7471-00000000072f] 44071 1727204611.11085: sending task result for task 127b8e07-fff9-c964-7471-00000000072f 44071 1727204611.11262: no more pending results, returning what we have 44071 1727204611.11271: in VariableManager get_vars() 44071 1727204611.11322: Calling all_inventory to load vars for managed-node2 44071 1727204611.11325: Calling groups_inventory to load vars for managed-node2 44071 1727204611.11328: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204611.11345: Calling all_plugins_play to load vars for managed-node2 44071 1727204611.11349: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204611.11352: Calling groups_plugins_play to load vars for managed-node2 44071 1727204611.12386: done sending task result for task 127b8e07-fff9-c964-7471-00000000072f 44071 1727204611.12390: WORKER PROCESS EXITING 44071 1727204611.13254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204611.14521: done with get_vars() 44071 1727204611.14548: variable 'ansible_search_path' from source: unknown 44071 1727204611.14549: variable 'ansible_search_path' from source: unknown 44071 1727204611.14587: we have included files to process 44071 1727204611.14588: generating all_blocks data 44071 1727204611.14590: done generating all_blocks data 44071 1727204611.14592: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204611.14593: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204611.14595: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204611.15069: done processing included file 44071 1727204611.15072: iterating over new_blocks loaded from include file 44071 1727204611.15073: in VariableManager get_vars() 44071 1727204611.15106: done with get_vars() 44071 1727204611.15108: filtering new block on tags 44071 1727204611.15141: done filtering new block on tags 44071 1727204611.15143: in VariableManager get_vars() 44071 1727204611.15172: done with get_vars() 44071 1727204611.15175: filtering new block on tags 44071 1727204611.15212: done filtering new block on tags 44071 1727204611.15215: in 
VariableManager get_vars() 44071 1727204611.15239: done with get_vars() 44071 1727204611.15241: filtering new block on tags 44071 1727204611.15272: done filtering new block on tags 44071 1727204611.15274: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 44071 1727204611.15278: extending task lists for all hosts with included blocks 44071 1727204611.17200: done extending task lists 44071 1727204611.17202: done processing included files 44071 1727204611.17203: results queue empty 44071 1727204611.17203: checking for any_errors_fatal 44071 1727204611.17208: done checking for any_errors_fatal 44071 1727204611.17208: checking for max_fail_percentage 44071 1727204611.17210: done checking for max_fail_percentage 44071 1727204611.17210: checking to see if all hosts have failed and the running result is not ok 44071 1727204611.17211: done checking to see if all hosts have failed 44071 1727204611.17212: getting the remaining hosts for this loop 44071 1727204611.17214: done getting the remaining hosts for this loop 44071 1727204611.17217: getting the next task for host managed-node2 44071 1727204611.17223: done getting next task for host managed-node2 44071 1727204611.17227: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204611.17231: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204611.17243: getting variables 44071 1727204611.17244: in VariableManager get_vars() 44071 1727204611.17266: Calling all_inventory to load vars for managed-node2 44071 1727204611.17270: Calling groups_inventory to load vars for managed-node2 44071 1727204611.17272: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204611.17282: Calling all_plugins_play to load vars for managed-node2 44071 1727204611.17285: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204611.17289: Calling groups_plugins_play to load vars for managed-node2 44071 1727204611.18459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204611.19758: done with get_vars() 44071 1727204611.19790: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:03:31 -0400 (0:00:00.105) 0:00:23.515 ***** 44071 1727204611.19881: entering _queue_task() for managed-node2/setup 44071 1727204611.20220: worker is 1 (out of 1 available) 44071 1727204611.20234: exiting _queue_task() for managed-node2/setup 44071 1727204611.20248: done queuing things up, now waiting for results queue to drain 44071 1727204611.20250: waiting for pending results... 44071 1727204611.20501: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204611.20810: in run() - task 127b8e07-fff9-c964-7471-00000000078c 44071 1727204611.20815: variable 'ansible_search_path' from source: unknown 44071 1727204611.20817: variable 'ansible_search_path' from source: unknown 44071 1727204611.20822: calling self._execute() 44071 1727204611.20916: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204611.20920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204611.20924: variable 'omit' from source: magic vars 44071 1727204611.21306: variable 'ansible_distribution_major_version' from source: facts 44071 1727204611.21319: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204611.21563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204611.25146: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204611.25208: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204611.25233: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204611.25312: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204611.25316: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204611.25383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204611.25419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 44071 1727204611.25440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204611.25530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204611.25535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204611.25548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204611.25572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204611.25598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204611.25641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204611.25652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204611.25807: variable '__network_required_facts' from source: role '' defaults 44071 1727204611.25816: variable 'ansible_facts' from source: unknown 44071 1727204611.27280: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 44071 1727204611.27285: when evaluation is False, skipping this task 44071 1727204611.27287: _execute() done 44071 1727204611.27289: dumping result to json 44071 1727204611.27291: done dumping result, returning 44071 1727204611.27293: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-c964-7471-00000000078c] 44071 1727204611.27295: sending task result for task 127b8e07-fff9-c964-7471-00000000078c 44071 1727204611.27371: done sending task result for task 127b8e07-fff9-c964-7471-00000000078c 44071 1727204611.27374: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204611.27427: no more pending results, returning what we have 44071 1727204611.27431: results queue empty 44071 1727204611.27432: checking for any_errors_fatal 44071 1727204611.27433: done checking for any_errors_fatal 44071 1727204611.27434: checking for max_fail_percentage 44071 1727204611.27435: done checking for max_fail_percentage 44071 1727204611.27436: checking to see if all hosts have failed and the running result is not ok 44071 1727204611.27437: done checking to see if all hosts have failed 44071 1727204611.27438: getting the remaining hosts for this loop 44071 1727204611.27442: done getting the remaining hosts for 
this loop 44071 1727204611.27447: getting the next task for host managed-node2 44071 1727204611.27458: done getting next task for host managed-node2 44071 1727204611.27462: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204611.27470: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204611.27491: getting variables 44071 1727204611.27492: in VariableManager get_vars() 44071 1727204611.27531: Calling all_inventory to load vars for managed-node2 44071 1727204611.27534: Calling groups_inventory to load vars for managed-node2 44071 1727204611.27536: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204611.27550: Calling all_plugins_play to load vars for managed-node2 44071 1727204611.27553: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204611.27563: Calling groups_plugins_play to load vars for managed-node2 44071 1727204611.32571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204611.37455: done with get_vars() 44071 1727204611.37588: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:03:31 -0400 (0:00:00.179) 0:00:23.695 ***** 44071 1727204611.37878: entering _queue_task() for managed-node2/stat 44071 1727204611.38769: worker is 1 (out of 1 available) 44071 1727204611.38787: exiting _queue_task() for managed-node2/stat 44071 1727204611.38808: done queuing things up, now waiting for results queue to drain 44071 1727204611.38810: waiting for pending results... 
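The two tasks around this point, 'Ensure ansible_facts used by role are present' (skipped above because no fact listed in __network_required_facts was missing) and 'Check if system is ostree' (skipped just below because __network_is_ostree is already set from an earlier play), follow the same guard pattern: run the expensive probe only when its result is not already available. A sketch of that pattern, with the 'when' expressions copied from the 'Evaluated conditional' entries and the module arguments only assumed:

```yaml
# Guard pattern behind the two skipped tasks near this point in the log.
# The 'when' expressions are the ones evaluated in the log; the setup
# arguments and the ostree probe path are assumptions for illustration.
- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0

- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted      # assumed probe path, not shown in this log section
  register: __ostree_booted_stat  # hypothetical register name
  when: not __network_is_ostree is defined
```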
44071 1727204611.39309: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204611.39492: in run() - task 127b8e07-fff9-c964-7471-00000000078e 44071 1727204611.39572: variable 'ansible_search_path' from source: unknown 44071 1727204611.39576: variable 'ansible_search_path' from source: unknown 44071 1727204611.39715: calling self._execute() 44071 1727204611.39944: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204611.39949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204611.40080: variable 'omit' from source: magic vars 44071 1727204611.40882: variable 'ansible_distribution_major_version' from source: facts 44071 1727204611.40891: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204611.41430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204611.41931: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204611.41987: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204611.42025: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204611.42061: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204611.42531: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204611.42554: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204611.42585: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204611.43148: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204611.43256: variable '__network_is_ostree' from source: set_fact 44071 1727204611.43272: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204611.43276: when evaluation is False, skipping this task 44071 1727204611.43278: _execute() done 44071 1727204611.43470: dumping result to json 44071 1727204611.43474: done dumping result, returning 44071 1727204611.43478: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-c964-7471-00000000078e] 44071 1727204611.43480: sending task result for task 127b8e07-fff9-c964-7471-00000000078e 44071 1727204611.43554: done sending task result for task 127b8e07-fff9-c964-7471-00000000078e 44071 1727204611.43557: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204611.43624: no more pending results, returning what we have 44071 1727204611.43629: results queue empty 44071 1727204611.43630: checking for any_errors_fatal 44071 1727204611.43642: done checking for any_errors_fatal 44071 1727204611.43643: checking for 
max_fail_percentage 44071 1727204611.43645: done checking for max_fail_percentage 44071 1727204611.43646: checking to see if all hosts have failed and the running result is not ok 44071 1727204611.43647: done checking to see if all hosts have failed 44071 1727204611.43648: getting the remaining hosts for this loop 44071 1727204611.43649: done getting the remaining hosts for this loop 44071 1727204611.43655: getting the next task for host managed-node2 44071 1727204611.43675: done getting next task for host managed-node2 44071 1727204611.43680: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204611.43688: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204611.43709: getting variables 44071 1727204611.43711: in VariableManager get_vars() 44071 1727204611.43755: Calling all_inventory to load vars for managed-node2 44071 1727204611.43758: Calling groups_inventory to load vars for managed-node2 44071 1727204611.43761: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204611.43980: Calling all_plugins_play to load vars for managed-node2 44071 1727204611.43984: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204611.43989: Calling groups_plugins_play to load vars for managed-node2 44071 1727204611.47867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204611.53591: done with get_vars() 44071 1727204611.53627: done getting variables 44071 1727204611.53812: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:03:31 -0400 (0:00:00.159) 0:00:23.855 ***** 44071 1727204611.53856: entering _queue_task() for managed-node2/set_fact 44071 1727204611.54783: worker is 1 (out of 1 available) 44071 1727204611.54799: exiting _queue_task() for managed-node2/set_fact 44071 1727204611.54813: done queuing things up, now waiting for results queue to drain 44071 1727204611.54814: waiting for pending results... 44071 1727204611.55685: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204611.56177: in run() - task 127b8e07-fff9-c964-7471-00000000078f 44071 1727204611.56225: variable 'ansible_search_path' from source: unknown 44071 1727204611.56395: variable 'ansible_search_path' from source: unknown 44071 1727204611.56399: calling self._execute() 44071 1727204611.56887: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204611.56892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204611.56896: variable 'omit' from source: magic vars 44071 1727204611.58191: variable 'ansible_distribution_major_version' from source: facts 44071 1727204611.58287: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204611.59176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204611.60087: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204611.60184: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204611.60388: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204611.60442: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204611.60696: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204611.60734: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204611.60909: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204611.60913: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204611.61146: variable '__network_is_ostree' from source: set_fact 44071 1727204611.61160: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204611.61171: when evaluation is False, skipping this task 44071 1727204611.61179: _execute() done 44071 1727204611.61346: dumping result to json 44071 1727204611.61350: done dumping result, returning 44071 1727204611.61352: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-c964-7471-00000000078f] 44071 1727204611.61354: sending task result for task 127b8e07-fff9-c964-7471-00000000078f 44071 1727204611.61434: done sending task result for task 127b8e07-fff9-c964-7471-00000000078f 44071 1727204611.61437: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204611.61506: no more pending results, returning what we have 44071 1727204611.61511: results queue empty 44071 1727204611.61512: checking for any_errors_fatal 44071 1727204611.61520: done checking for any_errors_fatal 44071 1727204611.61521: checking for max_fail_percentage 44071 1727204611.61522: done checking for max_fail_percentage 44071 1727204611.61523: checking to see if all hosts have failed and the running result is not ok 44071 1727204611.61524: done checking to see if all hosts have failed 44071 1727204611.61525: getting the remaining hosts for this loop 44071 1727204611.61527: done getting the remaining hosts for this loop 44071 1727204611.61533: getting the next task for host managed-node2 44071 1727204611.61548: done getting next task for host managed-node2 44071 1727204611.61553: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204611.61559: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204611.61580: getting variables 44071 1727204611.61583: in VariableManager get_vars() 44071 1727204611.61624: Calling all_inventory to load vars for managed-node2 44071 1727204611.61626: Calling groups_inventory to load vars for managed-node2 44071 1727204611.61628: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204611.61642: Calling all_plugins_play to load vars for managed-node2 44071 1727204611.61645: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204611.61648: Calling groups_plugins_play to load vars for managed-node2 44071 1727204611.67132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204611.70259: done with get_vars() 44071 1727204611.70302: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:03:31 -0400 (0:00:00.166) 0:00:24.021 ***** 44071 1727204611.70507: entering _queue_task() for managed-node2/service_facts 44071 1727204611.71545: worker is 1 (out of 1 available) 44071 1727204611.71561: exiting _queue_task() for managed-node2/service_facts 44071 1727204611.71580: done queuing things up, now waiting for results queue to drain 44071 1727204611.71582: waiting for pending results... 
44071 1727204611.72086: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204611.72643: in run() - task 127b8e07-fff9-c964-7471-000000000791 44071 1727204611.72649: variable 'ansible_search_path' from source: unknown 44071 1727204611.72652: variable 'ansible_search_path' from source: unknown 44071 1727204611.72654: calling self._execute() 44071 1727204611.72657: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204611.72660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204611.72662: variable 'omit' from source: magic vars 44071 1727204611.73549: variable 'ansible_distribution_major_version' from source: facts 44071 1727204611.73767: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204611.73774: variable 'omit' from source: magic vars 44071 1727204611.73895: variable 'omit' from source: magic vars 44071 1727204611.73944: variable 'omit' from source: magic vars 44071 1727204611.74031: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204611.74132: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204611.74257: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204611.74295: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204611.74387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204611.74445: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204611.74470: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204611.74479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204611.74685: Set connection var ansible_connection to ssh 44071 1727204611.74698: Set connection var ansible_timeout to 10 44071 1727204611.74708: Set connection var ansible_pipelining to False 44071 1727204611.74718: Set connection var ansible_shell_type to sh 44071 1727204611.74728: Set connection var ansible_shell_executable to /bin/sh 44071 1727204611.74856: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204611.74859: variable 'ansible_shell_executable' from source: unknown 44071 1727204611.74862: variable 'ansible_connection' from source: unknown 44071 1727204611.74867: variable 'ansible_module_compression' from source: unknown 44071 1727204611.74870: variable 'ansible_shell_type' from source: unknown 44071 1727204611.74872: variable 'ansible_shell_executable' from source: unknown 44071 1727204611.74875: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204611.74877: variable 'ansible_pipelining' from source: unknown 44071 1727204611.74879: variable 'ansible_timeout' from source: unknown 44071 1727204611.74882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204611.75079: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204611.75097: variable 'omit' from source: magic vars 44071 
1727204611.75107: starting attempt loop 44071 1727204611.75114: running the handler 44071 1727204611.75134: _low_level_execute_command(): starting 44071 1727204611.75151: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204611.75927: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204611.75948: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204611.75964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204611.76081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204611.76103: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204611.76210: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204611.77997: stdout chunk (state=3): >>>/root <<< 44071 1727204611.78174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204611.78187: stdout chunk (state=3): >>><<< 44071 1727204611.78418: stderr chunk (state=3): >>><<< 44071 1727204611.78424: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204611.78427: _low_level_execute_command(): starting 44071 1727204611.78431: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204611.7830782-45204-244484853932438 `" && echo 
ansible-tmp-1727204611.7830782-45204-244484853932438="` echo /root/.ansible/tmp/ansible-tmp-1727204611.7830782-45204-244484853932438 `" ) && sleep 0' 44071 1727204611.79673: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204611.79815: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204611.79818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204611.79821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44071 1727204611.79831: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204611.79834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204611.79836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204611.79992: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204611.80126: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204611.80194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204611.82193: stdout chunk (state=3): >>>ansible-tmp-1727204611.7830782-45204-244484853932438=/root/.ansible/tmp/ansible-tmp-1727204611.7830782-45204-244484853932438 <<< 44071 1727204611.82307: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204611.82523: stderr chunk (state=3): >>><<< 44071 1727204611.82527: stdout chunk (state=3): >>><<< 44071 1727204611.82547: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204611.7830782-45204-244484853932438=/root/.ansible/tmp/ansible-tmp-1727204611.7830782-45204-244484853932438 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 44071 1727204611.82772: variable 'ansible_module_compression' from source: unknown 44071 1727204611.82776: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 44071 1727204611.83172: variable 'ansible_facts' from source: unknown 44071 1727204611.83176: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204611.7830782-45204-244484853932438/AnsiballZ_service_facts.py 44071 1727204611.83461: Sending initial data 44071 1727204611.83467: Sent initial data (162 bytes) 44071 1727204611.84985: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204611.85021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204611.85061: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204611.85077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204611.85208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204611.85592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204611.85995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204611.87307: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204611.87404: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204611.87551: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp6lb70fm8 /root/.ansible/tmp/ansible-tmp-1727204611.7830782-45204-244484853932438/AnsiballZ_service_facts.py <<< 44071 1727204611.87555: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204611.7830782-45204-244484853932438/AnsiballZ_service_facts.py" <<< 44071 1727204611.87713: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp6lb70fm8" to remote "/root/.ansible/tmp/ansible-tmp-1727204611.7830782-45204-244484853932438/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204611.7830782-45204-244484853932438/AnsiballZ_service_facts.py" <<< 44071 1727204611.89262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204611.89447: stderr chunk (state=3): >>><<< 44071 1727204611.89457: stdout chunk (state=3): >>><<< 44071 1727204611.89487: done transferring module to remote 44071 1727204611.89771: _low_level_execute_command(): starting 44071 1727204611.89776: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204611.7830782-45204-244484853932438/ /root/.ansible/tmp/ansible-tmp-1727204611.7830782-45204-244484853932438/AnsiballZ_service_facts.py && sleep 0' 44071 1727204611.90985: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204611.90990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204611.91006: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204611.91184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204611.91218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204611.91369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204611.93214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204611.93331: stderr chunk (state=3): >>><<< 44071 1727204611.93582: stdout chunk (state=3): >>><<< 44071 1727204611.93586: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204611.93588: _low_level_execute_command(): starting 44071 1727204611.93590: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204611.7830782-45204-244484853932438/AnsiballZ_service_facts.py && sleep 0' 44071 1727204611.94493: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204611.94672: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204611.94676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204611.94679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204611.94681: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204611.94683: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204611.94685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204611.94687: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204611.94690: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204611.94692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204611.94749: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204611.94770: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204611.94777: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204611.94973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204614.16669: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped<<< 44071 1727204614.16751: stdout chunk (state=3): >>>", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": 
"systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, 
"wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.se<<< 44071 1727204614.16779: stdout chunk (state=3): >>>rvice", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": 
"systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 44071 1727204614.18416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204614.18420: stdout chunk (state=3): >>><<< 44071 1727204614.18487: stderr chunk (state=3): >>><<< 44071 1727204614.18494: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": 
{"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": 
"systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", 
"status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204614.19817: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204611.7830782-45204-244484853932438/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204614.19827: _low_level_execute_command(): starting 44071 1727204614.19830: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204611.7830782-45204-244484853932438/ > /dev/null 2>&1 && sleep 0' 44071 1727204614.20680: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204614.20685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204614.20688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204614.20690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204614.20692: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204614.20695: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204614.20697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204614.20699: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204614.20701: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204614.20703: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44071 1727204614.20705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204614.20707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204614.20709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204614.20712: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204614.20714: stderr chunk (state=3): >>>debug2: match found <<< 44071 1727204614.20716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204614.20718: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204614.20720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204614.20747: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204614.20861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 44071 1727204614.22947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204614.22952: stdout chunk (state=3): >>><<< 44071 1727204614.22954: stderr chunk (state=3): >>><<< 44071 1727204614.23081: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204614.23085: handler run complete 44071 1727204614.23263: variable 'ansible_facts' from source: unknown 44071 1727204614.23492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204614.24168: variable 'ansible_facts' from source: unknown 44071 1727204614.24357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204614.24672: attempt loop complete, returning result 44071 1727204614.24687: _execute() done 44071 1727204614.24738: dumping result to json 44071 1727204614.24790: done dumping result, returning 44071 1727204614.24807: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-c964-7471-000000000791] 44071 1727204614.24818: sending task result for task 127b8e07-fff9-c964-7471-000000000791 44071 1727204614.26551: done sending task result for task 127b8e07-fff9-c964-7471-000000000791 44071 1727204614.26555: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204614.26670: no more pending results, returning what we have 44071 1727204614.26673: results queue empty 44071 1727204614.26674: checking for any_errors_fatal 44071 1727204614.26678: done checking for any_errors_fatal 44071 1727204614.26679: checking for max_fail_percentage 44071 1727204614.26681: done checking for max_fail_percentage 44071 1727204614.26681: checking to see if all hosts have failed and the running result is not ok 44071 1727204614.26682: done checking to see if all hosts have failed 44071 1727204614.26683: getting the remaining hosts for this loop 44071 1727204614.26684: done getting the remaining hosts for this loop 44071 1727204614.26688: getting the next task for host managed-node2 44071 1727204614.26695: done getting next task for host managed-node2 44071 1727204614.26699: ^ task is: TASK: fedora.linux_system_roles.network : 
Check which packages are installed 44071 1727204614.26705: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204614.26720: getting variables 44071 1727204614.26722: in VariableManager get_vars() 44071 1727204614.26750: Calling all_inventory to load vars for managed-node2 44071 1727204614.26753: Calling groups_inventory to load vars for managed-node2 44071 1727204614.26763: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204614.26775: Calling all_plugins_play to load vars for managed-node2 44071 1727204614.26779: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204614.26782: Calling groups_plugins_play to load vars for managed-node2 44071 1727204614.28532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204614.30950: done with get_vars() 44071 1727204614.30996: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:03:34 -0400 (0:00:02.606) 0:00:26.627 ***** 44071 1727204614.31118: entering _queue_task() for managed-node2/package_facts 44071 1727204614.31553: worker is 1 (out of 1 available) 44071 1727204614.31786: exiting _queue_task() for managed-node2/package_facts 44071 1727204614.31797: done queuing things up, now waiting for results queue to drain 44071 1727204614.31799: waiting for pending results... 
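[Editor's note] The task that just finished in the trace above is the role's "Check which services are running" step. A minimal, hypothetical sketch of that pattern (not the role's actual task file) is shown below; ansible.builtin.service_facts populates ansible_facts.services with exactly the per-service name/state/status/source entries seen in the stdout dump, and no_log: true is what makes the result print as "censored". The debug consumer at the end is purely illustrative.

    # Hypothetical sketch of the pattern traced above; host name taken from this log.
    - hosts: managed-node2
      gather_facts: false
      tasks:
        - name: Check which services are running          # mirrors the role task in the trace
          ansible.builtin.service_facts:
          no_log: true                                     # why the result is shown as "censored"

        - name: List services reported as running          # illustrative consumer, not part of the role
          ansible.builtin.debug:
            msg: "{{ ansible_facts.services | dict2items | selectattr('value.state', 'equalto', 'running') | map(attribute='key') | list }}"
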
44071 1727204614.31930: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204614.32174: in run() - task 127b8e07-fff9-c964-7471-000000000792 44071 1727204614.32179: variable 'ansible_search_path' from source: unknown 44071 1727204614.32182: variable 'ansible_search_path' from source: unknown 44071 1727204614.32218: calling self._execute() 44071 1727204614.32342: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204614.32370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204614.32432: variable 'omit' from source: magic vars 44071 1727204614.32834: variable 'ansible_distribution_major_version' from source: facts 44071 1727204614.32855: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204614.32878: variable 'omit' from source: magic vars 44071 1727204614.32980: variable 'omit' from source: magic vars 44071 1727204614.33027: variable 'omit' from source: magic vars 44071 1727204614.33088: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204614.33140: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204614.33194: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204614.33200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204614.33224: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204614.33260: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204614.33330: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204614.33333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204614.33600: Set connection var ansible_connection to ssh 44071 1727204614.33604: Set connection var ansible_timeout to 10 44071 1727204614.33607: Set connection var ansible_pipelining to False 44071 1727204614.33609: Set connection var ansible_shell_type to sh 44071 1727204614.33612: Set connection var ansible_shell_executable to /bin/sh 44071 1727204614.33614: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204614.33616: variable 'ansible_shell_executable' from source: unknown 44071 1727204614.33618: variable 'ansible_connection' from source: unknown 44071 1727204614.33622: variable 'ansible_module_compression' from source: unknown 44071 1727204614.33624: variable 'ansible_shell_type' from source: unknown 44071 1727204614.33626: variable 'ansible_shell_executable' from source: unknown 44071 1727204614.33628: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204614.33649: variable 'ansible_pipelining' from source: unknown 44071 1727204614.33657: variable 'ansible_timeout' from source: unknown 44071 1727204614.33667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204614.33929: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204614.33949: variable 'omit' from source: magic vars 44071 
1727204614.33959: starting attempt loop 44071 1727204614.33968: running the handler 44071 1727204614.33988: _low_level_execute_command(): starting 44071 1727204614.34000: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204614.34847: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204614.34915: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204614.34986: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204614.34992: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204614.35048: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204614.35068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204614.35100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204614.35206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204614.37002: stdout chunk (state=3): >>>/root <<< 44071 1727204614.37208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204614.37212: stdout chunk (state=3): >>><<< 44071 1727204614.37214: stderr chunk (state=3): >>><<< 44071 1727204614.37242: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204614.37363: _low_level_execute_command(): starting 44071 1727204614.37370: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204614.3725157-45411-3663028373352 `" && echo ansible-tmp-1727204614.3725157-45411-3663028373352="` echo /root/.ansible/tmp/ansible-tmp-1727204614.3725157-45411-3663028373352 `" ) && sleep 0' 44071 1727204614.38012: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204614.38033: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204614.38053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204614.38076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204614.38093: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204614.38153: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204614.38227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204614.38261: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204614.38377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204614.40404: stdout chunk (state=3): >>>ansible-tmp-1727204614.3725157-45411-3663028373352=/root/.ansible/tmp/ansible-tmp-1727204614.3725157-45411-3663028373352 <<< 44071 1727204614.40597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204614.40624: stderr chunk (state=3): >>><<< 44071 1727204614.40636: stdout chunk (state=3): >>><<< 44071 1727204614.40671: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204614.3725157-45411-3663028373352=/root/.ansible/tmp/ansible-tmp-1727204614.3725157-45411-3663028373352 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 44071 1727204614.40745: variable 'ansible_module_compression' from source: unknown 44071 1727204614.40813: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 44071 1727204614.40893: variable 'ansible_facts' from source: unknown 44071 1727204614.41138: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204614.3725157-45411-3663028373352/AnsiballZ_package_facts.py 44071 1727204614.41406: Sending initial data 44071 1727204614.41410: Sent initial data (160 bytes) 44071 1727204614.42052: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204614.42082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204614.42097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204614.42191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204614.42230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204614.42253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204614.42291: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204614.42396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204614.44079: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204614.44164: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204614.44247: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpao6t2duz /root/.ansible/tmp/ansible-tmp-1727204614.3725157-45411-3663028373352/AnsiballZ_package_facts.py <<< 44071 1727204614.44257: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204614.3725157-45411-3663028373352/AnsiballZ_package_facts.py" <<< 44071 1727204614.44320: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpao6t2duz" to remote "/root/.ansible/tmp/ansible-tmp-1727204614.3725157-45411-3663028373352/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204614.3725157-45411-3663028373352/AnsiballZ_package_facts.py" <<< 44071 1727204614.46083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204614.46137: stderr chunk (state=3): >>><<< 44071 1727204614.46210: stdout chunk (state=3): >>><<< 44071 1727204614.46213: done transferring module to remote 44071 1727204614.46217: _low_level_execute_command(): starting 44071 1727204614.46219: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204614.3725157-45411-3663028373352/ /root/.ansible/tmp/ansible-tmp-1727204614.3725157-45411-3663028373352/AnsiballZ_package_facts.py && sleep 0' 44071 1727204614.46991: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204614.47050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204614.47073: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204614.47117: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204614.47214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204614.49204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204614.49209: stdout chunk (state=3): >>><<< 44071 1727204614.49212: stderr chunk (state=3): >>><<< 44071 1727204614.49233: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204614.49246: _low_level_execute_command(): starting 44071 1727204614.49341: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204614.3725157-45411-3663028373352/AnsiballZ_package_facts.py && sleep 0' 44071 1727204614.50000: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204614.50030: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204614.50050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204614.50073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204614.50093: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204614.50134: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204614.50179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204614.50243: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204614.50277: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204614.50314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204614.50403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204615.12948: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": 
"hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": 
[{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": 
"6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"na<<< 44071 1727204615.12977: stdout chunk (state=3): >>>me": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": 
[{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", 
"version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.<<< 44071 1727204615.13011: stdout chunk (state=3): >>>fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-l<<< 44071 1727204615.13042: stdout chunk (state=3): >>>ibs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", 
"version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": 
"238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lib<<< 44071 1727204615.13070: stdout chunk (state=3): >>>xmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": 
"1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_<<< 44071 1727204615.13079: stdout chunk (state=3): >>>64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", 
"release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": <<< 44071 1727204615.13084: stdout chunk (state=3): >>>"x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "<<< 44071 1727204615.13112: stdout chunk (state=3): >>>rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50<<< 44071 1727204615.13143: stdout chunk (state=3): >>>, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": 
[{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoc<<< 44071 1727204615.13173: stdout chunk (state=3): >>>h": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "s<<< 44071 1727204615.13180: stdout chunk (state=3): >>>ource": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": 
[{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 44071 1727204615.15056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204615.15113: stderr chunk (state=3): >>><<< 44071 1727204615.15117: stdout chunk (state=3): >>><<< 44071 1727204615.15167: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": 
"libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", 
"release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": 
[{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": 
"3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", 
"version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", 
"version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": 
"1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": 
"wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": 
"python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204615.17016: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204614.3725157-45411-3663028373352/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204615.17034: _low_level_execute_command(): starting 44071 1727204615.17037: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204614.3725157-45411-3663028373352/ > /dev/null 2>&1 && sleep 0' 44071 1727204615.17572: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204615.17576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204615.17580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204615.17635: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204615.17641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204615.17645: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204615.17715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204615.19657: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204615.19725: stderr chunk (state=3): >>><<< 44071 1727204615.19729: stdout chunk (state=3): >>><<< 44071 1727204615.19747: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204615.19751: handler run complete 44071 1727204615.20452: variable 'ansible_facts' from source: unknown 44071 1727204615.20824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204615.22332: variable 'ansible_facts' from source: unknown 44071 1727204615.22695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204615.23270: attempt loop complete, returning result 44071 1727204615.23285: _execute() done 44071 1727204615.23288: dumping result to json 44071 1727204615.23447: done dumping result, returning 44071 1727204615.23456: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-c964-7471-000000000792] 44071 1727204615.23461: sending task result for task 127b8e07-fff9-c964-7471-000000000792 44071 1727204615.25479: done sending task result for task 127b8e07-fff9-c964-7471-000000000792 44071 1727204615.25483: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204615.25589: no more pending results, returning what we have 44071 1727204615.25591: results queue empty 44071 1727204615.25592: checking for any_errors_fatal 44071 1727204615.25596: done checking for any_errors_fatal 44071 1727204615.25596: checking for max_fail_percentage 44071 1727204615.25597: done checking for max_fail_percentage 44071 1727204615.25598: checking to see if all hosts have failed and the running result is not ok 44071 1727204615.25598: done checking to 
see if all hosts have failed 44071 1727204615.25599: getting the remaining hosts for this loop 44071 1727204615.25600: done getting the remaining hosts for this loop 44071 1727204615.25603: getting the next task for host managed-node2 44071 1727204615.25609: done getting next task for host managed-node2 44071 1727204615.25612: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204615.25616: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204615.25625: getting variables 44071 1727204615.25627: in VariableManager get_vars() 44071 1727204615.25653: Calling all_inventory to load vars for managed-node2 44071 1727204615.25655: Calling groups_inventory to load vars for managed-node2 44071 1727204615.25656: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204615.25664: Calling all_plugins_play to load vars for managed-node2 44071 1727204615.25667: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204615.25669: Calling groups_plugins_play to load vars for managed-node2 44071 1727204615.26625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204615.27954: done with get_vars() 44071 1727204615.27978: done getting variables 44071 1727204615.28036: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:03:35 -0400 (0:00:00.969) 0:00:27.597 ***** 44071 1727204615.28071: entering _queue_task() for managed-node2/debug 44071 1727204615.28361: worker is 1 (out of 1 available) 44071 1727204615.28379: exiting _queue_task() for managed-node2/debug 44071 1727204615.28392: done queuing things up, now waiting for results queue to drain 44071 1727204615.28394: waiting for pending results... 
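
The task result above illustrates the shape of package_facts data: ansible_facts.packages maps each package name to a list of {name, version, release, epoch, arch, source} entries, and the result itself was censored in the summary because no_log was set. As a point of reference only, a minimal standalone playbook in the same spirit might look like the sketch below. This is not the actual fedora.linux_system_roles.network task file (its contents are not shown in this log); the provider-selection expression is purely illustrative, since the role derives network_provider via set_fact elsewhere, as the "variable 'network_provider' from source: set_fact" entries that follow indicate.

# Hypothetical sketch mirroring the two tasks seen in this log segment;
# parameter values (manager: auto, strategy: first) are taken from the
# module invocation recorded above.
- hosts: managed-node2        # host name as used in this run
  gather_facts: false
  tasks:
    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto
        strategy: first
      no_log: true            # matches the censored result shown above

    - name: Print network provider
      ansible.builtin.debug:
        msg: >-
          Using network provider:
          {{ 'nm' if 'NetworkManager' in ansible_facts.packages else 'initscripts' }}

Run with ansible-playbook against the same inventory, this would produce a debug message of the same form as the "Using network provider: nm" output recorded further down in the log.
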
44071 1727204615.28607: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204615.28702: in run() - task 127b8e07-fff9-c964-7471-000000000730 44071 1727204615.28715: variable 'ansible_search_path' from source: unknown 44071 1727204615.28719: variable 'ansible_search_path' from source: unknown 44071 1727204615.28759: calling self._execute() 44071 1727204615.28855: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204615.28858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204615.28870: variable 'omit' from source: magic vars 44071 1727204615.29297: variable 'ansible_distribution_major_version' from source: facts 44071 1727204615.29302: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204615.29305: variable 'omit' from source: magic vars 44071 1727204615.29430: variable 'omit' from source: magic vars 44071 1727204615.29672: variable 'network_provider' from source: set_fact 44071 1727204615.29676: variable 'omit' from source: magic vars 44071 1727204615.29679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204615.29682: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204615.29684: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204615.29686: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204615.29688: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204615.29691: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204615.29693: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204615.29695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204615.29792: Set connection var ansible_connection to ssh 44071 1727204615.29797: Set connection var ansible_timeout to 10 44071 1727204615.29811: Set connection var ansible_pipelining to False 44071 1727204615.29821: Set connection var ansible_shell_type to sh 44071 1727204615.29823: Set connection var ansible_shell_executable to /bin/sh 44071 1727204615.29826: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204615.29851: variable 'ansible_shell_executable' from source: unknown 44071 1727204615.29855: variable 'ansible_connection' from source: unknown 44071 1727204615.29858: variable 'ansible_module_compression' from source: unknown 44071 1727204615.29860: variable 'ansible_shell_type' from source: unknown 44071 1727204615.29862: variable 'ansible_shell_executable' from source: unknown 44071 1727204615.29865: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204615.29869: variable 'ansible_pipelining' from source: unknown 44071 1727204615.29871: variable 'ansible_timeout' from source: unknown 44071 1727204615.29877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204615.30033: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 44071 1727204615.30046: variable 'omit' from source: magic vars 44071 1727204615.30049: starting attempt loop 44071 1727204615.30052: running the handler 44071 1727204615.30104: handler run complete 44071 1727204615.30121: attempt loop complete, returning result 44071 1727204615.30124: _execute() done 44071 1727204615.30127: dumping result to json 44071 1727204615.30130: done dumping result, returning 44071 1727204615.30141: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-c964-7471-000000000730] 44071 1727204615.30153: sending task result for task 127b8e07-fff9-c964-7471-000000000730 ok: [managed-node2] => {} MSG: Using network provider: nm 44071 1727204615.30320: no more pending results, returning what we have 44071 1727204615.30324: results queue empty 44071 1727204615.30324: checking for any_errors_fatal 44071 1727204615.30337: done checking for any_errors_fatal 44071 1727204615.30338: checking for max_fail_percentage 44071 1727204615.30340: done checking for max_fail_percentage 44071 1727204615.30341: checking to see if all hosts have failed and the running result is not ok 44071 1727204615.30341: done checking to see if all hosts have failed 44071 1727204615.30342: getting the remaining hosts for this loop 44071 1727204615.30344: done getting the remaining hosts for this loop 44071 1727204615.30348: getting the next task for host managed-node2 44071 1727204615.30358: done getting next task for host managed-node2 44071 1727204615.30362: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204615.30368: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204615.30379: getting variables 44071 1727204615.30381: in VariableManager get_vars() 44071 1727204615.30416: Calling all_inventory to load vars for managed-node2 44071 1727204615.30418: Calling groups_inventory to load vars for managed-node2 44071 1727204615.30420: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204615.30431: Calling all_plugins_play to load vars for managed-node2 44071 1727204615.30434: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204615.30437: Calling groups_plugins_play to load vars for managed-node2 44071 1727204615.31003: done sending task result for task 127b8e07-fff9-c964-7471-000000000730 44071 1727204615.31518: WORKER PROCESS EXITING 44071 1727204615.31636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204615.32869: done with get_vars() 44071 1727204615.32901: done getting variables 44071 1727204615.32954: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:03:35 -0400 (0:00:00.049) 0:00:27.646 ***** 44071 1727204615.32995: entering _queue_task() for managed-node2/fail 44071 1727204615.33332: worker is 1 (out of 1 available) 44071 1727204615.33352: exiting _queue_task() for managed-node2/fail 44071 1727204615.33368: done queuing things up, now waiting for results queue to drain 44071 1727204615.33370: waiting for pending results... 
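The next task queued is a guarded fail, "Abort applying the network state configuration if using the `network_state` variable with the initscripts provider" (tasks/main.yml:11); the trace below skips it because network_state != {} evaluates to False. A sketch of that pattern follows; the message wording and the presumed provider check are assumptions, only the network_state condition is confirmed by the log:

- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying the network state configuration with the initscripts provider is not supported.  # wording assumed
  when: network_state != {}  # the log confirms only this condition; the role presumably also checks the provider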
44071 1727204615.33626: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204615.33779: in run() - task 127b8e07-fff9-c964-7471-000000000731 44071 1727204615.33791: variable 'ansible_search_path' from source: unknown 44071 1727204615.33801: variable 'ansible_search_path' from source: unknown 44071 1727204615.33928: calling self._execute() 44071 1727204615.33937: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204615.33947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204615.33971: variable 'omit' from source: magic vars 44071 1727204615.34386: variable 'ansible_distribution_major_version' from source: facts 44071 1727204615.34572: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204615.34575: variable 'network_state' from source: role '' defaults 44071 1727204615.34579: Evaluated conditional (network_state != {}): False 44071 1727204615.34582: when evaluation is False, skipping this task 44071 1727204615.34585: _execute() done 44071 1727204615.34589: dumping result to json 44071 1727204615.34592: done dumping result, returning 44071 1727204615.34595: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-c964-7471-000000000731] 44071 1727204615.34598: sending task result for task 127b8e07-fff9-c964-7471-000000000731 44071 1727204615.34678: done sending task result for task 127b8e07-fff9-c964-7471-000000000731 44071 1727204615.34681: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204615.34735: no more pending results, returning what we have 44071 1727204615.34740: results queue empty 44071 1727204615.34741: checking for any_errors_fatal 44071 1727204615.34749: done checking for any_errors_fatal 44071 1727204615.34749: checking for max_fail_percentage 44071 1727204615.34751: done checking for max_fail_percentage 44071 1727204615.34752: checking to see if all hosts have failed and the running result is not ok 44071 1727204615.34752: done checking to see if all hosts have failed 44071 1727204615.34753: getting the remaining hosts for this loop 44071 1727204615.34755: done getting the remaining hosts for this loop 44071 1727204615.34760: getting the next task for host managed-node2 44071 1727204615.34771: done getting next task for host managed-node2 44071 1727204615.34775: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204615.34786: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204615.34809: getting variables 44071 1727204615.34811: in VariableManager get_vars() 44071 1727204615.34846: Calling all_inventory to load vars for managed-node2 44071 1727204615.34849: Calling groups_inventory to load vars for managed-node2 44071 1727204615.34851: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204615.34862: Calling all_plugins_play to load vars for managed-node2 44071 1727204615.34864: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204615.34955: Calling groups_plugins_play to load vars for managed-node2 44071 1727204615.36253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204615.37474: done with get_vars() 44071 1727204615.37508: done getting variables 44071 1727204615.37562: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:03:35 -0400 (0:00:00.045) 0:00:27.692 ***** 44071 1727204615.37594: entering _queue_task() for managed-node2/fail 44071 1727204615.37893: worker is 1 (out of 1 available) 44071 1727204615.37910: exiting _queue_task() for managed-node2/fail 44071 1727204615.37924: done queuing things up, now waiting for results queue to drain 44071 1727204615.37926: waiting for pending results... 
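Its companion task, "Abort applying the network state configuration if the system version of the managed host is below 8" (tasks/main.yml:18), is skipped on the same network_state != {} condition below, so the version guard implied by its name is never reached in this run. A speculative sketch of that shape:

- name: Abort applying the network state configuration if the system version of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying the network state configuration requires a managed host running EL 8 or later.  # wording assumed
  when:
    - network_state != {}
    - ansible_distribution_major_version | int < 8  # implied by the task name; this run skips on the first condition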
44071 1727204615.38327: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204615.38333: in run() - task 127b8e07-fff9-c964-7471-000000000732 44071 1727204615.38440: variable 'ansible_search_path' from source: unknown 44071 1727204615.38447: variable 'ansible_search_path' from source: unknown 44071 1727204615.38451: calling self._execute() 44071 1727204615.38549: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204615.38553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204615.38556: variable 'omit' from source: magic vars 44071 1727204615.38954: variable 'ansible_distribution_major_version' from source: facts 44071 1727204615.38969: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204615.39140: variable 'network_state' from source: role '' defaults 44071 1727204615.39145: Evaluated conditional (network_state != {}): False 44071 1727204615.39149: when evaluation is False, skipping this task 44071 1727204615.39151: _execute() done 44071 1727204615.39154: dumping result to json 44071 1727204615.39158: done dumping result, returning 44071 1727204615.39168: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-c964-7471-000000000732] 44071 1727204615.39171: sending task result for task 127b8e07-fff9-c964-7471-000000000732 44071 1727204615.39421: done sending task result for task 127b8e07-fff9-c964-7471-000000000732 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204615.39478: no more pending results, returning what we have 44071 1727204615.39482: results queue empty 44071 1727204615.39484: checking for any_errors_fatal 44071 1727204615.39491: done checking for any_errors_fatal 44071 1727204615.39492: checking for max_fail_percentage 44071 1727204615.39493: done checking for max_fail_percentage 44071 1727204615.39494: checking to see if all hosts have failed and the running result is not ok 44071 1727204615.39495: done checking to see if all hosts have failed 44071 1727204615.39496: getting the remaining hosts for this loop 44071 1727204615.39498: done getting the remaining hosts for this loop 44071 1727204615.39502: getting the next task for host managed-node2 44071 1727204615.39511: done getting next task for host managed-node2 44071 1727204615.39516: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204615.39522: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204615.39538: WORKER PROCESS EXITING 44071 1727204615.39677: getting variables 44071 1727204615.39680: in VariableManager get_vars() 44071 1727204615.39720: Calling all_inventory to load vars for managed-node2 44071 1727204615.39724: Calling groups_inventory to load vars for managed-node2 44071 1727204615.39726: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204615.39737: Calling all_plugins_play to load vars for managed-node2 44071 1727204615.39743: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204615.39747: Calling groups_plugins_play to load vars for managed-node2 44071 1727204615.41482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204615.44295: done with get_vars() 44071 1727204615.44331: done getting variables 44071 1727204615.44609: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:03:35 -0400 (0:00:00.070) 0:00:27.762 ***** 44071 1727204615.44653: entering _queue_task() for managed-node2/fail 44071 1727204615.45308: worker is 1 (out of 1 available) 44071 1727204615.45321: exiting _queue_task() for managed-node2/fail 44071 1727204615.45333: done queuing things up, now waiting for results queue to drain 44071 1727204615.45334: waiting for pending results... 
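For the teaming abort (tasks/main.yml:25), the trace below evaluates two conditions, ansible_distribution_major_version | int > 9 (True here) and ansible_distribution in __network_rh_distros (False here), and skips the task. A sketch of a fail task guarded that way, with the message wording assumed:

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later.  # wording assumed
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
    # the role presumably also checks whether a team connection is actually requested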
44071 1727204615.45511: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204615.45694: in run() - task 127b8e07-fff9-c964-7471-000000000733 44071 1727204615.45721: variable 'ansible_search_path' from source: unknown 44071 1727204615.45729: variable 'ansible_search_path' from source: unknown 44071 1727204615.45788: calling self._execute() 44071 1727204615.45906: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204615.45971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204615.45976: variable 'omit' from source: magic vars 44071 1727204615.46405: variable 'ansible_distribution_major_version' from source: facts 44071 1727204615.46428: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204615.46638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204615.49330: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204615.49503: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204615.49507: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204615.49556: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204615.49596: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204615.49700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204615.49742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204615.49775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204615.49872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204615.49876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204615.49971: variable 'ansible_distribution_major_version' from source: facts 44071 1727204615.49994: Evaluated conditional (ansible_distribution_major_version | int > 9): True 44071 1727204615.50138: variable 'ansible_distribution' from source: facts 44071 1727204615.50158: variable '__network_rh_distros' from source: role '' defaults 44071 1727204615.50177: Evaluated conditional (ansible_distribution in __network_rh_distros): False 44071 1727204615.50185: when evaluation is False, skipping this task 44071 1727204615.50192: _execute() done 44071 1727204615.50200: dumping result to json 44071 1727204615.50259: done dumping result, returning 44071 1727204615.50263: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-c964-7471-000000000733] 44071 1727204615.50267: sending task result for task 127b8e07-fff9-c964-7471-000000000733 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 44071 1727204615.50417: no more pending results, returning what we have 44071 1727204615.50423: results queue empty 44071 1727204615.50424: checking for any_errors_fatal 44071 1727204615.50432: done checking for any_errors_fatal 44071 1727204615.50433: checking for max_fail_percentage 44071 1727204615.50435: done checking for max_fail_percentage 44071 1727204615.50436: checking to see if all hosts have failed and the running result is not ok 44071 1727204615.50437: done checking to see if all hosts have failed 44071 1727204615.50437: getting the remaining hosts for this loop 44071 1727204615.50441: done getting the remaining hosts for this loop 44071 1727204615.50447: getting the next task for host managed-node2 44071 1727204615.50455: done getting next task for host managed-node2 44071 1727204615.50459: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204615.50466: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204615.50486: getting variables 44071 1727204615.50488: in VariableManager get_vars() 44071 1727204615.50526: Calling all_inventory to load vars for managed-node2 44071 1727204615.50529: Calling groups_inventory to load vars for managed-node2 44071 1727204615.50531: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204615.50547: Calling all_plugins_play to load vars for managed-node2 44071 1727204615.50550: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204615.50554: Calling groups_plugins_play to load vars for managed-node2 44071 1727204615.51322: done sending task result for task 127b8e07-fff9-c964-7471-000000000733 44071 1727204615.51326: WORKER PROCESS EXITING 44071 1727204615.52626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204615.54828: done with get_vars() 44071 1727204615.54876: done getting variables 44071 1727204615.54951: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:03:35 -0400 (0:00:00.103) 0:00:27.866 ***** 44071 1727204615.54996: entering _queue_task() for managed-node2/dnf 44071 1727204615.55412: worker is 1 (out of 1 available) 44071 1727204615.55427: exiting _queue_task() for managed-node2/dnf 44071 1727204615.55444: done queuing things up, now waiting for results queue to drain 44071 1727204615.55445: waiting for pending results... 
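The DNF update check (tasks/main.yml:36) is skipped below because neither __network_wireless_connections_defined nor __network_team_connections_defined holds for the network_connections in play. A rough sketch of such a check; the module parameters and check_mode usage are assumptions, the two when conditions are taken from the log:

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"
    state: latest
  check_mode: true  # assumed: report available updates without installing anything
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined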
44071 1727204615.55807: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204615.55980: in run() - task 127b8e07-fff9-c964-7471-000000000734 44071 1727204615.56008: variable 'ansible_search_path' from source: unknown 44071 1727204615.56017: variable 'ansible_search_path' from source: unknown 44071 1727204615.56070: calling self._execute() 44071 1727204615.56182: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204615.56199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204615.56214: variable 'omit' from source: magic vars 44071 1727204615.56645: variable 'ansible_distribution_major_version' from source: facts 44071 1727204615.56667: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204615.56911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204615.59484: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204615.59590: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204615.59636: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204615.59686: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204615.59720: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204615.59815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204615.59853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204615.59889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204615.59944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204615.59963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204615.60107: variable 'ansible_distribution' from source: facts 44071 1727204615.60123: variable 'ansible_distribution_major_version' from source: facts 44071 1727204615.60192: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 44071 1727204615.60279: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204615.60433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204615.60473: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204615.60603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204615.60627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204615.60653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204615.60710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204615.60753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204615.60787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204615.60847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204615.60871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204615.60941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204615.60969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204615.61049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204615.61058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204615.61081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204615.61296: variable 'network_connections' from source: include params 44071 1727204615.61475: variable 'interface' from source: play vars 44071 1727204615.61517: variable 'interface' from source: play vars 44071 1727204615.61671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204615.61972: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204615.62051: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204615.62136: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204615.62209: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204615.62486: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204615.63477: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204615.63490: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204615.63493: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204615.63636: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204615.64082: variable 'network_connections' from source: include params 44071 1727204615.64096: variable 'interface' from source: play vars 44071 1727204615.64188: variable 'interface' from source: play vars 44071 1727204615.64246: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204615.64255: when evaluation is False, skipping this task 44071 1727204615.64373: _execute() done 44071 1727204615.64376: dumping result to json 44071 1727204615.64379: done dumping result, returning 44071 1727204615.64457: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000000734] 44071 1727204615.64461: sending task result for task 127b8e07-fff9-c964-7471-000000000734 44071 1727204615.64554: done sending task result for task 127b8e07-fff9-c964-7471-000000000734 44071 1727204615.64557: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204615.64626: no more pending results, returning what we have 44071 1727204615.64630: results queue empty 44071 1727204615.64631: checking for any_errors_fatal 44071 1727204615.64642: done checking for any_errors_fatal 44071 1727204615.64643: checking for max_fail_percentage 44071 1727204615.64645: done checking for max_fail_percentage 44071 1727204615.64646: checking to see if all hosts have failed and the running result is not ok 44071 1727204615.64647: done checking to see if all hosts have failed 44071 1727204615.64648: getting the remaining hosts for this loop 44071 1727204615.64650: done getting the remaining hosts for this loop 44071 1727204615.64655: getting the next task for host managed-node2 44071 1727204615.64665: done getting next task for host managed-node2 44071 1727204615.64803: ^ task is: TASK: fedora.linux_system_roles.network : Check if 
updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204615.64809: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204615.64830: getting variables 44071 1727204615.64832: in VariableManager get_vars() 44071 1727204615.64883: Calling all_inventory to load vars for managed-node2 44071 1727204615.64886: Calling groups_inventory to load vars for managed-node2 44071 1727204615.64888: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204615.64901: Calling all_plugins_play to load vars for managed-node2 44071 1727204615.64904: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204615.64907: Calling groups_plugins_play to load vars for managed-node2 44071 1727204615.76985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204615.79370: done with get_vars() 44071 1727204615.79417: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204615.79509: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:03:35 -0400 (0:00:00.245) 0:00:28.111 ***** 44071 1727204615.79544: entering _queue_task() for managed-node2/yum 44071 1727204615.80187: worker is 1 (out of 1 available) 44071 1727204615.80200: exiting _queue_task() for managed-node2/yum 44071 1727204615.80213: done queuing things up, now waiting for results queue to drain 44071 1727204615.80216: waiting for pending results... 
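The YUM variant of the same check (tasks/main.yml:48) is the pre-EL8 counterpart; note the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" entry above, since on current Ansible the yum action is simply redirected to dnf. It is skipped below because ansible_distribution_major_version | int < 8 is False. A sketch with the parameters and the wireless/team guard assumed to mirror the DNF task:

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ network_packages }}"
    state: latest
  check_mode: true  # assumed, mirroring the DNF variant
  when:
    - ansible_distribution_major_version | int < 8
    - __network_wireless_connections_defined or __network_team_connections_defined  # assumed; this run skips on the version check first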
44071 1727204615.80783: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204615.80810: in run() - task 127b8e07-fff9-c964-7471-000000000735 44071 1727204615.80835: variable 'ansible_search_path' from source: unknown 44071 1727204615.80847: variable 'ansible_search_path' from source: unknown 44071 1727204615.80911: calling self._execute() 44071 1727204615.81013: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204615.81020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204615.81028: variable 'omit' from source: magic vars 44071 1727204615.81359: variable 'ansible_distribution_major_version' from source: facts 44071 1727204615.81373: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204615.81514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204615.84819: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204615.84937: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204615.85036: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204615.85054: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204615.85099: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204615.85209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204615.85272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204615.85362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204615.85369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204615.85395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204615.85522: variable 'ansible_distribution_major_version' from source: facts 44071 1727204615.85552: Evaluated conditional (ansible_distribution_major_version | int < 8): False 44071 1727204615.85561: when evaluation is False, skipping this task 44071 1727204615.85580: _execute() done 44071 1727204615.85593: dumping result to json 44071 1727204615.85601: done dumping result, returning 44071 1727204615.85618: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000000735] 44071 
1727204615.85684: sending task result for task 127b8e07-fff9-c964-7471-000000000735 44071 1727204615.85788: done sending task result for task 127b8e07-fff9-c964-7471-000000000735 44071 1727204615.85791: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 44071 1727204615.85863: no more pending results, returning what we have 44071 1727204615.85869: results queue empty 44071 1727204615.85871: checking for any_errors_fatal 44071 1727204615.85883: done checking for any_errors_fatal 44071 1727204615.85884: checking for max_fail_percentage 44071 1727204615.85886: done checking for max_fail_percentage 44071 1727204615.85887: checking to see if all hosts have failed and the running result is not ok 44071 1727204615.85888: done checking to see if all hosts have failed 44071 1727204615.85889: getting the remaining hosts for this loop 44071 1727204615.85891: done getting the remaining hosts for this loop 44071 1727204615.85896: getting the next task for host managed-node2 44071 1727204615.85908: done getting next task for host managed-node2 44071 1727204615.85912: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204615.85918: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204615.85948: getting variables 44071 1727204615.85950: in VariableManager get_vars() 44071 1727204615.86220: Calling all_inventory to load vars for managed-node2 44071 1727204615.86223: Calling groups_inventory to load vars for managed-node2 44071 1727204615.86225: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204615.86237: Calling all_plugins_play to load vars for managed-node2 44071 1727204615.86243: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204615.86246: Calling groups_plugins_play to load vars for managed-node2 44071 1727204615.88434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204615.90964: done with get_vars() 44071 1727204615.90994: done getting variables 44071 1727204615.91047: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:03:35 -0400 (0:00:00.115) 0:00:28.227 ***** 44071 1727204615.91079: entering _queue_task() for managed-node2/fail 44071 1727204615.91367: worker is 1 (out of 1 available) 44071 1727204615.91384: exiting _queue_task() for managed-node2/fail 44071 1727204615.91397: done queuing things up, now waiting for results queue to drain 44071 1727204615.91399: waiting for pending results... 
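"Ask user's consent to restart NetworkManager due to wireless or team interfaces" (tasks/main.yml:60) is another guarded fail and is skipped below for the same wireless/team reason. The consent mechanism itself is not visible in this log, so the sketch only shows the guard; the message wording is assumed:

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: Managing wireless or team interfaces requires restarting NetworkManager.  # wording assumed
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined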
44071 1727204615.91620: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204615.91736: in run() - task 127b8e07-fff9-c964-7471-000000000736 44071 1727204615.91751: variable 'ansible_search_path' from source: unknown 44071 1727204615.91755: variable 'ansible_search_path' from source: unknown 44071 1727204615.91793: calling self._execute() 44071 1727204615.91877: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204615.91884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204615.91893: variable 'omit' from source: magic vars 44071 1727204615.92216: variable 'ansible_distribution_major_version' from source: facts 44071 1727204615.92227: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204615.92329: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204615.92487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204615.94497: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204615.94560: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204615.94595: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204615.94621: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204615.94641: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204615.94713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204615.94734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204615.94756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204615.94787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204615.94804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204615.94838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204615.94858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204615.94882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204615.94913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204615.94924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204615.94958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204615.94977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204615.94995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204615.95024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204615.95037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204615.95169: variable 'network_connections' from source: include params 44071 1727204615.95181: variable 'interface' from source: play vars 44071 1727204615.95239: variable 'interface' from source: play vars 44071 1727204615.95299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204615.95423: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204615.95456: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204615.95495: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204615.95518: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204615.95554: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204615.95576: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204615.95595: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204615.95613: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204615.95668: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204615.95850: variable 'network_connections' 
from source: include params 44071 1727204615.95855: variable 'interface' from source: play vars 44071 1727204615.95910: variable 'interface' from source: play vars 44071 1727204615.95937: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204615.95941: when evaluation is False, skipping this task 44071 1727204615.95944: _execute() done 44071 1727204615.95949: dumping result to json 44071 1727204615.95952: done dumping result, returning 44071 1727204615.95961: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000000736] 44071 1727204615.95966: sending task result for task 127b8e07-fff9-c964-7471-000000000736 44071 1727204615.96072: done sending task result for task 127b8e07-fff9-c964-7471-000000000736 44071 1727204615.96074: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204615.96145: no more pending results, returning what we have 44071 1727204615.96149: results queue empty 44071 1727204615.96150: checking for any_errors_fatal 44071 1727204615.96155: done checking for any_errors_fatal 44071 1727204615.96155: checking for max_fail_percentage 44071 1727204615.96157: done checking for max_fail_percentage 44071 1727204615.96158: checking to see if all hosts have failed and the running result is not ok 44071 1727204615.96158: done checking to see if all hosts have failed 44071 1727204615.96159: getting the remaining hosts for this loop 44071 1727204615.96161: done getting the remaining hosts for this loop 44071 1727204615.96168: getting the next task for host managed-node2 44071 1727204615.96176: done getting next task for host managed-node2 44071 1727204615.96180: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 44071 1727204615.96191: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204615.96210: getting variables 44071 1727204615.96211: in VariableManager get_vars() 44071 1727204615.96248: Calling all_inventory to load vars for managed-node2 44071 1727204615.96251: Calling groups_inventory to load vars for managed-node2 44071 1727204615.96253: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204615.96263: Calling all_plugins_play to load vars for managed-node2 44071 1727204615.96269: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204615.96272: Calling groups_plugins_play to load vars for managed-node2 44071 1727204615.97407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204615.98635: done with get_vars() 44071 1727204615.98661: done getting variables 44071 1727204615.98714: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:03:35 -0400 (0:00:00.076) 0:00:28.303 ***** 44071 1727204615.98743: entering _queue_task() for managed-node2/package 44071 1727204615.99031: worker is 1 (out of 1 available) 44071 1727204615.99049: exiting _queue_task() for managed-node2/package 44071 1727204615.99063: done queuing things up, now waiting for results queue to drain 44071 1727204615.99064: waiting for pending results... 44071 1727204615.99272: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 44071 1727204615.99392: in run() - task 127b8e07-fff9-c964-7471-000000000737 44071 1727204615.99412: variable 'ansible_search_path' from source: unknown 44071 1727204615.99416: variable 'ansible_search_path' from source: unknown 44071 1727204615.99447: calling self._execute() 44071 1727204615.99533: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204615.99537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204615.99545: variable 'omit' from source: magic vars 44071 1727204615.99879: variable 'ansible_distribution_major_version' from source: facts 44071 1727204615.99890: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204616.00047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204616.00258: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204616.00302: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204616.00328: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204616.00394: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204616.00490: variable 'network_packages' from source: role '' defaults 44071 1727204616.00577: variable '__network_provider_setup' from source: role '' defaults 44071 1727204616.00588: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204616.00642: variable 
'__network_service_name_default_nm' from source: role '' defaults 44071 1727204616.00648: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204616.00696: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204616.00830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204616.02332: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204616.02386: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204616.02415: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204616.02444: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204616.02464: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204616.02671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204616.02694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204616.02715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204616.02745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204616.02756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204616.02795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204616.02816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204616.02836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204616.02871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204616.02878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204616.03044: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204616.03131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204616.03148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204616.03168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204616.03195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204616.03206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204616.03280: variable 'ansible_python' from source: facts 44071 1727204616.03294: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204616.03357: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204616.03417: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204616.03512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204616.03529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204616.03548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204616.03583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204616.03594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204616.03631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204616.03652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204616.03673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204616.03702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204616.03713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204616.03828: variable 'network_connections' from source: include params 44071 1727204616.03834: variable 'interface' from source: play vars 44071 1727204616.03916: variable 'interface' from source: play vars 44071 1727204616.03977: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204616.04020: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204616.04045: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204616.04068: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204616.04111: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204616.04306: variable 'network_connections' from source: include params 44071 1727204616.04310: variable 'interface' from source: play vars 44071 1727204616.04392: variable 'interface' from source: play vars 44071 1727204616.04450: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204616.04510: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204616.04724: variable 'network_connections' from source: include params 44071 1727204616.04728: variable 'interface' from source: play vars 44071 1727204616.04780: variable 'interface' from source: play vars 44071 1727204616.04801: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204616.04987: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204616.05211: variable 'network_connections' from source: include params 44071 1727204616.05215: variable 'interface' from source: play vars 44071 1727204616.05313: variable 'interface' from source: play vars 44071 1727204616.05549: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204616.05552: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204616.05554: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204616.05557: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204616.05722: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204616.06278: variable 'network_connections' from source: include params 44071 1727204616.06282: variable 'interface' from source: play vars 44071 1727204616.06355: variable 'interface' from source: play vars 44071 1727204616.06404: variable 'ansible_distribution' from source: facts 44071 1727204616.06407: variable '__network_rh_distros' from source: role '' defaults 44071 1727204616.06409: variable 'ansible_distribution_major_version' from source: facts 44071 1727204616.06412: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204616.06692: variable 'ansible_distribution' from source: 
facts 44071 1727204616.06696: variable '__network_rh_distros' from source: role '' defaults 44071 1727204616.06699: variable 'ansible_distribution_major_version' from source: facts 44071 1727204616.06701: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204616.06930: variable 'ansible_distribution' from source: facts 44071 1727204616.06933: variable '__network_rh_distros' from source: role '' defaults 44071 1727204616.06936: variable 'ansible_distribution_major_version' from source: facts 44071 1727204616.06939: variable 'network_provider' from source: set_fact 44071 1727204616.06941: variable 'ansible_facts' from source: unknown 44071 1727204616.08174: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 44071 1727204616.08179: when evaluation is False, skipping this task 44071 1727204616.08182: _execute() done 44071 1727204616.08185: dumping result to json 44071 1727204616.08187: done dumping result, returning 44071 1727204616.08190: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-c964-7471-000000000737] 44071 1727204616.08192: sending task result for task 127b8e07-fff9-c964-7471-000000000737 44071 1727204616.08281: done sending task result for task 127b8e07-fff9-c964-7471-000000000737 44071 1727204616.08285: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 44071 1727204616.08348: no more pending results, returning what we have 44071 1727204616.08352: results queue empty 44071 1727204616.08353: checking for any_errors_fatal 44071 1727204616.08362: done checking for any_errors_fatal 44071 1727204616.08363: checking for max_fail_percentage 44071 1727204616.08367: done checking for max_fail_percentage 44071 1727204616.08368: checking to see if all hosts have failed and the running result is not ok 44071 1727204616.08369: done checking to see if all hosts have failed 44071 1727204616.08370: getting the remaining hosts for this loop 44071 1727204616.08372: done getting the remaining hosts for this loop 44071 1727204616.08378: getting the next task for host managed-node2 44071 1727204616.08387: done getting next task for host managed-node2 44071 1727204616.08391: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204616.08400: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204616.08420: getting variables 44071 1727204616.08423: in VariableManager get_vars() 44071 1727204616.08592: Calling all_inventory to load vars for managed-node2 44071 1727204616.08599: Calling groups_inventory to load vars for managed-node2 44071 1727204616.08602: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204616.08615: Calling all_plugins_play to load vars for managed-node2 44071 1727204616.08618: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204616.08621: Calling groups_plugins_play to load vars for managed-node2 44071 1727204616.10330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204616.12234: done with get_vars() 44071 1727204616.12273: done getting variables 44071 1727204616.12334: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:03:36 -0400 (0:00:00.136) 0:00:28.440 ***** 44071 1727204616.12374: entering _queue_task() for managed-node2/package 44071 1727204616.12747: worker is 1 (out of 1 available) 44071 1727204616.12761: exiting _queue_task() for managed-node2/package 44071 1727204616.12776: done queuing things up, now waiting for results queue to drain 44071 1727204616.12778: waiting for pending results... 
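The "Install packages" entry a few records above skipped because the conditional (not network_packages is subset(ansible_facts.packages.keys())) evaluated False, i.e. every package requested by the role already appears in the package facts gathered earlier in the run, so the package manager is never invoked. A minimal sketch of that skip pattern, assuming a plain ansible.builtin.package task guarded by the builtin subset test; this is an illustration only, not the role's actual source at roles/network/tasks/main.yml:73:

- name: Gather the package facts the guard below relies on
  ansible.builtin.package_facts:
    manager: auto

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  # Runs only when at least one entry of network_packages is missing from
  # the gathered package facts; in this run the test returned False, which
  # is exactly the skip recorded above.
  when: not network_packages is subset(ansible_facts.packages.keys())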
44071 1727204616.12996: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204616.13103: in run() - task 127b8e07-fff9-c964-7471-000000000738 44071 1727204616.13116: variable 'ansible_search_path' from source: unknown 44071 1727204616.13119: variable 'ansible_search_path' from source: unknown 44071 1727204616.13156: calling self._execute() 44071 1727204616.13243: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204616.13249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204616.13258: variable 'omit' from source: magic vars 44071 1727204616.13575: variable 'ansible_distribution_major_version' from source: facts 44071 1727204616.13588: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204616.13721: variable 'network_state' from source: role '' defaults 44071 1727204616.13725: Evaluated conditional (network_state != {}): False 44071 1727204616.13728: when evaluation is False, skipping this task 44071 1727204616.13730: _execute() done 44071 1727204616.13732: dumping result to json 44071 1727204616.13735: done dumping result, returning 44071 1727204616.13738: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-c964-7471-000000000738] 44071 1727204616.13744: sending task result for task 127b8e07-fff9-c964-7471-000000000738 44071 1727204616.13945: done sending task result for task 127b8e07-fff9-c964-7471-000000000738 44071 1727204616.13948: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204616.14136: no more pending results, returning what we have 44071 1727204616.14140: results queue empty 44071 1727204616.14141: checking for any_errors_fatal 44071 1727204616.14147: done checking for any_errors_fatal 44071 1727204616.14148: checking for max_fail_percentage 44071 1727204616.14150: done checking for max_fail_percentage 44071 1727204616.14151: checking to see if all hosts have failed and the running result is not ok 44071 1727204616.14152: done checking to see if all hosts have failed 44071 1727204616.14153: getting the remaining hosts for this loop 44071 1727204616.14154: done getting the remaining hosts for this loop 44071 1727204616.14160: getting the next task for host managed-node2 44071 1727204616.14171: done getting next task for host managed-node2 44071 1727204616.14175: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204616.14181: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204616.14199: getting variables 44071 1727204616.14200: in VariableManager get_vars() 44071 1727204616.14236: Calling all_inventory to load vars for managed-node2 44071 1727204616.14239: Calling groups_inventory to load vars for managed-node2 44071 1727204616.14242: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204616.14253: Calling all_plugins_play to load vars for managed-node2 44071 1727204616.14256: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204616.14259: Calling groups_plugins_play to load vars for managed-node2 44071 1727204616.15903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204616.17960: done with get_vars() 44071 1727204616.17992: done getting variables 44071 1727204616.18043: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:03:36 -0400 (0:00:00.057) 0:00:28.497 ***** 44071 1727204616.18077: entering _queue_task() for managed-node2/package 44071 1727204616.18360: worker is 1 (out of 1 available) 44071 1727204616.18379: exiting _queue_task() for managed-node2/package 44071 1727204616.18394: done queuing things up, now waiting for results queue to drain 44071 1727204616.18396: waiting for pending results... 
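The task skipped just above is gated on network_state, whose role default compares equal to an empty mapping, so on a run that only manages network_connections the gate (network_state != {}) evaluates False and the NetworkManager/nmstate install never runs. A hypothetical sketch of that gate, with the package names inferred from the task title rather than taken from the role source:

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}    # False in this run, hence the skip recorded above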
44071 1727204616.18885: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204616.18896: in run() - task 127b8e07-fff9-c964-7471-000000000739 44071 1727204616.18899: variable 'ansible_search_path' from source: unknown 44071 1727204616.18902: variable 'ansible_search_path' from source: unknown 44071 1727204616.18905: calling self._execute() 44071 1727204616.18951: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204616.18963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204616.18978: variable 'omit' from source: magic vars 44071 1727204616.19391: variable 'ansible_distribution_major_version' from source: facts 44071 1727204616.19410: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204616.19550: variable 'network_state' from source: role '' defaults 44071 1727204616.19590: Evaluated conditional (network_state != {}): False 44071 1727204616.19599: when evaluation is False, skipping this task 44071 1727204616.19605: _execute() done 44071 1727204616.19612: dumping result to json 44071 1727204616.19618: done dumping result, returning 44071 1727204616.19773: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-c964-7471-000000000739] 44071 1727204616.19886: sending task result for task 127b8e07-fff9-c964-7471-000000000739 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204616.20057: no more pending results, returning what we have 44071 1727204616.20061: results queue empty 44071 1727204616.20062: checking for any_errors_fatal 44071 1727204616.20073: done checking for any_errors_fatal 44071 1727204616.20073: checking for max_fail_percentage 44071 1727204616.20075: done checking for max_fail_percentage 44071 1727204616.20269: checking to see if all hosts have failed and the running result is not ok 44071 1727204616.20271: done checking to see if all hosts have failed 44071 1727204616.20272: getting the remaining hosts for this loop 44071 1727204616.20274: done getting the remaining hosts for this loop 44071 1727204616.20281: getting the next task for host managed-node2 44071 1727204616.20291: done getting next task for host managed-node2 44071 1727204616.20297: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204616.20305: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204616.20329: getting variables 44071 1727204616.20332: in VariableManager get_vars() 44071 1727204616.20383: Calling all_inventory to load vars for managed-node2 44071 1727204616.20387: Calling groups_inventory to load vars for managed-node2 44071 1727204616.20389: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204616.20396: done sending task result for task 127b8e07-fff9-c964-7471-000000000739 44071 1727204616.20399: WORKER PROCESS EXITING 44071 1727204616.20411: Calling all_plugins_play to load vars for managed-node2 44071 1727204616.20415: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204616.20418: Calling groups_plugins_play to load vars for managed-node2 44071 1727204616.22087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204616.23323: done with get_vars() 44071 1727204616.23354: done getting variables 44071 1727204616.23433: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:03:36 -0400 (0:00:00.053) 0:00:28.551 ***** 44071 1727204616.23475: entering _queue_task() for managed-node2/service 44071 1727204616.23879: worker is 1 (out of 1 available) 44071 1727204616.23894: exiting _queue_task() for managed-node2/service 44071 1727204616.23910: done queuing things up, now waiting for results queue to drain 44071 1727204616.23912: waiting for pending results... 
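The task queued here ("Restart NetworkManager due to wireless or team interfaces") is evaluated in the entries that follow: the role only bounces the service when the requested connections define wireless or team interfaces, and neither flag is set for the single interface coming from play vars. A hedged sketch of the shape of such a guard, assuming a plain service restart; the real task at roles/network/tasks/main.yml:109 resolves the service name through role variables:

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager    # assumption: the role derives the real name from its defaults
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined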
44071 1727204616.24297: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204616.24480: in run() - task 127b8e07-fff9-c964-7471-00000000073a 44071 1727204616.24497: variable 'ansible_search_path' from source: unknown 44071 1727204616.24501: variable 'ansible_search_path' from source: unknown 44071 1727204616.24554: calling self._execute() 44071 1727204616.24681: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204616.24688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204616.24698: variable 'omit' from source: magic vars 44071 1727204616.25183: variable 'ansible_distribution_major_version' from source: facts 44071 1727204616.25202: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204616.25356: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204616.25587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204616.27998: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204616.28141: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204616.28146: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204616.28179: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204616.28208: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204616.28305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204616.28337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204616.28378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204616.28467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204616.28472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204616.28498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204616.28523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204616.28553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 44071 1727204616.28603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204616.28670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204616.28674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204616.28696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204616.28724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204616.28770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204616.28792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204616.28998: variable 'network_connections' from source: include params 44071 1727204616.29018: variable 'interface' from source: play vars 44071 1727204616.29101: variable 'interface' from source: play vars 44071 1727204616.29231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204616.29397: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204616.29654: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204616.29694: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204616.29870: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204616.29874: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204616.29876: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204616.29879: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204616.29881: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204616.29919: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204616.30202: variable 'network_connections' from source: include params 44071 1727204616.30214: variable 'interface' 
from source: play vars 44071 1727204616.30285: variable 'interface' from source: play vars 44071 1727204616.30325: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204616.30330: when evaluation is False, skipping this task 44071 1727204616.30333: _execute() done 44071 1727204616.30337: dumping result to json 44071 1727204616.30340: done dumping result, returning 44071 1727204616.30356: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-00000000073a] 44071 1727204616.30361: sending task result for task 127b8e07-fff9-c964-7471-00000000073a skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204616.30542: no more pending results, returning what we have 44071 1727204616.30547: results queue empty 44071 1727204616.30548: checking for any_errors_fatal 44071 1727204616.30557: done checking for any_errors_fatal 44071 1727204616.30558: checking for max_fail_percentage 44071 1727204616.30560: done checking for max_fail_percentage 44071 1727204616.30561: checking to see if all hosts have failed and the running result is not ok 44071 1727204616.30562: done checking to see if all hosts have failed 44071 1727204616.30563: getting the remaining hosts for this loop 44071 1727204616.30565: done getting the remaining hosts for this loop 44071 1727204616.30572: getting the next task for host managed-node2 44071 1727204616.30581: done getting next task for host managed-node2 44071 1727204616.30586: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204616.30591: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204616.30616: getting variables 44071 1727204616.30618: in VariableManager get_vars() 44071 1727204616.30661: Calling all_inventory to load vars for managed-node2 44071 1727204616.30665: Calling groups_inventory to load vars for managed-node2 44071 1727204616.30972: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204616.30980: done sending task result for task 127b8e07-fff9-c964-7471-00000000073a 44071 1727204616.30985: WORKER PROCESS EXITING 44071 1727204616.30996: Calling all_plugins_play to load vars for managed-node2 44071 1727204616.30999: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204616.31003: Calling groups_plugins_play to load vars for managed-node2 44071 1727204616.32892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204616.35204: done with get_vars() 44071 1727204616.35262: done getting variables 44071 1727204616.35330: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:03:36 -0400 (0:00:00.118) 0:00:28.670 ***** 44071 1727204616.35375: entering _queue_task() for managed-node2/service 44071 1727204616.35799: worker is 1 (out of 1 available) 44071 1727204616.35813: exiting _queue_task() for managed-node2/service 44071 1727204616.35828: done queuing things up, now waiting for results queue to drain 44071 1727204616.35830: waiting for pending results... 
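Unlike the consent and restart tasks that skipped above, the "Enable and start NetworkManager" task queued here goes on to run its handler: the entries below show its conditional (network_provider == "nm" or network_state != {}) evaluating True because network_provider was set to "nm" by an earlier set_fact, after which the service action builds an SSH connection to managed-node2. A minimal sketch of that task shape, using the network_service_name role variable seen in the trace; an illustration, not the role's verbatim source at roles/network/tasks/main.yml:122:

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}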
44071 1727204616.36201: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204616.36441: in run() - task 127b8e07-fff9-c964-7471-00000000073b 44071 1727204616.36446: variable 'ansible_search_path' from source: unknown 44071 1727204616.36450: variable 'ansible_search_path' from source: unknown 44071 1727204616.36463: calling self._execute() 44071 1727204616.36579: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204616.36586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204616.36596: variable 'omit' from source: magic vars 44071 1727204616.37096: variable 'ansible_distribution_major_version' from source: facts 44071 1727204616.37100: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204616.37289: variable 'network_provider' from source: set_fact 44071 1727204616.37293: variable 'network_state' from source: role '' defaults 44071 1727204616.37311: Evaluated conditional (network_provider == "nm" or network_state != {}): True 44071 1727204616.37318: variable 'omit' from source: magic vars 44071 1727204616.37390: variable 'omit' from source: magic vars 44071 1727204616.37446: variable 'network_service_name' from source: role '' defaults 44071 1727204616.37528: variable 'network_service_name' from source: role '' defaults 44071 1727204616.37746: variable '__network_provider_setup' from source: role '' defaults 44071 1727204616.37750: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204616.37754: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204616.37758: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204616.37807: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204616.38093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204616.41719: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204616.41814: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204616.41954: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204616.41958: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204616.41961: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204616.42076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204616.42106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204616.42131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204616.42178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 44071 1727204616.42192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204616.42244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204616.42269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204616.42575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204616.42615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204616.42629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204616.43232: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204616.43874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204616.43881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204616.43884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204616.43886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204616.43889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204616.43953: variable 'ansible_python' from source: facts 44071 1727204616.43971: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204616.44277: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204616.44294: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204616.44648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204616.44688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204616.44858: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204616.44862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204616.44867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204616.45344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204616.45363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204616.45392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204616.45433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204616.45449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204616.45822: variable 'network_connections' from source: include params 44071 1727204616.45872: variable 'interface' from source: play vars 44071 1727204616.46353: variable 'interface' from source: play vars 44071 1727204616.46740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204616.47650: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204616.47709: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204616.47757: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204616.47803: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204616.48232: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204616.48347: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204616.48351: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204616.48904: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204616.48908: variable '__network_wireless_connections_defined' from source: 
role '' defaults 44071 1727204616.49741: variable 'network_connections' from source: include params 44071 1727204616.49746: variable 'interface' from source: play vars 44071 1727204616.49860: variable 'interface' from source: play vars 44071 1727204616.49929: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204616.50031: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204616.50437: variable 'network_connections' from source: include params 44071 1727204616.50454: variable 'interface' from source: play vars 44071 1727204616.50559: variable 'interface' from source: play vars 44071 1727204616.50603: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204616.50713: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204616.51158: variable 'network_connections' from source: include params 44071 1727204616.51162: variable 'interface' from source: play vars 44071 1727204616.51249: variable 'interface' from source: play vars 44071 1727204616.51370: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204616.51538: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204616.51545: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204616.51930: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204616.52870: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204616.54575: variable 'network_connections' from source: include params 44071 1727204616.54580: variable 'interface' from source: play vars 44071 1727204616.54582: variable 'interface' from source: play vars 44071 1727204616.54600: variable 'ansible_distribution' from source: facts 44071 1727204616.54617: variable '__network_rh_distros' from source: role '' defaults 44071 1727204616.54829: variable 'ansible_distribution_major_version' from source: facts 44071 1727204616.54833: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204616.55129: variable 'ansible_distribution' from source: facts 44071 1727204616.55208: variable '__network_rh_distros' from source: role '' defaults 44071 1727204616.55220: variable 'ansible_distribution_major_version' from source: facts 44071 1727204616.55276: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204616.55551: variable 'ansible_distribution' from source: facts 44071 1727204616.55561: variable '__network_rh_distros' from source: role '' defaults 44071 1727204616.55579: variable 'ansible_distribution_major_version' from source: facts 44071 1727204616.55645: variable 'network_provider' from source: set_fact 44071 1727204616.55679: variable 'omit' from source: magic vars 44071 1727204616.55779: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204616.55816: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204616.55849: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204616.56050: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204616.56053: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204616.56055: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204616.56057: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204616.56059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204616.56190: Set connection var ansible_connection to ssh 44071 1727204616.56203: Set connection var ansible_timeout to 10 44071 1727204616.56213: Set connection var ansible_pipelining to False 44071 1727204616.56222: Set connection var ansible_shell_type to sh 44071 1727204616.56231: Set connection var ansible_shell_executable to /bin/sh 44071 1727204616.56246: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204616.56288: variable 'ansible_shell_executable' from source: unknown 44071 1727204616.56296: variable 'ansible_connection' from source: unknown 44071 1727204616.56303: variable 'ansible_module_compression' from source: unknown 44071 1727204616.56310: variable 'ansible_shell_type' from source: unknown 44071 1727204616.56315: variable 'ansible_shell_executable' from source: unknown 44071 1727204616.56322: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204616.56330: variable 'ansible_pipelining' from source: unknown 44071 1727204616.56336: variable 'ansible_timeout' from source: unknown 44071 1727204616.56348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204616.56490: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204616.56515: variable 'omit' from source: magic vars 44071 1727204616.56524: starting attempt loop 44071 1727204616.56597: running the handler 44071 1727204616.56653: variable 'ansible_facts' from source: unknown 44071 1727204616.58686: _low_level_execute_command(): starting 44071 1727204616.58892: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204616.59762: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204616.59820: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204616.59908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204616.59929: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204616.59958: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 44071 1727204616.59974: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204616.60124: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204616.62015: stdout chunk (state=3): >>>/root <<< 44071 1727204616.62085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204616.62204: stderr chunk (state=3): >>><<< 44071 1727204616.62213: stdout chunk (state=3): >>><<< 44071 1727204616.62243: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204616.62262: _low_level_execute_command(): starting 44071 1727204616.62276: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204616.6225069-45555-60024578605418 `" && echo ansible-tmp-1727204616.6225069-45555-60024578605418="` echo /root/.ansible/tmp/ansible-tmp-1727204616.6225069-45555-60024578605418 `" ) && sleep 0' 44071 1727204616.63088: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204616.63127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204616.63151: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204616.63176: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204616.63278: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 44071 1727204616.65263: stdout chunk (state=3): >>>ansible-tmp-1727204616.6225069-45555-60024578605418=/root/.ansible/tmp/ansible-tmp-1727204616.6225069-45555-60024578605418 <<< 44071 1727204616.65602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204616.65606: stderr chunk (state=3): >>><<< 44071 1727204616.65609: stdout chunk (state=3): >>><<< 44071 1727204616.65611: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204616.6225069-45555-60024578605418=/root/.ansible/tmp/ansible-tmp-1727204616.6225069-45555-60024578605418 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204616.65614: variable 'ansible_module_compression' from source: unknown 44071 1727204616.65616: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 44071 1727204616.65676: variable 'ansible_facts' from source: unknown 44071 1727204616.65877: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204616.6225069-45555-60024578605418/AnsiballZ_systemd.py 44071 1727204616.66097: Sending initial data 44071 1727204616.66101: Sent initial data (155 bytes) 44071 1727204616.66778: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204616.66787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204616.66798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204616.66858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204616.66911: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204616.66930: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204616.66945: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204616.67050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204616.68663: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204616.68733: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204616.68803: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpl42l8sfv /root/.ansible/tmp/ansible-tmp-1727204616.6225069-45555-60024578605418/AnsiballZ_systemd.py <<< 44071 1727204616.68806: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204616.6225069-45555-60024578605418/AnsiballZ_systemd.py" <<< 44071 1727204616.68874: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpl42l8sfv" to remote "/root/.ansible/tmp/ansible-tmp-1727204616.6225069-45555-60024578605418/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204616.6225069-45555-60024578605418/AnsiballZ_systemd.py" <<< 44071 1727204616.70491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204616.70551: stderr chunk (state=3): >>><<< 44071 1727204616.70555: stdout chunk (state=3): >>><<< 44071 1727204616.70578: done transferring module to remote 44071 1727204616.70588: _low_level_execute_command(): starting 44071 1727204616.70593: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204616.6225069-45555-60024578605418/ /root/.ansible/tmp/ansible-tmp-1727204616.6225069-45555-60024578605418/AnsiballZ_systemd.py && sleep 0' 44071 1727204616.71106: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204616.71112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204616.71115: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204616.71117: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204616.71173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204616.71186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204616.71190: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204616.71263: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204616.73139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204616.73292: stderr chunk (state=3): >>><<< 44071 1727204616.73295: stdout chunk (state=3): >>><<< 44071 1727204616.73298: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204616.73301: _low_level_execute_command(): starting 44071 1727204616.73304: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204616.6225069-45555-60024578605418/AnsiballZ_systemd.py && sleep 0' 44071 1727204616.73862: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204616.73891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204616.73895: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204616.73912: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204616.73918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204616.73930: stderr chunk 
(state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204616.74022: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204616.74032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204616.74059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204616.74142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204617.06323: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4509696", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3523244032", "CPUUsageNSec": "1472248000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": 
"[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitC<<< 44071 1727204617.06344: stdout chunk (state=3): >>>ORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", 
"MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext":<<< 44071 1727204617.06360: stdout chunk (state=3): >>> "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": 
"16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 44071 1727204617.08217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204617.08285: stderr chunk (state=3): >>><<< 44071 1727204617.08288: stdout chunk (state=3): >>><<< 44071 1727204617.08304: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4509696", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3523244032", "CPUUsageNSec": "1472248000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", 
"IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", 
"KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, 
"state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204617.08445: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204616.6225069-45555-60024578605418/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204617.08464: _low_level_execute_command(): starting 44071 1727204617.08472: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204616.6225069-45555-60024578605418/ > /dev/null 2>&1 && sleep 0' 44071 1727204617.08996: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204617.09000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204617.09003: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204617.09005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204617.09007: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204617.09009: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204617.09063: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204617.09074: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204617.09077: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204617.09144: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204617.11054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204617.11137: stderr chunk (state=3): >>><<< 44071 1727204617.11141: stdout chunk (state=3): >>><<< 44071 1727204617.11157: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204617.11166: handler run complete 44071 1727204617.11214: attempt loop complete, returning result 44071 1727204617.11217: _execute() done 44071 1727204617.11220: dumping result to json 44071 1727204617.11236: done dumping result, returning 44071 1727204617.11248: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-c964-7471-00000000073b] 44071 1727204617.11252: sending task result for task 127b8e07-fff9-c964-7471-00000000073b 44071 1727204617.11636: done sending task result for task 127b8e07-fff9-c964-7471-00000000073b 44071 1727204617.11639: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204617.11704: no more pending results, returning what we have 44071 1727204617.11707: results queue empty 44071 1727204617.11708: checking for any_errors_fatal 44071 1727204617.11713: done checking for any_errors_fatal 44071 1727204617.11714: checking for max_fail_percentage 44071 1727204617.11715: done checking for max_fail_percentage 44071 1727204617.11716: checking to see if all hosts have failed and the running result is not ok 44071 1727204617.11717: done checking to see if all hosts have failed 44071 1727204617.11718: getting the remaining hosts for this loop 44071 1727204617.11719: done getting the 
remaining hosts for this loop 44071 1727204617.11723: getting the next task for host managed-node2 44071 1727204617.11730: done getting next task for host managed-node2 44071 1727204617.11733: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204617.11738: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204617.11754: getting variables 44071 1727204617.11756: in VariableManager get_vars() 44071 1727204617.11790: Calling all_inventory to load vars for managed-node2 44071 1727204617.11792: Calling groups_inventory to load vars for managed-node2 44071 1727204617.11795: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204617.11805: Calling all_plugins_play to load vars for managed-node2 44071 1727204617.11808: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204617.11811: Calling groups_plugins_play to load vars for managed-node2 44071 1727204617.12850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204617.14161: done with get_vars() 44071 1727204617.14185: done getting variables 44071 1727204617.14235: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.788) 0:00:29.459 ***** 44071 1727204617.14271: entering _queue_task() for managed-node2/service 44071 1727204617.14555: worker is 1 (out of 1 available) 44071 1727204617.14572: exiting _queue_task() for managed-node2/service 44071 1727204617.14587: done queuing things up, now waiting for results queue to drain 44071 1727204617.14589: waiting for pending results... 
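The entries above record the task "fedora.linux_system_roles.network : Enable and start NetworkManager" completing: the 'service' action plugin dispatched to ansible.legacy.systemd with name=NetworkManager, state=started, enabled=true, and the returned unit facts were censored because 'no_log: true' was set for the result. A minimal sketch of a task that would produce that invocation, reconstructed only from the module arguments visible in the log; the role's real task lives in roles/network/tasks/main.yml and may differ in module choice and variable indirection, so everything here is an assumption for illustration:

    # Hypothetical reconstruction based on the module args shown in the log above.
    - name: Enable and start NetworkManager
      ansible.builtin.service:   # the 'service' action plugin resolved this host to ansible.legacy.systemd
        name: NetworkManager
        state: started
        enabled: true
      no_log: true               # explains the "censored" result recorded above
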
44071 1727204617.14799: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204617.14898: in run() - task 127b8e07-fff9-c964-7471-00000000073c 44071 1727204617.14912: variable 'ansible_search_path' from source: unknown 44071 1727204617.14915: variable 'ansible_search_path' from source: unknown 44071 1727204617.14954: calling self._execute() 44071 1727204617.15037: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204617.15043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204617.15052: variable 'omit' from source: magic vars 44071 1727204617.15375: variable 'ansible_distribution_major_version' from source: facts 44071 1727204617.15386: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204617.15474: variable 'network_provider' from source: set_fact 44071 1727204617.15494: Evaluated conditional (network_provider == "nm"): True 44071 1727204617.15568: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204617.15636: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204617.15780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204617.17501: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204617.17555: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204617.17590: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204617.17619: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204617.17642: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204617.17730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204617.17757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204617.17779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204617.17809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204617.17821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204617.17863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204617.17883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 44071 1727204617.17904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204617.17931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204617.17943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204617.17979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204617.17996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204617.18017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204617.18047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204617.18057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204617.18191: variable 'network_connections' from source: include params 44071 1727204617.18203: variable 'interface' from source: play vars 44071 1727204617.18263: variable 'interface' from source: play vars 44071 1727204617.18323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204617.18473: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204617.18502: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204617.18528: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204617.18555: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204617.18594: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204617.18610: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204617.18630: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204617.18654: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
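The entries that follow evaluate the guards on "fedora.linux_system_roles.network : Enable and start wpa_supplicant": ansible_distribution_major_version != '6' and network_provider == "nm" both evaluate True, but __network_wpa_supplicant_required evaluates False, so the task is skipped with false_condition __network_wpa_supplicant_required. A minimal sketch of the kind of guarded task that yields such a skip; only the task name, the conditionals, and the false_condition are taken from the log, while the module and layout are assumptions:

    # Hypothetical illustration of a conditionally skipped service task.
    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant
        state: started
        enabled: true
      when:
        - ansible_distribution_major_version != '6'
        - network_provider == "nm"
        - __network_wpa_supplicant_required   # False on this host, producing the skip recorded below
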
44071 1727204617.18695: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204617.18894: variable 'network_connections' from source: include params 44071 1727204617.18898: variable 'interface' from source: play vars 44071 1727204617.18949: variable 'interface' from source: play vars 44071 1727204617.18985: Evaluated conditional (__network_wpa_supplicant_required): False 44071 1727204617.18989: when evaluation is False, skipping this task 44071 1727204617.18992: _execute() done 44071 1727204617.18996: dumping result to json 44071 1727204617.18998: done dumping result, returning 44071 1727204617.19007: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-c964-7471-00000000073c] 44071 1727204617.19019: sending task result for task 127b8e07-fff9-c964-7471-00000000073c 44071 1727204617.19108: done sending task result for task 127b8e07-fff9-c964-7471-00000000073c 44071 1727204617.19111: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 44071 1727204617.19159: no more pending results, returning what we have 44071 1727204617.19162: results queue empty 44071 1727204617.19164: checking for any_errors_fatal 44071 1727204617.19196: done checking for any_errors_fatal 44071 1727204617.19197: checking for max_fail_percentage 44071 1727204617.19198: done checking for max_fail_percentage 44071 1727204617.19199: checking to see if all hosts have failed and the running result is not ok 44071 1727204617.19200: done checking to see if all hosts have failed 44071 1727204617.19200: getting the remaining hosts for this loop 44071 1727204617.19202: done getting the remaining hosts for this loop 44071 1727204617.19207: getting the next task for host managed-node2 44071 1727204617.19216: done getting next task for host managed-node2 44071 1727204617.19220: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204617.19226: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204617.19246: getting variables 44071 1727204617.19248: in VariableManager get_vars() 44071 1727204617.19317: Calling all_inventory to load vars for managed-node2 44071 1727204617.19320: Calling groups_inventory to load vars for managed-node2 44071 1727204617.19323: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204617.19333: Calling all_plugins_play to load vars for managed-node2 44071 1727204617.19336: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204617.19338: Calling groups_plugins_play to load vars for managed-node2 44071 1727204617.20382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204617.22258: done with get_vars() 44071 1727204617.22302: done getting variables 44071 1727204617.22375: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.081) 0:00:29.540 ***** 44071 1727204617.22413: entering _queue_task() for managed-node2/service 44071 1727204617.22815: worker is 1 (out of 1 available) 44071 1727204617.22830: exiting _queue_task() for managed-node2/service 44071 1727204617.22847: done queuing things up, now waiting for results queue to drain 44071 1727204617.22849: waiting for pending results... 44071 1727204617.23291: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204617.23362: in run() - task 127b8e07-fff9-c964-7471-00000000073d 44071 1727204617.23396: variable 'ansible_search_path' from source: unknown 44071 1727204617.23405: variable 'ansible_search_path' from source: unknown 44071 1727204617.23511: calling self._execute() 44071 1727204617.23590: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204617.23604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204617.23624: variable 'omit' from source: magic vars 44071 1727204617.24076: variable 'ansible_distribution_major_version' from source: facts 44071 1727204617.24098: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204617.24238: variable 'network_provider' from source: set_fact 44071 1727204617.24251: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204617.24259: when evaluation is False, skipping this task 44071 1727204617.24371: _execute() done 44071 1727204617.24375: dumping result to json 44071 1727204617.24379: done dumping result, returning 44071 1727204617.24382: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-c964-7471-00000000073d] 44071 1727204617.24385: sending task result for task 127b8e07-fff9-c964-7471-00000000073d 44071 1727204617.24476: done sending task result for task 127b8e07-fff9-c964-7471-00000000073d 44071 1727204617.24480: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 
1727204617.24533: no more pending results, returning what we have 44071 1727204617.24538: results queue empty 44071 1727204617.24540: checking for any_errors_fatal 44071 1727204617.24549: done checking for any_errors_fatal 44071 1727204617.24550: checking for max_fail_percentage 44071 1727204617.24552: done checking for max_fail_percentage 44071 1727204617.24553: checking to see if all hosts have failed and the running result is not ok 44071 1727204617.24554: done checking to see if all hosts have failed 44071 1727204617.24554: getting the remaining hosts for this loop 44071 1727204617.24556: done getting the remaining hosts for this loop 44071 1727204617.24562: getting the next task for host managed-node2 44071 1727204617.24574: done getting next task for host managed-node2 44071 1727204617.24579: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204617.24586: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204617.24609: getting variables 44071 1727204617.24612: in VariableManager get_vars() 44071 1727204617.24654: Calling all_inventory to load vars for managed-node2 44071 1727204617.24658: Calling groups_inventory to load vars for managed-node2 44071 1727204617.24660: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204617.24882: Calling all_plugins_play to load vars for managed-node2 44071 1727204617.24887: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204617.24891: Calling groups_plugins_play to load vars for managed-node2 44071 1727204617.26938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204617.28800: done with get_vars() 44071 1727204617.28832: done getting variables 44071 1727204617.28887: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.065) 0:00:29.605 ***** 44071 1727204617.28916: entering _queue_task() for managed-node2/copy 44071 1727204617.29209: worker is 1 (out of 1 available) 44071 1727204617.29226: exiting _queue_task() for managed-node2/copy 44071 1727204617.29241: done queuing things up, now waiting for results queue to drain 44071 1727204617.29242: waiting for pending results... 44071 1727204617.29469: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204617.29591: in run() - task 127b8e07-fff9-c964-7471-00000000073e 44071 1727204617.29601: variable 'ansible_search_path' from source: unknown 44071 1727204617.29605: variable 'ansible_search_path' from source: unknown 44071 1727204617.29644: calling self._execute() 44071 1727204617.29726: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204617.29730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204617.29736: variable 'omit' from source: magic vars 44071 1727204617.30044: variable 'ansible_distribution_major_version' from source: facts 44071 1727204617.30053: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204617.30143: variable 'network_provider' from source: set_fact 44071 1727204617.30148: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204617.30152: when evaluation is False, skipping this task 44071 1727204617.30156: _execute() done 44071 1727204617.30159: dumping result to json 44071 1727204617.30162: done dumping result, returning 44071 1727204617.30173: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-c964-7471-00000000073e] 44071 1727204617.30176: sending task result for task 127b8e07-fff9-c964-7471-00000000073e skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 44071 1727204617.30336: no more pending results, returning what we have 44071 
1727204617.30344: results queue empty 44071 1727204617.30345: checking for any_errors_fatal 44071 1727204617.30353: done checking for any_errors_fatal 44071 1727204617.30354: checking for max_fail_percentage 44071 1727204617.30355: done checking for max_fail_percentage 44071 1727204617.30356: checking to see if all hosts have failed and the running result is not ok 44071 1727204617.30357: done checking to see if all hosts have failed 44071 1727204617.30358: getting the remaining hosts for this loop 44071 1727204617.30359: done getting the remaining hosts for this loop 44071 1727204617.30364: getting the next task for host managed-node2 44071 1727204617.30375: done getting next task for host managed-node2 44071 1727204617.30380: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204617.30387: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204617.30405: getting variables 44071 1727204617.30407: in VariableManager get_vars() 44071 1727204617.30446: Calling all_inventory to load vars for managed-node2 44071 1727204617.30449: Calling groups_inventory to load vars for managed-node2 44071 1727204617.30451: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204617.30463: Calling all_plugins_play to load vars for managed-node2 44071 1727204617.30473: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204617.30478: done sending task result for task 127b8e07-fff9-c964-7471-00000000073e 44071 1727204617.30481: WORKER PROCESS EXITING 44071 1727204617.30485: Calling groups_plugins_play to load vars for managed-node2 44071 1727204617.31502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204617.32838: done with get_vars() 44071 1727204617.32867: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.040) 0:00:29.645 ***** 44071 1727204617.32941: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204617.33234: worker is 1 (out of 1 available) 44071 1727204617.33250: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204617.33265: done queuing things up, now waiting for results queue to drain 44071 1727204617.33268: waiting for pending results... 44071 1727204617.33470: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204617.33570: in run() - task 127b8e07-fff9-c964-7471-00000000073f 44071 1727204617.33584: variable 'ansible_search_path' from source: unknown 44071 1727204617.33588: variable 'ansible_search_path' from source: unknown 44071 1727204617.33623: calling self._execute() 44071 1727204617.33704: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204617.33710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204617.33720: variable 'omit' from source: magic vars 44071 1727204617.34031: variable 'ansible_distribution_major_version' from source: facts 44071 1727204617.34048: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204617.34053: variable 'omit' from source: magic vars 44071 1727204617.34098: variable 'omit' from source: magic vars 44071 1727204617.34231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204617.35903: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204617.35960: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204617.35991: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204617.36024: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204617.36045: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204617.36111: variable 'network_provider' from source: set_fact 44071 1727204617.36219: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204617.36244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204617.36263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204617.36294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204617.36305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204617.36368: variable 'omit' from source: magic vars 44071 1727204617.36453: variable 'omit' from source: magic vars 44071 1727204617.36531: variable 'network_connections' from source: include params 44071 1727204617.36546: variable 'interface' from source: play vars 44071 1727204617.36593: variable 'interface' from source: play vars 44071 1727204617.36716: variable 'omit' from source: magic vars 44071 1727204617.36723: variable '__lsr_ansible_managed' from source: task vars 44071 1727204617.36772: variable '__lsr_ansible_managed' from source: task vars 44071 1727204617.36927: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 44071 1727204617.37082: Loaded config def from plugin (lookup/template) 44071 1727204617.37088: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 44071 1727204617.37112: File lookup term: get_ansible_managed.j2 44071 1727204617.37115: variable 'ansible_search_path' from source: unknown 44071 1727204617.37121: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 44071 1727204617.37133: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 44071 1727204617.37148: variable 'ansible_search_path' from source: unknown 44071 1727204617.41743: variable 'ansible_managed' from source: unknown 44071 1727204617.41850: variable 
'omit' from source: magic vars 44071 1727204617.41880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204617.41900: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204617.41916: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204617.41931: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204617.41943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204617.41967: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204617.41970: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204617.41973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204617.42046: Set connection var ansible_connection to ssh 44071 1727204617.42050: Set connection var ansible_timeout to 10 44071 1727204617.42056: Set connection var ansible_pipelining to False 44071 1727204617.42061: Set connection var ansible_shell_type to sh 44071 1727204617.42068: Set connection var ansible_shell_executable to /bin/sh 44071 1727204617.42075: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204617.42100: variable 'ansible_shell_executable' from source: unknown 44071 1727204617.42103: variable 'ansible_connection' from source: unknown 44071 1727204617.42106: variable 'ansible_module_compression' from source: unknown 44071 1727204617.42108: variable 'ansible_shell_type' from source: unknown 44071 1727204617.42111: variable 'ansible_shell_executable' from source: unknown 44071 1727204617.42114: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204617.42116: variable 'ansible_pipelining' from source: unknown 44071 1727204617.42118: variable 'ansible_timeout' from source: unknown 44071 1727204617.42120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204617.42224: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204617.42236: variable 'omit' from source: magic vars 44071 1727204617.42243: starting attempt loop 44071 1727204617.42246: running the handler 44071 1727204617.42256: _low_level_execute_command(): starting 44071 1727204617.42262: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204617.42815: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204617.42820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44071 1727204617.42824: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204617.42826: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204617.42885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204617.42888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204617.42977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204617.44742: stdout chunk (state=3): >>>/root <<< 44071 1727204617.44847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204617.44913: stderr chunk (state=3): >>><<< 44071 1727204617.44917: stdout chunk (state=3): >>><<< 44071 1727204617.44936: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204617.44950: _low_level_execute_command(): starting 44071 1727204617.44956: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204617.449368-45746-54161755926112 `" && echo ansible-tmp-1727204617.449368-45746-54161755926112="` echo /root/.ansible/tmp/ansible-tmp-1727204617.449368-45746-54161755926112 `" ) && sleep 0' 44071 1727204617.45444: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204617.45481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204617.45485: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204617.45487: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204617.45489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204617.45543: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204617.45546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204617.45551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204617.45619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204617.47591: stdout chunk (state=3): >>>ansible-tmp-1727204617.449368-45746-54161755926112=/root/.ansible/tmp/ansible-tmp-1727204617.449368-45746-54161755926112 <<< 44071 1727204617.47692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204617.47763: stderr chunk (state=3): >>><<< 44071 1727204617.47769: stdout chunk (state=3): >>><<< 44071 1727204617.47815: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204617.449368-45746-54161755926112=/root/.ansible/tmp/ansible-tmp-1727204617.449368-45746-54161755926112 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204617.47833: variable 'ansible_module_compression' from source: unknown 44071 1727204617.47877: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 44071 1727204617.47905: variable 'ansible_facts' from source: unknown 44071 1727204617.47976: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204617.449368-45746-54161755926112/AnsiballZ_network_connections.py 44071 1727204617.48092: Sending initial data 44071 1727204617.48096: Sent initial data (166 bytes) 44071 1727204617.48604: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204617.48608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204617.48615: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204617.48617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204617.48663: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204617.48669: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204617.48671: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204617.48747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204617.50344: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204617.50408: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204617.50482: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp_zibdvhp /root/.ansible/tmp/ansible-tmp-1727204617.449368-45746-54161755926112/AnsiballZ_network_connections.py <<< 44071 1727204617.50485: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204617.449368-45746-54161755926112/AnsiballZ_network_connections.py" <<< 44071 1727204617.50551: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp_zibdvhp" to remote "/root/.ansible/tmp/ansible-tmp-1727204617.449368-45746-54161755926112/AnsiballZ_network_connections.py" <<< 44071 1727204617.50557: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204617.449368-45746-54161755926112/AnsiballZ_network_connections.py" <<< 44071 1727204617.51427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204617.51502: stderr chunk (state=3): >>><<< 44071 1727204617.51506: stdout chunk (state=3): >>><<< 44071 1727204617.51525: done transferring module to remote 44071 1727204617.51540: _low_level_execute_command(): starting 44071 1727204617.51548: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204617.449368-45746-54161755926112/ /root/.ansible/tmp/ansible-tmp-1727204617.449368-45746-54161755926112/AnsiballZ_network_connections.py && sleep 0' 44071 1727204617.52048: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204617.52052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204617.52055: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204617.52058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204617.52112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204617.52115: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204617.52192: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204617.54016: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204617.54080: stderr chunk (state=3): >>><<< 44071 1727204617.54084: stdout chunk (state=3): >>><<< 44071 1727204617.54099: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204617.54102: _low_level_execute_command(): starting 44071 1727204617.54108: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204617.449368-45746-54161755926112/AnsiballZ_network_connections.py && sleep 0' 44071 1727204617.54621: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204617.54625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204617.54628: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204617.54630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204617.54679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204617.54693: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204617.54839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204617.82890: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 1b005dda-915c-4416-ac36-5cc535674185\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", 
"ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 44071 1727204617.84873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204617.84877: stderr chunk (state=3): >>>Shared connection to 10.31.47.73 closed. <<< 44071 1727204617.84880: stdout chunk (state=3): >>><<< 44071 1727204617.84882: stderr chunk (state=3): >>><<< 44071 1727204617.84885: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 1b005dda-915c-4416-ac36-5cc535674185\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
44071 1727204617.84909: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'autoconnect': False, 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204617.449368-45746-54161755926112/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204617.84918: _low_level_execute_command(): starting 44071 1727204617.84933: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204617.449368-45746-54161755926112/ > /dev/null 2>&1 && sleep 0' 44071 1727204617.85690: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204617.85738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204617.85753: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204617.85775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204617.85880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204617.87973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204617.87977: stdout chunk (state=3): >>><<< 44071 1727204617.87980: stderr chunk (state=3): >>><<< 44071 1727204617.87983: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204617.87985: handler run complete 44071 1727204617.87988: attempt loop complete, returning result 44071 1727204617.87990: _execute() done 44071 1727204617.87992: dumping result to json 44071 1727204617.87994: done dumping result, returning 44071 1727204617.87996: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-c964-7471-00000000073f] 44071 1727204617.87998: sending task result for task 127b8e07-fff9-c964-7471-00000000073f 44071 1727204617.88289: done sending task result for task 127b8e07-fff9-c964-7471-00000000073f 44071 1727204617.88293: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 1b005dda-915c-4416-ac36-5cc535674185 44071 1727204617.88413: no more pending results, returning what we have 44071 1727204617.88416: results queue empty 44071 1727204617.88417: checking for any_errors_fatal 44071 1727204617.88423: done checking for any_errors_fatal 44071 1727204617.88424: checking for max_fail_percentage 44071 1727204617.88425: done checking for max_fail_percentage 44071 1727204617.88426: checking to see if all hosts have failed and the running result is not ok 44071 1727204617.88427: done checking to see if all hosts have failed 44071 1727204617.88428: getting the remaining hosts for this loop 44071 1727204617.88429: done getting the remaining hosts for this loop 44071 1727204617.88433: getting the next task for host managed-node2 44071 1727204617.88441: done getting next task for host managed-node2 44071 1727204617.88444: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204617.88449: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204617.88459: getting variables 44071 1727204617.88460: in VariableManager get_vars() 44071 1727204617.88638: Calling all_inventory to load vars for managed-node2 44071 1727204617.88642: Calling groups_inventory to load vars for managed-node2 44071 1727204617.88645: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204617.88656: Calling all_plugins_play to load vars for managed-node2 44071 1727204617.88660: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204617.88663: Calling groups_plugins_play to load vars for managed-node2 44071 1727204617.90522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204617.93117: done with get_vars() 44071 1727204617.93162: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.603) 0:00:30.249 ***** 44071 1727204617.93263: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204617.93772: worker is 1 (out of 1 available) 44071 1727204617.93784: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204617.93798: done queuing things up, now waiting for results queue to drain 44071 1727204617.93799: waiting for pending results... 
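The 'Configure networking state' task queued here only runs when the role's network_state variable is non-empty; in this run it remains at the role default of {} and the task is skipped below. Purely as an illustration of that alternative input (not used in this run), a non-empty network_state carries Nmstate-style declarative state, for example:

    network_state:
      interfaces:
        - name: statebr        # interface name is illustrative
          type: linux-bridge
          state: up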
44071 1727204617.94389: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204617.94396: in run() - task 127b8e07-fff9-c964-7471-000000000740 44071 1727204617.94400: variable 'ansible_search_path' from source: unknown 44071 1727204617.94403: variable 'ansible_search_path' from source: unknown 44071 1727204617.94409: calling self._execute() 44071 1727204617.94411: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204617.94415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204617.94418: variable 'omit' from source: magic vars 44071 1727204617.94832: variable 'ansible_distribution_major_version' from source: facts 44071 1727204617.94846: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204617.95000: variable 'network_state' from source: role '' defaults 44071 1727204617.95072: Evaluated conditional (network_state != {}): False 44071 1727204617.95076: when evaluation is False, skipping this task 44071 1727204617.95079: _execute() done 44071 1727204617.95372: dumping result to json 44071 1727204617.95376: done dumping result, returning 44071 1727204617.95379: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-c964-7471-000000000740] 44071 1727204617.95381: sending task result for task 127b8e07-fff9-c964-7471-000000000740 44071 1727204617.95457: done sending task result for task 127b8e07-fff9-c964-7471-000000000740 44071 1727204617.95462: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204617.95522: no more pending results, returning what we have 44071 1727204617.95526: results queue empty 44071 1727204617.95527: checking for any_errors_fatal 44071 1727204617.95536: done checking for any_errors_fatal 44071 1727204617.95537: checking for max_fail_percentage 44071 1727204617.95539: done checking for max_fail_percentage 44071 1727204617.95540: checking to see if all hosts have failed and the running result is not ok 44071 1727204617.95541: done checking to see if all hosts have failed 44071 1727204617.95542: getting the remaining hosts for this loop 44071 1727204617.95543: done getting the remaining hosts for this loop 44071 1727204617.95548: getting the next task for host managed-node2 44071 1727204617.95556: done getting next task for host managed-node2 44071 1727204617.95561: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204617.95569: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204617.95589: getting variables 44071 1727204617.95591: in VariableManager get_vars() 44071 1727204617.95780: Calling all_inventory to load vars for managed-node2 44071 1727204617.95784: Calling groups_inventory to load vars for managed-node2 44071 1727204617.95786: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204617.95798: Calling all_plugins_play to load vars for managed-node2 44071 1727204617.95801: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204617.95805: Calling groups_plugins_play to load vars for managed-node2 44071 1727204617.99637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204618.01943: done with get_vars() 44071 1727204618.01989: done getting variables 44071 1727204618.02075: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:03:38 -0400 (0:00:00.088) 0:00:30.337 ***** 44071 1727204618.02117: entering _queue_task() for managed-node2/debug 44071 1727204618.02555: worker is 1 (out of 1 available) 44071 1727204618.02574: exiting _queue_task() for managed-node2/debug 44071 1727204618.02593: done queuing things up, now waiting for results queue to drain 44071 1727204618.02595: waiting for pending results... 
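The debug task queued here prints the stderr captured from the earlier network_connections run. Judging from the task name and the variable shown in the result below, it is equivalent to a debug task of roughly this form (a sketch, not the role's actual tasks/main.yml):

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines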
44071 1727204618.02856: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204618.02999: in run() - task 127b8e07-fff9-c964-7471-000000000741 44071 1727204618.03015: variable 'ansible_search_path' from source: unknown 44071 1727204618.03019: variable 'ansible_search_path' from source: unknown 44071 1727204618.03069: calling self._execute() 44071 1727204618.03179: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204618.03184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204618.03196: variable 'omit' from source: magic vars 44071 1727204618.03647: variable 'ansible_distribution_major_version' from source: facts 44071 1727204618.03655: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204618.03675: variable 'omit' from source: magic vars 44071 1727204618.03743: variable 'omit' from source: magic vars 44071 1727204618.03972: variable 'omit' from source: magic vars 44071 1727204618.03977: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204618.03980: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204618.03983: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204618.03985: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204618.03987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204618.03989: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204618.03991: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204618.03993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204618.04099: Set connection var ansible_connection to ssh 44071 1727204618.04110: Set connection var ansible_timeout to 10 44071 1727204618.04113: Set connection var ansible_pipelining to False 44071 1727204618.04116: Set connection var ansible_shell_type to sh 44071 1727204618.04122: Set connection var ansible_shell_executable to /bin/sh 44071 1727204618.04135: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204618.04161: variable 'ansible_shell_executable' from source: unknown 44071 1727204618.04165: variable 'ansible_connection' from source: unknown 44071 1727204618.04171: variable 'ansible_module_compression' from source: unknown 44071 1727204618.04174: variable 'ansible_shell_type' from source: unknown 44071 1727204618.04177: variable 'ansible_shell_executable' from source: unknown 44071 1727204618.04179: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204618.04181: variable 'ansible_pipelining' from source: unknown 44071 1727204618.04184: variable 'ansible_timeout' from source: unknown 44071 1727204618.04189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204618.04338: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 
1727204618.04371: variable 'omit' from source: magic vars 44071 1727204618.04375: starting attempt loop 44071 1727204618.04378: running the handler 44071 1727204618.04512: variable '__network_connections_result' from source: set_fact 44071 1727204618.04657: handler run complete 44071 1727204618.04661: attempt loop complete, returning result 44071 1727204618.04663: _execute() done 44071 1727204618.04667: dumping result to json 44071 1727204618.04669: done dumping result, returning 44071 1727204618.04672: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-c964-7471-000000000741] 44071 1727204618.04674: sending task result for task 127b8e07-fff9-c964-7471-000000000741 44071 1727204618.04746: done sending task result for task 127b8e07-fff9-c964-7471-000000000741 44071 1727204618.04749: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 1b005dda-915c-4416-ac36-5cc535674185" ] } 44071 1727204618.04829: no more pending results, returning what we have 44071 1727204618.04832: results queue empty 44071 1727204618.04833: checking for any_errors_fatal 44071 1727204618.04842: done checking for any_errors_fatal 44071 1727204618.04843: checking for max_fail_percentage 44071 1727204618.04844: done checking for max_fail_percentage 44071 1727204618.04845: checking to see if all hosts have failed and the running result is not ok 44071 1727204618.04846: done checking to see if all hosts have failed 44071 1727204618.04847: getting the remaining hosts for this loop 44071 1727204618.04848: done getting the remaining hosts for this loop 44071 1727204618.04853: getting the next task for host managed-node2 44071 1727204618.04860: done getting next task for host managed-node2 44071 1727204618.04864: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204618.04874: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204618.04886: getting variables 44071 1727204618.04888: in VariableManager get_vars() 44071 1727204618.04922: Calling all_inventory to load vars for managed-node2 44071 1727204618.04924: Calling groups_inventory to load vars for managed-node2 44071 1727204618.04926: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204618.04935: Calling all_plugins_play to load vars for managed-node2 44071 1727204618.04938: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204618.04943: Calling groups_plugins_play to load vars for managed-node2 44071 1727204618.06906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204618.09142: done with get_vars() 44071 1727204618.09183: done getting variables 44071 1727204618.09252: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:03:38 -0400 (0:00:00.071) 0:00:30.409 ***** 44071 1727204618.09299: entering _queue_task() for managed-node2/debug 44071 1727204618.09889: worker is 1 (out of 1 available) 44071 1727204618.09902: exiting _queue_task() for managed-node2/debug 44071 1727204618.09914: done queuing things up, now waiting for results queue to drain 44071 1727204618.09916: waiting for pending results... 44071 1727204618.10287: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204618.10294: in run() - task 127b8e07-fff9-c964-7471-000000000742 44071 1727204618.10297: variable 'ansible_search_path' from source: unknown 44071 1727204618.10300: variable 'ansible_search_path' from source: unknown 44071 1727204618.10310: calling self._execute() 44071 1727204618.10417: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204618.10422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204618.10434: variable 'omit' from source: magic vars 44071 1727204618.10862: variable 'ansible_distribution_major_version' from source: facts 44071 1727204618.10876: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204618.10882: variable 'omit' from source: magic vars 44071 1727204618.10964: variable 'omit' from source: magic vars 44071 1727204618.11005: variable 'omit' from source: magic vars 44071 1727204618.11060: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204618.11101: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204618.11121: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204618.11149: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204618.11163: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204618.11197: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204618.11201: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204618.11204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204618.11326: Set connection var ansible_connection to ssh 44071 1727204618.11332: Set connection var ansible_timeout to 10 44071 1727204618.11341: Set connection var ansible_pipelining to False 44071 1727204618.11355: Set connection var ansible_shell_type to sh 44071 1727204618.11361: Set connection var ansible_shell_executable to /bin/sh 44071 1727204618.11371: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204618.11396: variable 'ansible_shell_executable' from source: unknown 44071 1727204618.11400: variable 'ansible_connection' from source: unknown 44071 1727204618.11403: variable 'ansible_module_compression' from source: unknown 44071 1727204618.11405: variable 'ansible_shell_type' from source: unknown 44071 1727204618.11408: variable 'ansible_shell_executable' from source: unknown 44071 1727204618.11410: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204618.11473: variable 'ansible_pipelining' from source: unknown 44071 1727204618.11477: variable 'ansible_timeout' from source: unknown 44071 1727204618.11479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204618.11596: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204618.11607: variable 'omit' from source: magic vars 44071 1727204618.11613: starting attempt loop 44071 1727204618.11616: running the handler 44071 1727204618.11673: variable '__network_connections_result' from source: set_fact 44071 1727204618.11770: variable '__network_connections_result' from source: set_fact 44071 1727204618.11915: handler run complete 44071 1727204618.11999: attempt loop complete, returning result 44071 1727204618.12003: _execute() done 44071 1727204618.12010: dumping result to json 44071 1727204618.12012: done dumping result, returning 44071 1727204618.12015: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-c964-7471-000000000742] 44071 1727204618.12017: sending task result for task 127b8e07-fff9-c964-7471-000000000742 ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 1b005dda-915c-4416-ac36-5cc535674185\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 1b005dda-915c-4416-ac36-5cc535674185" ] } } 44071 1727204618.12338: no more pending results, returning what we have 44071 1727204618.12344: results queue empty 44071 1727204618.12345: checking for any_errors_fatal 44071 1727204618.12353: done checking for 
any_errors_fatal 44071 1727204618.12354: checking for max_fail_percentage 44071 1727204618.12356: done checking for max_fail_percentage 44071 1727204618.12357: checking to see if all hosts have failed and the running result is not ok 44071 1727204618.12358: done checking to see if all hosts have failed 44071 1727204618.12358: getting the remaining hosts for this loop 44071 1727204618.12360: done getting the remaining hosts for this loop 44071 1727204618.12366: getting the next task for host managed-node2 44071 1727204618.12375: done getting next task for host managed-node2 44071 1727204618.12380: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204618.12385: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204618.12397: getting variables 44071 1727204618.12399: in VariableManager get_vars() 44071 1727204618.12439: Calling all_inventory to load vars for managed-node2 44071 1727204618.12442: Calling groups_inventory to load vars for managed-node2 44071 1727204618.12452: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204618.12465: Calling all_plugins_play to load vars for managed-node2 44071 1727204618.12674: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204618.12679: Calling groups_plugins_play to load vars for managed-node2 44071 1727204618.13311: done sending task result for task 127b8e07-fff9-c964-7471-000000000742 44071 1727204618.13315: WORKER PROCESS EXITING 44071 1727204618.14452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204618.16724: done with get_vars() 44071 1727204618.16770: done getting variables 44071 1727204618.16850: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:03:38 -0400 (0:00:00.075) 0:00:30.485 ***** 44071 1727204618.16895: entering _queue_task() for managed-node2/debug 44071 1727204618.17313: worker is 1 (out of 1 available) 44071 1727204618.17328: exiting _queue_task() for managed-node2/debug 44071 1727204618.17341: done queuing things up, now waiting for results queue to drain 44071 1727204618.17343: waiting for pending results... 
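The task queued here is guarded by a conditional on network_state, and the trace that follows records it as skipped with false_condition "network_state != {}". A hedged sketch of such a conditional debug task (illustrative, not quoted from the role source):

    # Hedged sketch: debug task gated on network_state being non-empty.
    - name: Show debug messages for the network_state
      debug:
        var: network_state
      when: network_state != {}

Because this play only defines network_connections and leaves network_state at its role default of {}, the when clause evaluates to False and the executor returns before any connection is opened, which is why no _low_level_execute_command() entries appear for this task.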
44071 1727204618.17672: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204618.17851: in run() - task 127b8e07-fff9-c964-7471-000000000743 44071 1727204618.17876: variable 'ansible_search_path' from source: unknown 44071 1727204618.17902: variable 'ansible_search_path' from source: unknown 44071 1727204618.17935: calling self._execute() 44071 1727204618.18121: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204618.18125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204618.18128: variable 'omit' from source: magic vars 44071 1727204618.18539: variable 'ansible_distribution_major_version' from source: facts 44071 1727204618.18569: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204618.18717: variable 'network_state' from source: role '' defaults 44071 1727204618.18735: Evaluated conditional (network_state != {}): False 44071 1727204618.18744: when evaluation is False, skipping this task 44071 1727204618.18751: _execute() done 44071 1727204618.18759: dumping result to json 44071 1727204618.18773: done dumping result, returning 44071 1727204618.18795: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-c964-7471-000000000743] 44071 1727204618.18807: sending task result for task 127b8e07-fff9-c964-7471-000000000743 44071 1727204618.19272: done sending task result for task 127b8e07-fff9-c964-7471-000000000743 44071 1727204618.19276: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 44071 1727204618.19324: no more pending results, returning what we have 44071 1727204618.19327: results queue empty 44071 1727204618.19328: checking for any_errors_fatal 44071 1727204618.19336: done checking for any_errors_fatal 44071 1727204618.19337: checking for max_fail_percentage 44071 1727204618.19338: done checking for max_fail_percentage 44071 1727204618.19339: checking to see if all hosts have failed and the running result is not ok 44071 1727204618.19340: done checking to see if all hosts have failed 44071 1727204618.19341: getting the remaining hosts for this loop 44071 1727204618.19342: done getting the remaining hosts for this loop 44071 1727204618.19346: getting the next task for host managed-node2 44071 1727204618.19354: done getting next task for host managed-node2 44071 1727204618.19359: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204618.19364: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204618.19386: getting variables 44071 1727204618.19388: in VariableManager get_vars() 44071 1727204618.19426: Calling all_inventory to load vars for managed-node2 44071 1727204618.19429: Calling groups_inventory to load vars for managed-node2 44071 1727204618.19432: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204618.19442: Calling all_plugins_play to load vars for managed-node2 44071 1727204618.19446: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204618.19449: Calling groups_plugins_play to load vars for managed-node2 44071 1727204618.22504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204618.26463: done with get_vars() 44071 1727204618.26520: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:03:38 -0400 (0:00:00.097) 0:00:30.582 ***** 44071 1727204618.26643: entering _queue_task() for managed-node2/ping 44071 1727204618.27055: worker is 1 (out of 1 available) 44071 1727204618.27072: exiting _queue_task() for managed-node2/ping 44071 1727204618.27087: done queuing things up, now waiting for results queue to drain 44071 1727204618.27089: waiting for pending results... 44071 1727204618.27438: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204618.27625: in run() - task 127b8e07-fff9-c964-7471-000000000744 44071 1727204618.27653: variable 'ansible_search_path' from source: unknown 44071 1727204618.27663: variable 'ansible_search_path' from source: unknown 44071 1727204618.27722: calling self._execute() 44071 1727204618.27842: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204618.27856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204618.27873: variable 'omit' from source: magic vars 44071 1727204618.28331: variable 'ansible_distribution_major_version' from source: facts 44071 1727204618.28368: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204618.28373: variable 'omit' from source: magic vars 44071 1727204618.28475: variable 'omit' from source: magic vars 44071 1727204618.28498: variable 'omit' from source: magic vars 44071 1727204618.28549: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204618.28606: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204618.28694: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204618.28698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204618.28700: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204618.28713: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204618.28722: variable 'ansible_host' from source: host vars for 
'managed-node2' 44071 1727204618.28730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204618.28857: Set connection var ansible_connection to ssh 44071 1727204618.28872: Set connection var ansible_timeout to 10 44071 1727204618.28884: Set connection var ansible_pipelining to False 44071 1727204618.28894: Set connection var ansible_shell_type to sh 44071 1727204618.28910: Set connection var ansible_shell_executable to /bin/sh 44071 1727204618.28924: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204618.29020: variable 'ansible_shell_executable' from source: unknown 44071 1727204618.29023: variable 'ansible_connection' from source: unknown 44071 1727204618.29026: variable 'ansible_module_compression' from source: unknown 44071 1727204618.29029: variable 'ansible_shell_type' from source: unknown 44071 1727204618.29031: variable 'ansible_shell_executable' from source: unknown 44071 1727204618.29033: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204618.29035: variable 'ansible_pipelining' from source: unknown 44071 1727204618.29037: variable 'ansible_timeout' from source: unknown 44071 1727204618.29039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204618.29252: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204618.29273: variable 'omit' from source: magic vars 44071 1727204618.29284: starting attempt loop 44071 1727204618.29291: running the handler 44071 1727204618.29312: _low_level_execute_command(): starting 44071 1727204618.29324: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204618.30776: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204618.30893: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204618.30994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204618.31081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204618.32840: stdout chunk (state=3): >>>/root <<< 44071 1727204618.33093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204618.33097: stdout chunk (state=3): >>><<< 44071 1727204618.33099: stderr chunk (state=3): >>><<< 44071 1727204618.33214: _low_level_execute_command() 
done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204618.33218: _low_level_execute_command(): starting 44071 1727204618.33224: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204618.3312042-45775-243326031837530 `" && echo ansible-tmp-1727204618.3312042-45775-243326031837530="` echo /root/.ansible/tmp/ansible-tmp-1727204618.3312042-45775-243326031837530 `" ) && sleep 0' 44071 1727204618.34110: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204618.34145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204618.34162: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204618.34279: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204618.34572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204618.36552: stdout chunk (state=3): >>>ansible-tmp-1727204618.3312042-45775-243326031837530=/root/.ansible/tmp/ansible-tmp-1727204618.3312042-45775-243326031837530 <<< 44071 1727204618.36742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204618.36867: stdout chunk (state=3): >>><<< 44071 1727204618.36871: stderr chunk (state=3): >>><<< 44071 1727204618.36875: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204618.3312042-45775-243326031837530=/root/.ansible/tmp/ansible-tmp-1727204618.3312042-45775-243326031837530 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204618.36884: variable 'ansible_module_compression' from source: unknown 44071 1727204618.36946: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 44071 1727204618.37036: variable 'ansible_facts' from source: unknown 44071 1727204618.37478: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204618.3312042-45775-243326031837530/AnsiballZ_ping.py 44071 1727204618.37860: Sending initial data 44071 1727204618.37876: Sent initial data (153 bytes) 44071 1727204618.38539: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204618.38553: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204618.38616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204618.38704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204618.38708: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204618.38859: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204618.38943: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204618.40536: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 44071 1727204618.40546: stderr chunk (state=3): >>>debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 <<< 44071 1727204618.40555: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 44071 1727204618.40564: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 44071 1727204618.40580: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 44071 1727204618.40583: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 44071 1727204618.40594: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204618.40699: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204618.40764: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpkdayi0_n /root/.ansible/tmp/ansible-tmp-1727204618.3312042-45775-243326031837530/AnsiballZ_ping.py <<< 44071 1727204618.40777: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204618.3312042-45775-243326031837530/AnsiballZ_ping.py" <<< 44071 1727204618.40890: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpkdayi0_n" to remote "/root/.ansible/tmp/ansible-tmp-1727204618.3312042-45775-243326031837530/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204618.3312042-45775-243326031837530/AnsiballZ_ping.py" <<< 44071 1727204618.42317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204618.42321: stderr chunk (state=3): >>><<< 44071 1727204618.42324: stdout chunk (state=3): >>><<< 44071 1727204618.42349: done transferring module to remote 44071 1727204618.42352: _low_level_execute_command(): starting 44071 1727204618.42355: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204618.3312042-45775-243326031837530/ /root/.ansible/tmp/ansible-tmp-1727204618.3312042-45775-243326031837530/AnsiballZ_ping.py && sleep 0' 44071 1727204618.43237: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204618.43289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204618.43300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204618.43317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204618.43330: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204618.43342: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204618.43351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204618.43368: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204618.43384: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204618.43391: stderr chunk (state=3): >>>debug1: 
re-parsing configuration <<< 44071 1727204618.43400: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204618.43408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204618.43451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204618.43561: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204618.43723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204618.43819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204618.45736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204618.45741: stdout chunk (state=3): >>><<< 44071 1727204618.45785: stderr chunk (state=3): >>><<< 44071 1727204618.45789: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204618.45792: _low_level_execute_command(): starting 44071 1727204618.45795: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204618.3312042-45775-243326031837530/AnsiballZ_ping.py && sleep 0' 44071 1727204618.46592: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204618.46872: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204618.46877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204618.46879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204618.46910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204618.47145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204618.47253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204618.64934: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 44071 1727204618.66377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204618.66381: stdout chunk (state=3): >>><<< 44071 1727204618.66384: stderr chunk (state=3): >>><<< 44071 1727204618.66387: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
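The round trip above is the standard AnsiballZ execution path for a single module: create a remote temp directory, sftp the packed AnsiballZ_ping.py into it, chmod it, run it with the remote /usr/bin/python3.12, collect the JSON result from stdout, and then delete the temp directory (the rm -f -r entry just below). The role task driving it is essentially just the ping module; a minimal sketch, with the module name taken from the trace and the task text assumed:

    # Hedged sketch of the task at roles/network/tasks/main.yml:192.
    - name: Re-test connectivity
      ping:

ping takes no required arguments and returns {"ping": "pong"} whenever the interpreter and module plumbing on the managed host work, which makes it a cheap post-change connectivity check.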
44071 1727204618.66391: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204618.3312042-45775-243326031837530/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204618.66394: _low_level_execute_command(): starting 44071 1727204618.66396: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204618.3312042-45775-243326031837530/ > /dev/null 2>&1 && sleep 0' 44071 1727204618.67048: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204618.67053: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204618.67068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204618.67083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204618.67095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204618.67102: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204618.67112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204618.67125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204618.67137: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204618.67143: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44071 1727204618.67146: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204618.67246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204618.67250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204618.67259: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204618.67262: stderr chunk (state=3): >>>debug2: match found <<< 44071 1727204618.67264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204618.67270: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204618.67272: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204618.67297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204618.67399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204618.69549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204618.69553: stdout chunk (state=3): >>><<< 44071 1727204618.69556: stderr chunk (state=3): >>><<< 44071 1727204618.69774: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204618.69779: handler run complete 44071 1727204618.69782: attempt loop complete, returning result 44071 1727204618.69784: _execute() done 44071 1727204618.69787: dumping result to json 44071 1727204618.69789: done dumping result, returning 44071 1727204618.69791: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-c964-7471-000000000744] 44071 1727204618.69793: sending task result for task 127b8e07-fff9-c964-7471-000000000744 ok: [managed-node2] => { "changed": false, "ping": "pong" } 44071 1727204618.70103: no more pending results, returning what we have 44071 1727204618.70108: results queue empty 44071 1727204618.70109: checking for any_errors_fatal 44071 1727204618.70118: done checking for any_errors_fatal 44071 1727204618.70119: checking for max_fail_percentage 44071 1727204618.70121: done checking for max_fail_percentage 44071 1727204618.70122: checking to see if all hosts have failed and the running result is not ok 44071 1727204618.70123: done checking to see if all hosts have failed 44071 1727204618.70126: getting the remaining hosts for this loop 44071 1727204618.70128: done getting the remaining hosts for this loop 44071 1727204618.70133: getting the next task for host managed-node2 44071 1727204618.70146: done getting next task for host managed-node2 44071 1727204618.70148: ^ task is: TASK: meta (role_complete) 44071 1727204618.70154: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204618.70170: getting variables 44071 1727204618.70172: in VariableManager get_vars() 44071 1727204618.70218: Calling all_inventory to load vars for managed-node2 44071 1727204618.70222: Calling groups_inventory to load vars for managed-node2 44071 1727204618.70224: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204618.70238: Calling all_plugins_play to load vars for managed-node2 44071 1727204618.70241: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204618.70245: Calling groups_plugins_play to load vars for managed-node2 44071 1727204618.71027: done sending task result for task 127b8e07-fff9-c964-7471-000000000744 44071 1727204618.71032: WORKER PROCESS EXITING 44071 1727204618.74964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204618.87553: done with get_vars() 44071 1727204618.87606: done getting variables 44071 1727204618.87703: done queuing things up, now waiting for results queue to drain 44071 1727204618.87705: results queue empty 44071 1727204618.87706: checking for any_errors_fatal 44071 1727204618.87710: done checking for any_errors_fatal 44071 1727204618.87710: checking for max_fail_percentage 44071 1727204618.87712: done checking for max_fail_percentage 44071 1727204618.87713: checking to see if all hosts have failed and the running result is not ok 44071 1727204618.87714: done checking to see if all hosts have failed 44071 1727204618.87715: getting the remaining hosts for this loop 44071 1727204618.87716: done getting the remaining hosts for this loop 44071 1727204618.87718: getting the next task for host managed-node2 44071 1727204618.87723: done getting next task for host managed-node2 44071 1727204618.87726: ^ task is: TASK: Show result 44071 1727204618.87728: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204618.87731: getting variables 44071 1727204618.87732: in VariableManager get_vars() 44071 1727204618.87747: Calling all_inventory to load vars for managed-node2 44071 1727204618.87749: Calling groups_inventory to load vars for managed-node2 44071 1727204618.87752: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204618.87757: Calling all_plugins_play to load vars for managed-node2 44071 1727204618.87760: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204618.87762: Calling groups_plugins_play to load vars for managed-node2 44071 1727204618.89582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204618.91908: done with get_vars() 44071 1727204618.91950: done getting variables 44071 1727204618.92002: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml:15 Tuesday 24 September 2024 15:03:38 -0400 (0:00:00.653) 0:00:31.236 ***** 44071 1727204618.92046: entering _queue_task() for managed-node2/debug 44071 1727204618.92614: worker is 1 (out of 1 available) 44071 1727204618.92629: exiting _queue_task() for managed-node2/debug 44071 1727204618.92644: done queuing things up, now waiting for results queue to drain 44071 1727204618.92646: waiting for pending results... 
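The "Show result" task queued here lives in the test playbook rather than in the role, and what it prints just below is the same __network_connections_result dictionary shown earlier. Reconstructing from the module_args in that dump (connection values taken from the log; the host pattern and single-play layout are assumptions, since the real test is split across include files), a roughly equivalent self-contained play would be:

    # Hedged reconstruction from the logged module_args, not the real test file.
    - hosts: managed-node2        # assumed host pattern
      vars:
        network_connections:
          - name: statebr
            type: bridge
            autoconnect: false
            persistent_state: present
            ip:
              dhcp4: false
              auto6: false
      roles:
        - fedora.linux_system_roles.network
      tasks:
        - name: Show result
          debug:
            var: __network_connections_result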
44071 1727204618.93111: running TaskExecutor() for managed-node2/TASK: Show result 44071 1727204618.93558: in run() - task 127b8e07-fff9-c964-7471-0000000006b2 44071 1727204618.93594: variable 'ansible_search_path' from source: unknown 44071 1727204618.93631: variable 'ansible_search_path' from source: unknown 44071 1727204618.93683: calling self._execute() 44071 1727204618.93793: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204618.93806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204618.93839: variable 'omit' from source: magic vars 44071 1727204618.94250: variable 'ansible_distribution_major_version' from source: facts 44071 1727204618.94272: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204618.94283: variable 'omit' from source: magic vars 44071 1727204618.94342: variable 'omit' from source: magic vars 44071 1727204618.94386: variable 'omit' from source: magic vars 44071 1727204618.94436: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204618.94482: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204618.94572: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204618.94575: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204618.94578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204618.94583: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204618.94592: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204618.94599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204618.94707: Set connection var ansible_connection to ssh 44071 1727204618.94720: Set connection var ansible_timeout to 10 44071 1727204618.94730: Set connection var ansible_pipelining to False 44071 1727204618.94741: Set connection var ansible_shell_type to sh 44071 1727204618.94751: Set connection var ansible_shell_executable to /bin/sh 44071 1727204618.94762: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204618.94793: variable 'ansible_shell_executable' from source: unknown 44071 1727204618.94800: variable 'ansible_connection' from source: unknown 44071 1727204618.94807: variable 'ansible_module_compression' from source: unknown 44071 1727204618.94970: variable 'ansible_shell_type' from source: unknown 44071 1727204618.94974: variable 'ansible_shell_executable' from source: unknown 44071 1727204618.94977: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204618.94980: variable 'ansible_pipelining' from source: unknown 44071 1727204618.94983: variable 'ansible_timeout' from source: unknown 44071 1727204618.94986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204618.95015: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204618.95034: variable 'omit' from source: magic vars 44071 1727204618.95045: 
starting attempt loop 44071 1727204618.95051: running the handler 44071 1727204618.95108: variable '__network_connections_result' from source: set_fact 44071 1727204618.95200: variable '__network_connections_result' from source: set_fact 44071 1727204618.95331: handler run complete 44071 1727204618.95368: attempt loop complete, returning result 44071 1727204618.95377: _execute() done 44071 1727204618.95386: dumping result to json 44071 1727204618.95400: done dumping result, returning 44071 1727204618.95455: done running TaskExecutor() for managed-node2/TASK: Show result [127b8e07-fff9-c964-7471-0000000006b2] 44071 1727204618.95468: sending task result for task 127b8e07-fff9-c964-7471-0000000006b2 ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 1b005dda-915c-4416-ac36-5cc535674185\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 1b005dda-915c-4416-ac36-5cc535674185" ] } } 44071 1727204618.95705: no more pending results, returning what we have 44071 1727204618.95709: results queue empty 44071 1727204618.95709: checking for any_errors_fatal 44071 1727204618.95711: done checking for any_errors_fatal 44071 1727204618.95712: checking for max_fail_percentage 44071 1727204618.95714: done checking for max_fail_percentage 44071 1727204618.95715: checking to see if all hosts have failed and the running result is not ok 44071 1727204618.95715: done checking to see if all hosts have failed 44071 1727204618.95716: getting the remaining hosts for this loop 44071 1727204618.95718: done getting the remaining hosts for this loop 44071 1727204618.95723: getting the next task for host managed-node2 44071 1727204618.95732: done getting next task for host managed-node2 44071 1727204618.95736: ^ task is: TASK: Asserts 44071 1727204618.95741: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204618.95747: getting variables 44071 1727204618.95748: in VariableManager get_vars() 44071 1727204618.96023: Calling all_inventory to load vars for managed-node2 44071 1727204618.96026: Calling groups_inventory to load vars for managed-node2 44071 1727204618.96030: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204618.96044: Calling all_plugins_play to load vars for managed-node2 44071 1727204618.96046: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204618.96050: Calling groups_plugins_play to load vars for managed-node2 44071 1727204618.96826: done sending task result for task 127b8e07-fff9-c964-7471-0000000006b2 44071 1727204618.96831: WORKER PROCESS EXITING 44071 1727204618.99738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204619.03290: done with get_vars() 44071 1727204619.03336: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Tuesday 24 September 2024 15:03:39 -0400 (0:00:00.114) 0:00:31.351 ***** 44071 1727204619.03472: entering _queue_task() for managed-node2/include_tasks 44071 1727204619.04305: worker is 1 (out of 1 available) 44071 1727204619.04457: exiting _queue_task() for managed-node2/include_tasks 44071 1727204619.04471: done queuing things up, now waiting for results queue to drain 44071 1727204619.04473: waiting for pending results... 44071 1727204619.04931: running TaskExecutor() for managed-node2/TASK: Asserts 44071 1727204619.05070: in run() - task 127b8e07-fff9-c964-7471-0000000005b9 44071 1727204619.05172: variable 'ansible_search_path' from source: unknown 44071 1727204619.05183: variable 'ansible_search_path' from source: unknown 44071 1727204619.05238: variable 'lsr_assert' from source: include params 44071 1727204619.05774: variable 'lsr_assert' from source: include params 44071 1727204619.06073: variable 'omit' from source: magic vars 44071 1727204619.06213: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204619.06577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204619.06581: variable 'omit' from source: magic vars 44071 1727204619.06953: variable 'ansible_distribution_major_version' from source: facts 44071 1727204619.07043: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204619.07057: variable 'item' from source: unknown 44071 1727204619.07144: variable 'item' from source: unknown 44071 1727204619.07192: variable 'item' from source: unknown 44071 1727204619.07269: variable 'item' from source: unknown 44071 1727204619.07813: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204619.07819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204619.07822: variable 'omit' from source: magic vars 44071 1727204619.07876: variable 'ansible_distribution_major_version' from source: facts 44071 1727204619.08041: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204619.08053: variable 'item' from source: unknown 44071 1727204619.08133: variable 'item' from source: unknown 44071 1727204619.08174: variable 'item' from source: unknown 44071 1727204619.08248: variable 'item' from source: unknown 44071 1727204619.08350: dumping result to json 44071 1727204619.08361: 
done dumping result, returning 44071 1727204619.08377: done running TaskExecutor() for managed-node2/TASK: Asserts [127b8e07-fff9-c964-7471-0000000005b9] 44071 1727204619.08389: sending task result for task 127b8e07-fff9-c964-7471-0000000005b9 44071 1727204619.08522: done sending task result for task 127b8e07-fff9-c964-7471-0000000005b9 44071 1727204619.08527: WORKER PROCESS EXITING 44071 1727204619.08557: no more pending results, returning what we have 44071 1727204619.08563: in VariableManager get_vars() 44071 1727204619.08609: Calling all_inventory to load vars for managed-node2 44071 1727204619.08612: Calling groups_inventory to load vars for managed-node2 44071 1727204619.08616: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204619.08632: Calling all_plugins_play to load vars for managed-node2 44071 1727204619.08635: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204619.08638: Calling groups_plugins_play to load vars for managed-node2 44071 1727204619.11462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204619.14126: done with get_vars() 44071 1727204619.14159: variable 'ansible_search_path' from source: unknown 44071 1727204619.14161: variable 'ansible_search_path' from source: unknown 44071 1727204619.14224: variable 'ansible_search_path' from source: unknown 44071 1727204619.14226: variable 'ansible_search_path' from source: unknown 44071 1727204619.14260: we have included files to process 44071 1727204619.14261: generating all_blocks data 44071 1727204619.14264: done generating all_blocks data 44071 1727204619.14270: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 44071 1727204619.14272: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 44071 1727204619.14275: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 44071 1727204619.14398: in VariableManager get_vars() 44071 1727204619.14421: done with get_vars() 44071 1727204619.14550: done processing included file 44071 1727204619.14552: iterating over new_blocks loaded from include file 44071 1727204619.14553: in VariableManager get_vars() 44071 1727204619.14570: done with get_vars() 44071 1727204619.14572: filtering new block on tags 44071 1727204619.14609: done filtering new block on tags 44071 1727204619.14612: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node2 => (item=tasks/assert_device_absent.yml) 44071 1727204619.14617: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 44071 1727204619.14618: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 44071 1727204619.14621: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 44071 1727204619.14734: in VariableManager get_vars() 44071 1727204619.14755: done with get_vars() 44071 1727204619.15018: done processing included file 44071 1727204619.15020: 
iterating over new_blocks loaded from include file 44071 1727204619.15022: in VariableManager get_vars() 44071 1727204619.15037: done with get_vars() 44071 1727204619.15039: filtering new block on tags 44071 1727204619.15097: done filtering new block on tags 44071 1727204619.15100: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node2 => (item=tasks/assert_profile_present.yml) 44071 1727204619.15105: extending task lists for all hosts with included blocks 44071 1727204619.16313: done extending task lists 44071 1727204619.16316: done processing included files 44071 1727204619.16317: results queue empty 44071 1727204619.16317: checking for any_errors_fatal 44071 1727204619.16324: done checking for any_errors_fatal 44071 1727204619.16325: checking for max_fail_percentage 44071 1727204619.16326: done checking for max_fail_percentage 44071 1727204619.16327: checking to see if all hosts have failed and the running result is not ok 44071 1727204619.16328: done checking to see if all hosts have failed 44071 1727204619.16328: getting the remaining hosts for this loop 44071 1727204619.16330: done getting the remaining hosts for this loop 44071 1727204619.16332: getting the next task for host managed-node2 44071 1727204619.16337: done getting next task for host managed-node2 44071 1727204619.16340: ^ task is: TASK: Include the task 'get_interface_stat.yml' 44071 1727204619.16343: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204619.16352: getting variables 44071 1727204619.16353: in VariableManager get_vars() 44071 1727204619.16370: Calling all_inventory to load vars for managed-node2 44071 1727204619.16373: Calling groups_inventory to load vars for managed-node2 44071 1727204619.16376: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204619.16384: Calling all_plugins_play to load vars for managed-node2 44071 1727204619.16386: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204619.16390: Calling groups_plugins_play to load vars for managed-node2 44071 1727204619.17928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204619.20098: done with get_vars() 44071 1727204619.20137: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 15:03:39 -0400 (0:00:00.167) 0:00:31.518 ***** 44071 1727204619.20230: entering _queue_task() for managed-node2/include_tasks 44071 1727204619.20624: worker is 1 (out of 1 available) 44071 1727204619.20639: exiting _queue_task() for managed-node2/include_tasks 44071 1727204619.20655: done queuing things up, now waiting for results queue to drain 44071 1727204619.20657: waiting for pending results... 44071 1727204619.21091: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 44071 1727204619.21131: in run() - task 127b8e07-fff9-c964-7471-0000000008a8 44071 1727204619.21155: variable 'ansible_search_path' from source: unknown 44071 1727204619.21163: variable 'ansible_search_path' from source: unknown 44071 1727204619.21217: calling self._execute() 44071 1727204619.21336: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204619.21349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204619.21363: variable 'omit' from source: magic vars 44071 1727204619.21860: variable 'ansible_distribution_major_version' from source: facts 44071 1727204619.21865: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204619.21869: _execute() done 44071 1727204619.21873: dumping result to json 44071 1727204619.21876: done dumping result, returning 44071 1727204619.21878: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [127b8e07-fff9-c964-7471-0000000008a8] 44071 1727204619.21883: sending task result for task 127b8e07-fff9-c964-7471-0000000008a8 44071 1727204619.22143: no more pending results, returning what we have 44071 1727204619.22151: in VariableManager get_vars() 44071 1727204619.22196: Calling all_inventory to load vars for managed-node2 44071 1727204619.22200: Calling groups_inventory to load vars for managed-node2 44071 1727204619.22204: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204619.22222: Calling all_plugins_play to load vars for managed-node2 44071 1727204619.22225: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204619.22229: Calling groups_plugins_play to load vars for managed-node2 44071 1727204619.22786: done sending task result for task 127b8e07-fff9-c964-7471-0000000008a8 44071 1727204619.22790: WORKER PROCESS EXITING 44071 1727204619.24318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 44071 1727204619.26423: done with get_vars() 44071 1727204619.26455: variable 'ansible_search_path' from source: unknown 44071 1727204619.26456: variable 'ansible_search_path' from source: unknown 44071 1727204619.26470: variable 'item' from source: include params 44071 1727204619.26595: variable 'item' from source: include params 44071 1727204619.26633: we have included files to process 44071 1727204619.26634: generating all_blocks data 44071 1727204619.26635: done generating all_blocks data 44071 1727204619.26636: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44071 1727204619.26638: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44071 1727204619.26640: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44071 1727204619.26834: done processing included file 44071 1727204619.26836: iterating over new_blocks loaded from include file 44071 1727204619.26837: in VariableManager get_vars() 44071 1727204619.26854: done with get_vars() 44071 1727204619.26856: filtering new block on tags 44071 1727204619.26887: done filtering new block on tags 44071 1727204619.26889: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 44071 1727204619.26895: extending task lists for all hosts with included blocks 44071 1727204619.27064: done extending task lists 44071 1727204619.27067: done processing included files 44071 1727204619.27067: results queue empty 44071 1727204619.27068: checking for any_errors_fatal 44071 1727204619.27072: done checking for any_errors_fatal 44071 1727204619.27072: checking for max_fail_percentage 44071 1727204619.27073: done checking for max_fail_percentage 44071 1727204619.27074: checking to see if all hosts have failed and the running result is not ok 44071 1727204619.27075: done checking to see if all hosts have failed 44071 1727204619.27076: getting the remaining hosts for this loop 44071 1727204619.27077: done getting the remaining hosts for this loop 44071 1727204619.27080: getting the next task for host managed-node2 44071 1727204619.27084: done getting next task for host managed-node2 44071 1727204619.27086: ^ task is: TASK: Get stat for interface {{ interface }} 44071 1727204619.27090: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204619.27092: getting variables 44071 1727204619.27093: in VariableManager get_vars() 44071 1727204619.27104: Calling all_inventory to load vars for managed-node2 44071 1727204619.27106: Calling groups_inventory to load vars for managed-node2 44071 1727204619.27108: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204619.27115: Calling all_plugins_play to load vars for managed-node2 44071 1727204619.27117: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204619.27120: Calling groups_plugins_play to load vars for managed-node2 44071 1727204619.28644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204619.30812: done with get_vars() 44071 1727204619.30848: done getting variables 44071 1727204619.30993: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:03:39 -0400 (0:00:00.107) 0:00:31.626 ***** 44071 1727204619.31025: entering _queue_task() for managed-node2/stat 44071 1727204619.31416: worker is 1 (out of 1 available) 44071 1727204619.31429: exiting _queue_task() for managed-node2/stat 44071 1727204619.31444: done queuing things up, now waiting for results queue to drain 44071 1727204619.31446: waiting for pending results... 44071 1727204619.31763: running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr 44071 1727204619.31938: in run() - task 127b8e07-fff9-c964-7471-000000000928 44071 1727204619.31967: variable 'ansible_search_path' from source: unknown 44071 1727204619.31977: variable 'ansible_search_path' from source: unknown 44071 1727204619.32026: calling self._execute() 44071 1727204619.32137: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204619.32151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204619.32170: variable 'omit' from source: magic vars 44071 1727204619.32594: variable 'ansible_distribution_major_version' from source: facts 44071 1727204619.32614: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204619.32627: variable 'omit' from source: magic vars 44071 1727204619.32698: variable 'omit' from source: magic vars 44071 1727204619.32819: variable 'interface' from source: play vars 44071 1727204619.32846: variable 'omit' from source: magic vars 44071 1727204619.32900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204619.32949: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204619.32977: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204619.33003: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204619.33020: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204619.33060: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 
1727204619.33073: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204619.33082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204619.33209: Set connection var ansible_connection to ssh 44071 1727204619.33222: Set connection var ansible_timeout to 10 44071 1727204619.33248: Set connection var ansible_pipelining to False 44071 1727204619.33251: Set connection var ansible_shell_type to sh 44071 1727204619.33358: Set connection var ansible_shell_executable to /bin/sh 44071 1727204619.33361: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204619.33364: variable 'ansible_shell_executable' from source: unknown 44071 1727204619.33369: variable 'ansible_connection' from source: unknown 44071 1727204619.33372: variable 'ansible_module_compression' from source: unknown 44071 1727204619.33374: variable 'ansible_shell_type' from source: unknown 44071 1727204619.33376: variable 'ansible_shell_executable' from source: unknown 44071 1727204619.33378: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204619.33381: variable 'ansible_pipelining' from source: unknown 44071 1727204619.33383: variable 'ansible_timeout' from source: unknown 44071 1727204619.33385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204619.33585: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204619.33603: variable 'omit' from source: magic vars 44071 1727204619.33613: starting attempt loop 44071 1727204619.33620: running the handler 44071 1727204619.33642: _low_level_execute_command(): starting 44071 1727204619.33656: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204619.34456: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204619.34489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204619.34569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204619.34625: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204619.34650: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204619.34681: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204619.34789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204619.36581: stdout chunk (state=3): >>>/root <<< 44071 1727204619.36795: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204619.36799: stdout chunk (state=3): >>><<< 44071 1727204619.36802: stderr chunk (state=3): >>><<< 44071 1727204619.36831: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204619.36902: _low_level_execute_command(): starting 44071 1727204619.36906: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204619.3684013-45870-60944344297834 `" && echo ansible-tmp-1727204619.3684013-45870-60944344297834="` echo /root/.ansible/tmp/ansible-tmp-1727204619.3684013-45870-60944344297834 `" ) && sleep 0' 44071 1727204619.37716: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204619.37759: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204619.37877: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204619.39903: stdout chunk (state=3): >>>ansible-tmp-1727204619.3684013-45870-60944344297834=/root/.ansible/tmp/ansible-tmp-1727204619.3684013-45870-60944344297834 <<< 44071 1727204619.40181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204619.40186: stdout chunk (state=3): >>><<< 44071 1727204619.40189: stderr chunk (state=3): >>><<< 
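
At this point the remote temporary directory for the stat module has been created and the AnsiballZ_stat.py payload is about to be transferred and executed. Based on the module arguments logged further below (path /sys/class/net/statebr with attribute, checksum and MIME collection disabled) and on the interface_stat variable consumed by the later assert, the included get_interface_stat.yml task is most likely equivalent to the following sketch; this is reconstructed from the log, not quoted from the collection source, and the register name is inferred rather than confirmed:

- name: Get stat for interface {{ interface }}
  stat:
    get_attributes: false
    get_checksum: false
    get_mime: false
    path: /sys/class/net/{{ interface }}
  register: interface_stat

With interface set to statebr (the log reports "variable 'interface' from source: play vars"), this renders the task name "Get stat for interface statebr" and the /sys/class/net/statebr path seen in the module invocation that follows.
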
44071 1727204619.40199: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204619.3684013-45870-60944344297834=/root/.ansible/tmp/ansible-tmp-1727204619.3684013-45870-60944344297834 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204619.40574: variable 'ansible_module_compression' from source: unknown 44071 1727204619.40578: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 44071 1727204619.40613: variable 'ansible_facts' from source: unknown 44071 1727204619.40778: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204619.3684013-45870-60944344297834/AnsiballZ_stat.py 44071 1727204619.41216: Sending initial data 44071 1727204619.41220: Sent initial data (152 bytes) 44071 1727204619.41859: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204619.41972: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204619.41976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204619.42022: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204619.42050: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204619.42074: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204619.42225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204619.43810: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 44071 1727204619.43833: stderr chunk (state=3): 
>>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 44071 1727204619.43854: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204619.43963: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204619.44056: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpd4vs6avc /root/.ansible/tmp/ansible-tmp-1727204619.3684013-45870-60944344297834/AnsiballZ_stat.py <<< 44071 1727204619.44059: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204619.3684013-45870-60944344297834/AnsiballZ_stat.py" <<< 44071 1727204619.44123: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpd4vs6avc" to remote "/root/.ansible/tmp/ansible-tmp-1727204619.3684013-45870-60944344297834/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204619.3684013-45870-60944344297834/AnsiballZ_stat.py" <<< 44071 1727204619.45419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204619.45424: stdout chunk (state=3): >>><<< 44071 1727204619.45426: stderr chunk (state=3): >>><<< 44071 1727204619.45428: done transferring module to remote 44071 1727204619.45430: _low_level_execute_command(): starting 44071 1727204619.45433: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204619.3684013-45870-60944344297834/ /root/.ansible/tmp/ansible-tmp-1727204619.3684013-45870-60944344297834/AnsiballZ_stat.py && sleep 0' 44071 1727204619.46991: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204619.47255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204619.47570: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204619.47604: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204619.49538: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204619.49561: stderr chunk (state=3): >>><<< 44071 1727204619.49732: stdout chunk (state=3): >>><<< 44071 1727204619.49739: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204619.49747: _low_level_execute_command(): starting 44071 1727204619.49750: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204619.3684013-45870-60944344297834/AnsiballZ_stat.py && sleep 0' 44071 1727204619.50953: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204619.51241: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204619.51282: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204619.51383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204619.67927: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 44071 1727204619.69318: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 
closed. <<< 44071 1727204619.69405: stderr chunk (state=3): >>><<< 44071 1727204619.69409: stdout chunk (state=3): >>><<< 44071 1727204619.69597: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204619.69602: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204619.3684013-45870-60944344297834/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204619.69605: _low_level_execute_command(): starting 44071 1727204619.69607: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204619.3684013-45870-60944344297834/ > /dev/null 2>&1 && sleep 0' 44071 1727204619.70975: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204619.70995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204619.71107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204619.71124: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204619.71227: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204619.73392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204619.73531: stderr chunk (state=3): >>><<< 44071 1727204619.73542: stdout chunk (state=3): >>><<< 44071 1727204619.73672: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204619.73676: handler run complete 44071 1727204619.73678: attempt loop complete, returning result 44071 1727204619.73686: _execute() done 44071 1727204619.73694: dumping result to json 44071 1727204619.73873: done dumping result, returning 44071 1727204619.73876: done running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr [127b8e07-fff9-c964-7471-000000000928] 44071 1727204619.73879: sending task result for task 127b8e07-fff9-c964-7471-000000000928 ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 44071 1727204619.74149: no more pending results, returning what we have 44071 1727204619.74153: results queue empty 44071 1727204619.74154: checking for any_errors_fatal 44071 1727204619.74156: done checking for any_errors_fatal 44071 1727204619.74157: checking for max_fail_percentage 44071 1727204619.74158: done checking for max_fail_percentage 44071 1727204619.74160: checking to see if all hosts have failed and the running result is not ok 44071 1727204619.74161: done checking to see if all hosts have failed 44071 1727204619.74161: getting the remaining hosts for this loop 44071 1727204619.74163: done getting the remaining hosts for this loop 44071 1727204619.74171: getting the next task for host managed-node2 44071 1727204619.74182: done getting next task for host managed-node2 44071 1727204619.74186: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 44071 1727204619.74190: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204619.74197: getting variables 44071 1727204619.74199: in VariableManager get_vars() 44071 1727204619.74233: Calling all_inventory to load vars for managed-node2 44071 1727204619.74235: Calling groups_inventory to load vars for managed-node2 44071 1727204619.74239: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204619.74252: Calling all_plugins_play to load vars for managed-node2 44071 1727204619.74255: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204619.74257: Calling groups_plugins_play to load vars for managed-node2 44071 1727204619.75176: done sending task result for task 127b8e07-fff9-c964-7471-000000000928 44071 1727204619.75181: WORKER PROCESS EXITING 44071 1727204619.78156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204619.83169: done with get_vars() 44071 1727204619.83208: done getting variables 44071 1727204619.83486: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204619.83619: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 15:03:39 -0400 (0:00:00.526) 0:00:32.153 ***** 44071 1727204619.83655: entering _queue_task() for managed-node2/assert 44071 1727204619.84461: worker is 1 (out of 1 available) 44071 1727204619.84477: exiting _queue_task() for managed-node2/assert 44071 1727204619.84492: done queuing things up, now waiting for results queue to drain 44071 1727204619.84494: waiting for pending results... 
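
The assert task queued here comes from assert_device_absent.yml:5 and, per the handler trace below, passes because the conditional (not interface_stat.stat.exists) evaluates to True against the stat result gathered above. A minimal sketch of an equivalent assertion, reconstructed from the logged task name and conditional (the fail_msg is a hypothetical addition; the log only records "All assertions passed"):

- name: Assert that the interface is absent - '{{ interface }}'
  assert:
    that:
      - not interface_stat.stat.exists
    # hypothetical message, not visible in the log
    fail_msg: Interface {{ interface }} is still present on the managed node

Because /sys/class/net/statebr does not exist on managed-node2 (the stat result returned exists: false), the assertion succeeds and the task reports ok.
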
44071 1727204619.85093: running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'statebr' 44071 1727204619.85416: in run() - task 127b8e07-fff9-c964-7471-0000000008a9 44071 1727204619.85490: variable 'ansible_search_path' from source: unknown 44071 1727204619.85495: variable 'ansible_search_path' from source: unknown 44071 1727204619.85651: calling self._execute() 44071 1727204619.85808: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204619.85815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204619.85862: variable 'omit' from source: magic vars 44071 1727204619.86721: variable 'ansible_distribution_major_version' from source: facts 44071 1727204619.86856: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204619.86862: variable 'omit' from source: magic vars 44071 1727204619.86915: variable 'omit' from source: magic vars 44071 1727204619.87166: variable 'interface' from source: play vars 44071 1727204619.87287: variable 'omit' from source: magic vars 44071 1727204619.87371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204619.87480: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204619.87508: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204619.87527: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204619.87541: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204619.87580: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204619.87584: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204619.87587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204619.88035: Set connection var ansible_connection to ssh 44071 1727204619.88039: Set connection var ansible_timeout to 10 44071 1727204619.88042: Set connection var ansible_pipelining to False 44071 1727204619.88044: Set connection var ansible_shell_type to sh 44071 1727204619.88046: Set connection var ansible_shell_executable to /bin/sh 44071 1727204619.88049: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204619.88051: variable 'ansible_shell_executable' from source: unknown 44071 1727204619.88053: variable 'ansible_connection' from source: unknown 44071 1727204619.88056: variable 'ansible_module_compression' from source: unknown 44071 1727204619.88058: variable 'ansible_shell_type' from source: unknown 44071 1727204619.88060: variable 'ansible_shell_executable' from source: unknown 44071 1727204619.88062: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204619.88064: variable 'ansible_pipelining' from source: unknown 44071 1727204619.88068: variable 'ansible_timeout' from source: unknown 44071 1727204619.88071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204619.88295: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 44071 1727204619.88305: variable 'omit' from source: magic vars 44071 1727204619.88311: starting attempt loop 44071 1727204619.88314: running the handler 44071 1727204619.88929: variable 'interface_stat' from source: set_fact 44071 1727204619.88939: Evaluated conditional (not interface_stat.stat.exists): True 44071 1727204619.88951: handler run complete 44071 1727204619.89021: attempt loop complete, returning result 44071 1727204619.89025: _execute() done 44071 1727204619.89027: dumping result to json 44071 1727204619.89030: done dumping result, returning 44071 1727204619.89071: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'statebr' [127b8e07-fff9-c964-7471-0000000008a9] 44071 1727204619.89075: sending task result for task 127b8e07-fff9-c964-7471-0000000008a9 44071 1727204619.89372: done sending task result for task 127b8e07-fff9-c964-7471-0000000008a9 44071 1727204619.89376: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 44071 1727204619.89434: no more pending results, returning what we have 44071 1727204619.89439: results queue empty 44071 1727204619.89440: checking for any_errors_fatal 44071 1727204619.89450: done checking for any_errors_fatal 44071 1727204619.89451: checking for max_fail_percentage 44071 1727204619.89452: done checking for max_fail_percentage 44071 1727204619.89453: checking to see if all hosts have failed and the running result is not ok 44071 1727204619.89454: done checking to see if all hosts have failed 44071 1727204619.89455: getting the remaining hosts for this loop 44071 1727204619.89456: done getting the remaining hosts for this loop 44071 1727204619.89462: getting the next task for host managed-node2 44071 1727204619.89475: done getting next task for host managed-node2 44071 1727204619.89478: ^ task is: TASK: Include the task 'get_profile_stat.yml' 44071 1727204619.89482: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204619.89487: getting variables 44071 1727204619.89488: in VariableManager get_vars() 44071 1727204619.89524: Calling all_inventory to load vars for managed-node2 44071 1727204619.89527: Calling groups_inventory to load vars for managed-node2 44071 1727204619.89531: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204619.89543: Calling all_plugins_play to load vars for managed-node2 44071 1727204619.89546: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204619.89549: Calling groups_plugins_play to load vars for managed-node2 44071 1727204619.93135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204619.97574: done with get_vars() 44071 1727204619.97620: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 15:03:39 -0400 (0:00:00.142) 0:00:32.295 ***** 44071 1727204619.97946: entering _queue_task() for managed-node2/include_tasks 44071 1727204619.98545: worker is 1 (out of 1 available) 44071 1727204619.98561: exiting _queue_task() for managed-node2/include_tasks 44071 1727204619.98778: done queuing things up, now waiting for results queue to drain 44071 1727204619.98781: waiting for pending results... 44071 1727204619.99196: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 44071 1727204619.99534: in run() - task 127b8e07-fff9-c964-7471-0000000008ad 44071 1727204619.99539: variable 'ansible_search_path' from source: unknown 44071 1727204619.99542: variable 'ansible_search_path' from source: unknown 44071 1727204619.99598: calling self._execute() 44071 1727204619.99821: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204619.99825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204619.99836: variable 'omit' from source: magic vars 44071 1727204620.00973: variable 'ansible_distribution_major_version' from source: facts 44071 1727204620.00978: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204620.00980: _execute() done 44071 1727204620.00983: dumping result to json 44071 1727204620.00986: done dumping result, returning 44071 1727204620.00988: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [127b8e07-fff9-c964-7471-0000000008ad] 44071 1727204620.00990: sending task result for task 127b8e07-fff9-c964-7471-0000000008ad 44071 1727204620.01074: done sending task result for task 127b8e07-fff9-c964-7471-0000000008ad 44071 1727204620.01077: WORKER PROCESS EXITING 44071 1727204620.01110: no more pending results, returning what we have 44071 1727204620.01116: in VariableManager get_vars() 44071 1727204620.01160: Calling all_inventory to load vars for managed-node2 44071 1727204620.01163: Calling groups_inventory to load vars for managed-node2 44071 1727204620.01168: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204620.01185: Calling all_plugins_play to load vars for managed-node2 44071 1727204620.01188: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204620.01190: Calling groups_plugins_play to load vars for managed-node2 44071 1727204620.05139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 44071 1727204620.08549: done with get_vars() 44071 1727204620.08581: variable 'ansible_search_path' from source: unknown 44071 1727204620.08582: variable 'ansible_search_path' from source: unknown 44071 1727204620.08592: variable 'item' from source: include params 44071 1727204620.08680: variable 'item' from source: include params 44071 1727204620.08707: we have included files to process 44071 1727204620.08708: generating all_blocks data 44071 1727204620.08709: done generating all_blocks data 44071 1727204620.08712: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 44071 1727204620.08714: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 44071 1727204620.08715: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 44071 1727204620.09393: done processing included file 44071 1727204620.09395: iterating over new_blocks loaded from include file 44071 1727204620.09396: in VariableManager get_vars() 44071 1727204620.09410: done with get_vars() 44071 1727204620.09412: filtering new block on tags 44071 1727204620.09462: done filtering new block on tags 44071 1727204620.09466: in VariableManager get_vars() 44071 1727204620.09477: done with get_vars() 44071 1727204620.09478: filtering new block on tags 44071 1727204620.09515: done filtering new block on tags 44071 1727204620.09516: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 44071 1727204620.09521: extending task lists for all hosts with included blocks 44071 1727204620.09858: done extending task lists 44071 1727204620.09859: done processing included files 44071 1727204620.09860: results queue empty 44071 1727204620.09861: checking for any_errors_fatal 44071 1727204620.09867: done checking for any_errors_fatal 44071 1727204620.09868: checking for max_fail_percentage 44071 1727204620.09869: done checking for max_fail_percentage 44071 1727204620.09870: checking to see if all hosts have failed and the running result is not ok 44071 1727204620.09871: done checking to see if all hosts have failed 44071 1727204620.09871: getting the remaining hosts for this loop 44071 1727204620.09873: done getting the remaining hosts for this loop 44071 1727204620.09876: getting the next task for host managed-node2 44071 1727204620.09881: done getting next task for host managed-node2 44071 1727204620.09883: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 44071 1727204620.09887: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204620.09890: getting variables 44071 1727204620.09891: in VariableManager get_vars() 44071 1727204620.09902: Calling all_inventory to load vars for managed-node2 44071 1727204620.09905: Calling groups_inventory to load vars for managed-node2 44071 1727204620.09907: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204620.09915: Calling all_plugins_play to load vars for managed-node2 44071 1727204620.09917: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204620.09921: Calling groups_plugins_play to load vars for managed-node2 44071 1727204620.11924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204620.13119: done with get_vars() 44071 1727204620.13147: done getting variables 44071 1727204620.13190: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 15:03:40 -0400 (0:00:00.152) 0:00:32.448 ***** 44071 1727204620.13216: entering _queue_task() for managed-node2/set_fact 44071 1727204620.13525: worker is 1 (out of 1 available) 44071 1727204620.13540: exiting _queue_task() for managed-node2/set_fact 44071 1727204620.13555: done queuing things up, now waiting for results queue to drain 44071 1727204620.13557: waiting for pending results... 
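
For context, the task queued here at get_profile_stat.yml:3 is a plain set_fact that seeds three flags to false. Below is a minimal sketch of what that task plausibly looks like, inferred only from the ansible_facts printed in its result further down; the real file in fedora.linux_system_roles may name or order things differently.

  # Hypothetical reconstruction, inferred from the logged result; not copied from the collection
  - name: Initialize NM profile exist and ansible_managed comment flag
    set_fact:
      lsr_net_profile_exists: false
      lsr_net_profile_ansible_managed: false
      lsr_net_profile_fingerprint: false
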
44071 1727204620.13863: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 44071 1727204620.14046: in run() - task 127b8e07-fff9-c964-7471-000000000946 44071 1727204620.14089: variable 'ansible_search_path' from source: unknown 44071 1727204620.14126: variable 'ansible_search_path' from source: unknown 44071 1727204620.14180: calling self._execute() 44071 1727204620.14289: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204620.14302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204620.14372: variable 'omit' from source: magic vars 44071 1727204620.14781: variable 'ansible_distribution_major_version' from source: facts 44071 1727204620.14802: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204620.14814: variable 'omit' from source: magic vars 44071 1727204620.14886: variable 'omit' from source: magic vars 44071 1727204620.14936: variable 'omit' from source: magic vars 44071 1727204620.14991: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204620.15042: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204620.15078: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204620.15105: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204620.15123: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204620.15160: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204620.15182: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204620.15187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204620.15400: Set connection var ansible_connection to ssh 44071 1727204620.15404: Set connection var ansible_timeout to 10 44071 1727204620.15407: Set connection var ansible_pipelining to False 44071 1727204620.15409: Set connection var ansible_shell_type to sh 44071 1727204620.15411: Set connection var ansible_shell_executable to /bin/sh 44071 1727204620.15413: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204620.15415: variable 'ansible_shell_executable' from source: unknown 44071 1727204620.15418: variable 'ansible_connection' from source: unknown 44071 1727204620.15420: variable 'ansible_module_compression' from source: unknown 44071 1727204620.15422: variable 'ansible_shell_type' from source: unknown 44071 1727204620.15425: variable 'ansible_shell_executable' from source: unknown 44071 1727204620.15427: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204620.15429: variable 'ansible_pipelining' from source: unknown 44071 1727204620.15431: variable 'ansible_timeout' from source: unknown 44071 1727204620.15433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204620.15598: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204620.15625: variable 
'omit' from source: magic vars 44071 1727204620.15635: starting attempt loop 44071 1727204620.15641: running the handler 44071 1727204620.15663: handler run complete 44071 1727204620.15682: attempt loop complete, returning result 44071 1727204620.15689: _execute() done 44071 1727204620.15696: dumping result to json 44071 1727204620.15703: done dumping result, returning 44071 1727204620.15713: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [127b8e07-fff9-c964-7471-000000000946] 44071 1727204620.15728: sending task result for task 127b8e07-fff9-c964-7471-000000000946 ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 44071 1727204620.16052: no more pending results, returning what we have 44071 1727204620.16056: results queue empty 44071 1727204620.16058: checking for any_errors_fatal 44071 1727204620.16060: done checking for any_errors_fatal 44071 1727204620.16061: checking for max_fail_percentage 44071 1727204620.16063: done checking for max_fail_percentage 44071 1727204620.16064: checking to see if all hosts have failed and the running result is not ok 44071 1727204620.16066: done checking to see if all hosts have failed 44071 1727204620.16070: getting the remaining hosts for this loop 44071 1727204620.16071: done getting the remaining hosts for this loop 44071 1727204620.16078: getting the next task for host managed-node2 44071 1727204620.16088: done getting next task for host managed-node2 44071 1727204620.16092: ^ task is: TASK: Stat profile file 44071 1727204620.16098: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204620.16103: getting variables 44071 1727204620.16104: in VariableManager get_vars() 44071 1727204620.16142: Calling all_inventory to load vars for managed-node2 44071 1727204620.16146: Calling groups_inventory to load vars for managed-node2 44071 1727204620.16150: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204620.16286: Calling all_plugins_play to load vars for managed-node2 44071 1727204620.16290: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204620.16294: Calling groups_plugins_play to load vars for managed-node2 44071 1727204620.17086: done sending task result for task 127b8e07-fff9-c964-7471-000000000946 44071 1727204620.17091: WORKER PROCESS EXITING 44071 1727204620.18232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204620.21454: done with get_vars() 44071 1727204620.21496: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 15:03:40 -0400 (0:00:00.085) 0:00:32.534 ***** 44071 1727204620.21805: entering _queue_task() for managed-node2/stat 44071 1727204620.22589: worker is 1 (out of 1 available) 44071 1727204620.22608: exiting _queue_task() for managed-node2/stat 44071 1727204620.22623: done queuing things up, now waiting for results queue to drain 44071 1727204620.22625: waiting for pending results... 44071 1727204620.22878: running TaskExecutor() for managed-node2/TASK: Stat profile file 44071 1727204620.23017: in run() - task 127b8e07-fff9-c964-7471-000000000947 44071 1727204620.23033: variable 'ansible_search_path' from source: unknown 44071 1727204620.23038: variable 'ansible_search_path' from source: unknown 44071 1727204620.23081: calling self._execute() 44071 1727204620.23191: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204620.23198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204620.23209: variable 'omit' from source: magic vars 44071 1727204620.23681: variable 'ansible_distribution_major_version' from source: facts 44071 1727204620.23694: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204620.23700: variable 'omit' from source: magic vars 44071 1727204620.23776: variable 'omit' from source: magic vars 44071 1727204620.23895: variable 'profile' from source: play vars 44071 1727204620.23899: variable 'interface' from source: play vars 44071 1727204620.23971: variable 'interface' from source: play vars 44071 1727204620.24000: variable 'omit' from source: magic vars 44071 1727204620.24054: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204620.24111: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204620.24133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204620.24155: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204620.24169: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204620.24212: variable 'inventory_hostname' from source: host vars for 
'managed-node2' 44071 1727204620.24215: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204620.24218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204620.24416: Set connection var ansible_connection to ssh 44071 1727204620.24419: Set connection var ansible_timeout to 10 44071 1727204620.24422: Set connection var ansible_pipelining to False 44071 1727204620.24424: Set connection var ansible_shell_type to sh 44071 1727204620.24427: Set connection var ansible_shell_executable to /bin/sh 44071 1727204620.24430: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204620.24432: variable 'ansible_shell_executable' from source: unknown 44071 1727204620.24434: variable 'ansible_connection' from source: unknown 44071 1727204620.24437: variable 'ansible_module_compression' from source: unknown 44071 1727204620.24439: variable 'ansible_shell_type' from source: unknown 44071 1727204620.24441: variable 'ansible_shell_executable' from source: unknown 44071 1727204620.24443: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204620.24448: variable 'ansible_pipelining' from source: unknown 44071 1727204620.24451: variable 'ansible_timeout' from source: unknown 44071 1727204620.24455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204620.24701: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204620.24768: variable 'omit' from source: magic vars 44071 1727204620.24772: starting attempt loop 44071 1727204620.24775: running the handler 44071 1727204620.24778: _low_level_execute_command(): starting 44071 1727204620.24780: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204620.26174: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204620.26203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204620.26262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204620.26279: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204620.26308: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204620.26403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204620.28203: stdout chunk (state=3): >>>/root <<< 44071 1727204620.28429: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 44071 1727204620.28436: stdout chunk (state=3): >>><<< 44071 1727204620.28451: stderr chunk (state=3): >>><<< 44071 1727204620.28590: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204620.28595: _low_level_execute_command(): starting 44071 1727204620.28599: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204620.2848344-45897-122063561732598 `" && echo ansible-tmp-1727204620.2848344-45897-122063561732598="` echo /root/.ansible/tmp/ansible-tmp-1727204620.2848344-45897-122063561732598 `" ) && sleep 0' 44071 1727204620.29842: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204620.29927: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204620.29939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204620.30031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204620.30035: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204620.30259: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204620.30348: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204620.30528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204620.32752: stdout chunk (state=3): 
>>>ansible-tmp-1727204620.2848344-45897-122063561732598=/root/.ansible/tmp/ansible-tmp-1727204620.2848344-45897-122063561732598 <<< 44071 1727204620.32814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204620.32916: stderr chunk (state=3): >>><<< 44071 1727204620.33014: stdout chunk (state=3): >>><<< 44071 1727204620.33023: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204620.2848344-45897-122063561732598=/root/.ansible/tmp/ansible-tmp-1727204620.2848344-45897-122063561732598 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204620.33083: variable 'ansible_module_compression' from source: unknown 44071 1727204620.33310: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 44071 1727204620.33383: variable 'ansible_facts' from source: unknown 44071 1727204620.33673: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204620.2848344-45897-122063561732598/AnsiballZ_stat.py 44071 1727204620.33892: Sending initial data 44071 1727204620.33895: Sent initial data (153 bytes) 44071 1727204620.34687: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204620.34716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204620.34727: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204620.34751: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 
1727204620.34862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204620.36601: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204620.36889: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204620.36973: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpp6qktsst /root/.ansible/tmp/ansible-tmp-1727204620.2848344-45897-122063561732598/AnsiballZ_stat.py <<< 44071 1727204620.36977: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204620.2848344-45897-122063561732598/AnsiballZ_stat.py" <<< 44071 1727204620.37101: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpp6qktsst" to remote "/root/.ansible/tmp/ansible-tmp-1727204620.2848344-45897-122063561732598/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204620.2848344-45897-122063561732598/AnsiballZ_stat.py" <<< 44071 1727204620.38354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204620.38511: stderr chunk (state=3): >>><<< 44071 1727204620.38523: stdout chunk (state=3): >>><<< 44071 1727204620.38575: done transferring module to remote 44071 1727204620.38595: _low_level_execute_command(): starting 44071 1727204620.38606: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204620.2848344-45897-122063561732598/ /root/.ansible/tmp/ansible-tmp-1727204620.2848344-45897-122063561732598/AnsiballZ_stat.py && sleep 0' 44071 1727204620.39339: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204620.39451: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204620.39485: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' <<< 44071 1727204620.39510: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204620.39577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204620.39677: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204620.41742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204620.41747: stdout chunk (state=3): >>><<< 44071 1727204620.41750: stderr chunk (state=3): >>><<< 44071 1727204620.41773: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204620.41783: _low_level_execute_command(): starting 44071 1727204620.41878: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204620.2848344-45897-122063561732598/AnsiballZ_stat.py && sleep 0' 44071 1727204620.42644: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204620.42674: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204620.42711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204620.42825: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204620.59974: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, 
"get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 44071 1727204620.61693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204620.61697: stderr chunk (state=3): >>><<< 44071 1727204620.61700: stdout chunk (state=3): >>><<< 44071 1727204620.61703: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
44071 1727204620.61705: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204620.2848344-45897-122063561732598/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204620.61707: _low_level_execute_command(): starting 44071 1727204620.61710: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204620.2848344-45897-122063561732598/ > /dev/null 2>&1 && sleep 0' 44071 1727204620.62495: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204620.62499: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204620.62575: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204620.62580: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204620.62585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204620.62716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204620.64903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204620.64913: stdout chunk (state=3): >>><<< 44071 1727204620.64916: stderr chunk (state=3): >>><<< 44071 1727204620.64968: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204620.64975: handler run complete 44071 1727204620.65010: attempt loop complete, returning result 44071 1727204620.65013: _execute() done 44071 1727204620.65016: dumping result to json 44071 1727204620.65018: done dumping result, returning 44071 1727204620.65020: done running TaskExecutor() for managed-node2/TASK: Stat profile file [127b8e07-fff9-c964-7471-000000000947] 44071 1727204620.65022: sending task result for task 127b8e07-fff9-c964-7471-000000000947 44071 1727204620.65200: done sending task result for task 127b8e07-fff9-c964-7471-000000000947 44071 1727204620.65203: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 44071 1727204620.65288: no more pending results, returning what we have 44071 1727204620.65292: results queue empty 44071 1727204620.65293: checking for any_errors_fatal 44071 1727204620.65300: done checking for any_errors_fatal 44071 1727204620.65301: checking for max_fail_percentage 44071 1727204620.65302: done checking for max_fail_percentage 44071 1727204620.65303: checking to see if all hosts have failed and the running result is not ok 44071 1727204620.65304: done checking to see if all hosts have failed 44071 1727204620.65305: getting the remaining hosts for this loop 44071 1727204620.65306: done getting the remaining hosts for this loop 44071 1727204620.65311: getting the next task for host managed-node2 44071 1727204620.65320: done getting next task for host managed-node2 44071 1727204620.65323: ^ task is: TASK: Set NM profile exist flag based on the profile files 44071 1727204620.65390: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204620.65397: getting variables 44071 1727204620.65399: in VariableManager get_vars() 44071 1727204620.65429: Calling all_inventory to load vars for managed-node2 44071 1727204620.65432: Calling groups_inventory to load vars for managed-node2 44071 1727204620.65435: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204620.65671: Calling all_plugins_play to load vars for managed-node2 44071 1727204620.65676: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204620.65680: Calling groups_plugins_play to load vars for managed-node2 44071 1727204620.67790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204620.70794: done with get_vars() 44071 1727204620.71002: done getting variables 44071 1727204620.71350: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:03:40 -0400 (0:00:00.496) 0:00:33.030 ***** 44071 1727204620.71433: entering _queue_task() for managed-node2/set_fact 44071 1727204620.72874: worker is 1 (out of 1 available) 44071 1727204620.72895: exiting _queue_task() for managed-node2/set_fact 44071 1727204620.72914: done queuing things up, now waiting for results queue to drain 44071 1727204620.72916: waiting for pending results... 
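
The next result shows this task being skipped with false_condition "profile_stat.stat.exists", so get_profile_stat.yml:17 is evidently a set_fact guarded by that condition. A plausible sketch follows, with the flag name inferred from the initialization task earlier in the log; the real task may set additional facts.

  # Hypothetical sketch; the skip reason below implies a 'when: profile_stat.stat.exists' guard
  - name: Set NM profile exist flag based on the profile files
    set_fact:
      lsr_net_profile_exists: true
    when: profile_stat.stat.exists
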
44071 1727204620.74209: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 44071 1727204620.74291: in run() - task 127b8e07-fff9-c964-7471-000000000948 44071 1727204620.74296: variable 'ansible_search_path' from source: unknown 44071 1727204620.74299: variable 'ansible_search_path' from source: unknown 44071 1727204620.74303: calling self._execute() 44071 1727204620.74393: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204620.74399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204620.74455: variable 'omit' from source: magic vars 44071 1727204620.75452: variable 'ansible_distribution_major_version' from source: facts 44071 1727204620.75457: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204620.75643: variable 'profile_stat' from source: set_fact 44071 1727204620.75648: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204620.75651: when evaluation is False, skipping this task 44071 1727204620.75871: _execute() done 44071 1727204620.75875: dumping result to json 44071 1727204620.75877: done dumping result, returning 44071 1727204620.75880: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [127b8e07-fff9-c964-7471-000000000948] 44071 1727204620.75883: sending task result for task 127b8e07-fff9-c964-7471-000000000948 44071 1727204620.75960: done sending task result for task 127b8e07-fff9-c964-7471-000000000948 44071 1727204620.75964: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204620.76015: no more pending results, returning what we have 44071 1727204620.76019: results queue empty 44071 1727204620.76020: checking for any_errors_fatal 44071 1727204620.76030: done checking for any_errors_fatal 44071 1727204620.76031: checking for max_fail_percentage 44071 1727204620.76032: done checking for max_fail_percentage 44071 1727204620.76033: checking to see if all hosts have failed and the running result is not ok 44071 1727204620.76034: done checking to see if all hosts have failed 44071 1727204620.76035: getting the remaining hosts for this loop 44071 1727204620.76036: done getting the remaining hosts for this loop 44071 1727204620.76043: getting the next task for host managed-node2 44071 1727204620.76051: done getting next task for host managed-node2 44071 1727204620.76054: ^ task is: TASK: Get NM profile info 44071 1727204620.76060: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204620.76064: getting variables 44071 1727204620.76181: in VariableManager get_vars() 44071 1727204620.76218: Calling all_inventory to load vars for managed-node2 44071 1727204620.76221: Calling groups_inventory to load vars for managed-node2 44071 1727204620.76225: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204620.76238: Calling all_plugins_play to load vars for managed-node2 44071 1727204620.76245: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204620.76249: Calling groups_plugins_play to load vars for managed-node2 44071 1727204620.80726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204620.85248: done with get_vars() 44071 1727204620.85351: done getting variables 44071 1727204620.85464: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:03:40 -0400 (0:00:00.140) 0:00:33.171 ***** 44071 1727204620.85510: entering _queue_task() for managed-node2/shell 44071 1727204620.86639: worker is 1 (out of 1 available) 44071 1727204620.86653: exiting _queue_task() for managed-node2/shell 44071 1727204620.86799: done queuing things up, now waiting for results queue to drain 44071 1727204620.86801: waiting for pending results... 
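
The "Get NM profile info" task at get_profile_stat.yml:25 resolves to the shell action, with the command action loaded underneath it, as the following entries show. The exact command line is not visible in this portion of the log, so the nmcli invocation in the sketch below is purely illustrative of the pattern and not the collection's actual command; the register name and ignore_errors are likewise assumptions.

  # Illustrative only: the real command is not shown in this part of the log
  - name: Get NM profile info
    shell: nmcli -f NAME connection show | grep "{{ profile }}"
    register: nm_profile_exists
    ignore_errors: true
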
44071 1727204620.87122: running TaskExecutor() for managed-node2/TASK: Get NM profile info 44071 1727204620.87875: in run() - task 127b8e07-fff9-c964-7471-000000000949 44071 1727204620.87880: variable 'ansible_search_path' from source: unknown 44071 1727204620.87883: variable 'ansible_search_path' from source: unknown 44071 1727204620.87886: calling self._execute() 44071 1727204620.87901: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204620.87992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204620.88013: variable 'omit' from source: magic vars 44071 1727204620.89385: variable 'ansible_distribution_major_version' from source: facts 44071 1727204620.89416: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204620.89434: variable 'omit' from source: magic vars 44071 1727204620.89727: variable 'omit' from source: magic vars 44071 1727204620.89901: variable 'profile' from source: play vars 44071 1727204620.90192: variable 'interface' from source: play vars 44071 1727204620.90195: variable 'interface' from source: play vars 44071 1727204620.90257: variable 'omit' from source: magic vars 44071 1727204620.90399: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204620.90523: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204620.90685: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204620.90724: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204620.90772: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204620.90826: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204620.91182: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204620.91185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204620.91371: Set connection var ansible_connection to ssh 44071 1727204620.91374: Set connection var ansible_timeout to 10 44071 1727204620.91377: Set connection var ansible_pipelining to False 44071 1727204620.91379: Set connection var ansible_shell_type to sh 44071 1727204620.91381: Set connection var ansible_shell_executable to /bin/sh 44071 1727204620.91384: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204620.91386: variable 'ansible_shell_executable' from source: unknown 44071 1727204620.91388: variable 'ansible_connection' from source: unknown 44071 1727204620.91391: variable 'ansible_module_compression' from source: unknown 44071 1727204620.91393: variable 'ansible_shell_type' from source: unknown 44071 1727204620.91395: variable 'ansible_shell_executable' from source: unknown 44071 1727204620.91397: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204620.91399: variable 'ansible_pipelining' from source: unknown 44071 1727204620.91401: variable 'ansible_timeout' from source: unknown 44071 1727204620.91403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204620.91681: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204620.91972: variable 'omit' from source: magic vars 44071 1727204620.91975: starting attempt loop 44071 1727204620.91978: running the handler 44071 1727204620.91981: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204620.91984: _low_level_execute_command(): starting 44071 1727204620.91986: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204620.93501: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204620.93753: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204620.93883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204620.93920: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204620.93947: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204620.94060: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204620.95836: stdout chunk (state=3): >>>/root <<< 44071 1727204620.96168: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204620.96184: stdout chunk (state=3): >>><<< 44071 1727204620.96198: stderr chunk (state=3): >>><<< 44071 1727204620.96243: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204620.96281: _low_level_execute_command(): starting 44071 1727204620.96295: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204620.9626274-45993-142521473184353 `" && echo ansible-tmp-1727204620.9626274-45993-142521473184353="` echo /root/.ansible/tmp/ansible-tmp-1727204620.9626274-45993-142521473184353 `" ) && sleep 0' 44071 1727204620.97039: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204620.97072: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204620.97092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204620.97114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204620.97136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204620.97189: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204620.97250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204620.97275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204620.97303: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204620.97416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204620.99502: stdout chunk (state=3): >>>ansible-tmp-1727204620.9626274-45993-142521473184353=/root/.ansible/tmp/ansible-tmp-1727204620.9626274-45993-142521473184353 <<< 44071 1727204620.99704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204620.99771: stderr chunk (state=3): >>><<< 44071 1727204620.99815: stdout chunk (state=3): >>><<< 44071 1727204620.99911: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204620.9626274-45993-142521473184353=/root/.ansible/tmp/ansible-tmp-1727204620.9626274-45993-142521473184353 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204621.00095: variable 'ansible_module_compression' from source: unknown 44071 1727204621.00099: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44071 1727204621.00102: variable 'ansible_facts' from source: unknown 44071 1727204621.00361: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204620.9626274-45993-142521473184353/AnsiballZ_command.py 44071 1727204621.00582: Sending initial data 44071 1727204621.00592: Sent initial data (156 bytes) 44071 1727204621.01510: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204621.01664: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204621.01737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204621.01828: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204621.01871: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204621.01954: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204621.03722: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204621.04270: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204621.04276: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmprfu_wv2m /root/.ansible/tmp/ansible-tmp-1727204620.9626274-45993-142521473184353/AnsiballZ_command.py <<< 44071 1727204621.04279: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204620.9626274-45993-142521473184353/AnsiballZ_command.py" <<< 44071 1727204621.04282: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmprfu_wv2m" to remote "/root/.ansible/tmp/ansible-tmp-1727204620.9626274-45993-142521473184353/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204620.9626274-45993-142521473184353/AnsiballZ_command.py" <<< 44071 1727204621.06402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204621.06509: stderr chunk (state=3): >>><<< 44071 1727204621.06512: stdout chunk (state=3): >>><<< 44071 1727204621.06538: done transferring module to remote 44071 1727204621.06551: _low_level_execute_command(): starting 44071 1727204621.06556: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204620.9626274-45993-142521473184353/ /root/.ansible/tmp/ansible-tmp-1727204620.9626274-45993-142521473184353/AnsiballZ_command.py && sleep 0' 44071 1727204621.07949: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204621.07959: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204621.08139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204621.08421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204621.08426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204621.10449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204621.10454: stdout chunk (state=3): >>><<< 44071 1727204621.10463: stderr chunk (state=3): >>><<< 44071 1727204621.10563: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204621.10571: _low_level_execute_command(): starting 44071 1727204621.10575: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204620.9626274-45993-142521473184353/AnsiballZ_command.py && sleep 0' 44071 1727204621.11634: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204621.11642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204621.11669: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204621.11673: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204621.11750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204621.11795: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204621.11875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204621.30350: stdout chunk (state=3): >>> {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-24 15:03:41.283878", "end": "2024-09-24 15:03:41.302002", "delta": "0:00:00.018124", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44071 1727204621.31951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204621.32017: stderr chunk (state=3): >>><<< 44071 1727204621.32021: stdout chunk (state=3): >>><<< 44071 1727204621.32038: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-24 15:03:41.283878", "end": "2024-09-24 15:03:41.302002", "delta": "0:00:00.018124", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
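The record above closes the remote execution round-trip for the "Get NM profile info" task: Ansible created a remote temp directory, pushed AnsiballZ_command.py over the multiplexed SSH connection, ran it with /usr/bin/python3.12, and received the JSON payload shown in stdout. For orientation, a plausible reconstruction of the task driving this round-trip is sketched below; the real definition lives in get_profile_stat.yml and is not reproduced in this log, and the register name, ignore_errors, and changed_when lines are inferred from the conditionals and the final "ok"/"changed": false result later in the log, so treat them as assumptions rather than the authoritative source.

- name: Get NM profile info
  ansible.builtin.shell: "nmcli -f NAME,FILENAME connection show |grep {{ profile }} | grep /etc"
  register: nm_profile_exists   # inferred: the next task evaluates nm_profile_exists.rc == 0
  ignore_errors: true           # assumption: grep exits non-zero when no profile matches
  changed_when: false           # assumption: the raw module reports changed=true, but the task result is "ok" with changed=false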
44071 1727204621.32071: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204620.9626274-45993-142521473184353/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204621.32082: _low_level_execute_command(): starting 44071 1727204621.32087: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204620.9626274-45993-142521473184353/ > /dev/null 2>&1 && sleep 0' 44071 1727204621.32561: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204621.32567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204621.32597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204621.32601: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204621.32608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204621.32610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204621.32669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204621.32672: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204621.32679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204621.32750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204621.34664: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204621.34725: stderr chunk (state=3): >>><<< 44071 1727204621.34728: stdout chunk (state=3): >>><<< 44071 1727204621.34746: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204621.34752: handler run complete 44071 1727204621.34777: Evaluated conditional (False): False 44071 1727204621.34788: attempt loop complete, returning result 44071 1727204621.34791: _execute() done 44071 1727204621.34797: dumping result to json 44071 1727204621.34802: done dumping result, returning 44071 1727204621.34810: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [127b8e07-fff9-c964-7471-000000000949] 44071 1727204621.34815: sending task result for task 127b8e07-fff9-c964-7471-000000000949 44071 1727204621.34924: done sending task result for task 127b8e07-fff9-c964-7471-000000000949 44071 1727204621.34927: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.018124", "end": "2024-09-24 15:03:41.302002", "rc": 0, "start": "2024-09-24 15:03:41.283878" } STDOUT: statebr /etc/NetworkManager/system-connections/statebr.nmconnection 44071 1727204621.35007: no more pending results, returning what we have 44071 1727204621.35012: results queue empty 44071 1727204621.35012: checking for any_errors_fatal 44071 1727204621.35020: done checking for any_errors_fatal 44071 1727204621.35021: checking for max_fail_percentage 44071 1727204621.35023: done checking for max_fail_percentage 44071 1727204621.35024: checking to see if all hosts have failed and the running result is not ok 44071 1727204621.35025: done checking to see if all hosts have failed 44071 1727204621.35026: getting the remaining hosts for this loop 44071 1727204621.35027: done getting the remaining hosts for this loop 44071 1727204621.35032: getting the next task for host managed-node2 44071 1727204621.35043: done getting next task for host managed-node2 44071 1727204621.35046: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 44071 1727204621.35050: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204621.35054: getting variables 44071 1727204621.35055: in VariableManager get_vars() 44071 1727204621.35089: Calling all_inventory to load vars for managed-node2 44071 1727204621.35091: Calling groups_inventory to load vars for managed-node2 44071 1727204621.35095: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204621.35107: Calling all_plugins_play to load vars for managed-node2 44071 1727204621.35110: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204621.35113: Calling groups_plugins_play to load vars for managed-node2 44071 1727204621.36191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204621.37685: done with get_vars() 44071 1727204621.37723: done getting variables 44071 1727204621.37791: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 15:03:41 -0400 (0:00:00.523) 0:00:33.694 ***** 44071 1727204621.37826: entering _queue_task() for managed-node2/set_fact 44071 1727204621.38206: worker is 1 (out of 1 available) 44071 1727204621.38219: exiting _queue_task() for managed-node2/set_fact 44071 1727204621.38232: done queuing things up, now waiting for results queue to drain 44071 1727204621.38234: waiting for pending results... 
44071 1727204621.38690: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 44071 1727204621.38707: in run() - task 127b8e07-fff9-c964-7471-00000000094a 44071 1727204621.38730: variable 'ansible_search_path' from source: unknown 44071 1727204621.38743: variable 'ansible_search_path' from source: unknown 44071 1727204621.38792: calling self._execute() 44071 1727204621.38897: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204621.38909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204621.38922: variable 'omit' from source: magic vars 44071 1727204621.39263: variable 'ansible_distribution_major_version' from source: facts 44071 1727204621.39276: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204621.39381: variable 'nm_profile_exists' from source: set_fact 44071 1727204621.39392: Evaluated conditional (nm_profile_exists.rc == 0): True 44071 1727204621.39398: variable 'omit' from source: magic vars 44071 1727204621.39438: variable 'omit' from source: magic vars 44071 1727204621.39469: variable 'omit' from source: magic vars 44071 1727204621.39506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204621.39536: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204621.39558: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204621.39577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204621.39588: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204621.39614: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204621.39617: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204621.39620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204621.39705: Set connection var ansible_connection to ssh 44071 1727204621.39711: Set connection var ansible_timeout to 10 44071 1727204621.39717: Set connection var ansible_pipelining to False 44071 1727204621.39722: Set connection var ansible_shell_type to sh 44071 1727204621.39728: Set connection var ansible_shell_executable to /bin/sh 44071 1727204621.39734: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204621.39758: variable 'ansible_shell_executable' from source: unknown 44071 1727204621.39761: variable 'ansible_connection' from source: unknown 44071 1727204621.39763: variable 'ansible_module_compression' from source: unknown 44071 1727204621.39768: variable 'ansible_shell_type' from source: unknown 44071 1727204621.39770: variable 'ansible_shell_executable' from source: unknown 44071 1727204621.39772: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204621.39775: variable 'ansible_pipelining' from source: unknown 44071 1727204621.39779: variable 'ansible_timeout' from source: unknown 44071 1727204621.39782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204621.39902: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204621.39914: variable 'omit' from source: magic vars 44071 1727204621.39919: starting attempt loop 44071 1727204621.39922: running the handler 44071 1727204621.39933: handler run complete 44071 1727204621.39945: attempt loop complete, returning result 44071 1727204621.39948: _execute() done 44071 1727204621.39951: dumping result to json 44071 1727204621.39955: done dumping result, returning 44071 1727204621.39963: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [127b8e07-fff9-c964-7471-00000000094a] 44071 1727204621.39969: sending task result for task 127b8e07-fff9-c964-7471-00000000094a ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 44071 1727204621.40127: no more pending results, returning what we have 44071 1727204621.40131: results queue empty 44071 1727204621.40132: checking for any_errors_fatal 44071 1727204621.40141: done checking for any_errors_fatal 44071 1727204621.40142: checking for max_fail_percentage 44071 1727204621.40143: done checking for max_fail_percentage 44071 1727204621.40144: checking to see if all hosts have failed and the running result is not ok 44071 1727204621.40145: done checking to see if all hosts have failed 44071 1727204621.40146: getting the remaining hosts for this loop 44071 1727204621.40147: done getting the remaining hosts for this loop 44071 1727204621.40152: getting the next task for host managed-node2 44071 1727204621.40163: done getting next task for host managed-node2 44071 1727204621.40168: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 44071 1727204621.40173: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204621.40178: getting variables 44071 1727204621.40179: in VariableManager get_vars() 44071 1727204621.40214: Calling all_inventory to load vars for managed-node2 44071 1727204621.40217: Calling groups_inventory to load vars for managed-node2 44071 1727204621.40220: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204621.40232: Calling all_plugins_play to load vars for managed-node2 44071 1727204621.40235: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204621.40237: Calling groups_plugins_play to load vars for managed-node2 44071 1727204621.40923: done sending task result for task 127b8e07-fff9-c964-7471-00000000094a 44071 1727204621.40929: WORKER PROCESS EXITING 44071 1727204621.41450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204621.42641: done with get_vars() 44071 1727204621.42674: done getting variables 44071 1727204621.42725: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204621.42829: variable 'profile' from source: play vars 44071 1727204621.42833: variable 'interface' from source: play vars 44071 1727204621.42880: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 15:03:41 -0400 (0:00:00.050) 0:00:33.745 ***** 44071 1727204621.42908: entering _queue_task() for managed-node2/command 44071 1727204621.43248: worker is 1 (out of 1 available) 44071 1727204621.43269: exiting _queue_task() for managed-node2/command 44071 1727204621.43284: done queuing things up, now waiting for results queue to drain 44071 1727204621.43286: waiting for pending results... 
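The set_fact result a few records above (task path get_profile_stat.yml:35) sets three flags after both gate conditions evaluated True. A plausible sketch of that task is below; the fact names and the rc check come straight from the log, while the exact layout is an assumption. The ansible_distribution_major_version != '6' check is evaluated for every task in this stretch of the log, which suggests it is inherited from an enclosing block rather than declared on each task.

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0
  # assumption: the distribution-version conditional seen just before this task
  # is inherited from a surrounding block, not repeated here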
44071 1727204621.43508: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-statebr 44071 1727204621.43603: in run() - task 127b8e07-fff9-c964-7471-00000000094c 44071 1727204621.43618: variable 'ansible_search_path' from source: unknown 44071 1727204621.43624: variable 'ansible_search_path' from source: unknown 44071 1727204621.43658: calling self._execute() 44071 1727204621.43735: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204621.43741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204621.43752: variable 'omit' from source: magic vars 44071 1727204621.44308: variable 'ansible_distribution_major_version' from source: facts 44071 1727204621.44320: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204621.44415: variable 'profile_stat' from source: set_fact 44071 1727204621.44428: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204621.44432: when evaluation is False, skipping this task 44071 1727204621.44435: _execute() done 44071 1727204621.44438: dumping result to json 44071 1727204621.44443: done dumping result, returning 44071 1727204621.44449: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-statebr [127b8e07-fff9-c964-7471-00000000094c] 44071 1727204621.44454: sending task result for task 127b8e07-fff9-c964-7471-00000000094c 44071 1727204621.44562: done sending task result for task 127b8e07-fff9-c964-7471-00000000094c 44071 1727204621.44567: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204621.44630: no more pending results, returning what we have 44071 1727204621.44635: results queue empty 44071 1727204621.44636: checking for any_errors_fatal 44071 1727204621.44643: done checking for any_errors_fatal 44071 1727204621.44644: checking for max_fail_percentage 44071 1727204621.44645: done checking for max_fail_percentage 44071 1727204621.44647: checking to see if all hosts have failed and the running result is not ok 44071 1727204621.44648: done checking to see if all hosts have failed 44071 1727204621.44648: getting the remaining hosts for this loop 44071 1727204621.44650: done getting the remaining hosts for this loop 44071 1727204621.44655: getting the next task for host managed-node2 44071 1727204621.44665: done getting next task for host managed-node2 44071 1727204621.44669: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 44071 1727204621.44677: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204621.44682: getting variables 44071 1727204621.44686: in VariableManager get_vars() 44071 1727204621.44730: Calling all_inventory to load vars for managed-node2 44071 1727204621.44735: Calling groups_inventory to load vars for managed-node2 44071 1727204621.44742: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204621.44799: Calling all_plugins_play to load vars for managed-node2 44071 1727204621.44805: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204621.44809: Calling groups_plugins_play to load vars for managed-node2 44071 1727204621.46716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204621.57547: done with get_vars() 44071 1727204621.57805: done getting variables 44071 1727204621.58074: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204621.58199: variable 'profile' from source: play vars 44071 1727204621.58204: variable 'interface' from source: play vars 44071 1727204621.58276: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 15:03:41 -0400 (0:00:00.153) 0:00:33.899 ***** 44071 1727204621.58309: entering _queue_task() for managed-node2/set_fact 44071 1727204621.59125: worker is 1 (out of 1 available) 44071 1727204621.59142: exiting _queue_task() for managed-node2/set_fact 44071 1727204621.59156: done queuing things up, now waiting for results queue to drain 44071 1727204621.59158: waiting for pending results... 
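The "Get the ansible_managed comment in ifcfg-statebr" task above (get_profile_stat.yml:49) was skipped because profile_stat.stat.exists evaluated False, and the same gate skips the next three tasks in this log (verify ansible_managed comment, get fingerprint comment, verify fingerprint comment). profile_stat presumably comes from an earlier stat of the ifcfg file; on this host the profile is stored as an NM keyfile under /etc/NetworkManager/system-connections, so no ifcfg-statebr exists. A minimal sketch of the skip pattern follows; only the when: clause is taken from the log, and the command body and path are placeholders because a skipped task never shows its arguments.

- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  ansible.builtin.command: "cat /etc/sysconfig/network-scripts/ifcfg-{{ profile }}"  # placeholder body; not shown in the log
  when: profile_stat.stat.exists   # False here, so the task is skipped with "Conditional result was False"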
44071 1727204621.60187: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-statebr 44071 1727204621.60246: in run() - task 127b8e07-fff9-c964-7471-00000000094d 44071 1727204621.60251: variable 'ansible_search_path' from source: unknown 44071 1727204621.60277: variable 'ansible_search_path' from source: unknown 44071 1727204621.60472: calling self._execute() 44071 1727204621.60659: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204621.60663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204621.60669: variable 'omit' from source: magic vars 44071 1727204621.61693: variable 'ansible_distribution_major_version' from source: facts 44071 1727204621.61776: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204621.61876: variable 'profile_stat' from source: set_fact 44071 1727204621.62002: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204621.62007: when evaluation is False, skipping this task 44071 1727204621.62010: _execute() done 44071 1727204621.62013: dumping result to json 44071 1727204621.62016: done dumping result, returning 44071 1727204621.62024: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [127b8e07-fff9-c964-7471-00000000094d] 44071 1727204621.62027: sending task result for task 127b8e07-fff9-c964-7471-00000000094d 44071 1727204621.62149: done sending task result for task 127b8e07-fff9-c964-7471-00000000094d 44071 1727204621.62154: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204621.62211: no more pending results, returning what we have 44071 1727204621.62216: results queue empty 44071 1727204621.62217: checking for any_errors_fatal 44071 1727204621.62229: done checking for any_errors_fatal 44071 1727204621.62229: checking for max_fail_percentage 44071 1727204621.62231: done checking for max_fail_percentage 44071 1727204621.62232: checking to see if all hosts have failed and the running result is not ok 44071 1727204621.62233: done checking to see if all hosts have failed 44071 1727204621.62234: getting the remaining hosts for this loop 44071 1727204621.62236: done getting the remaining hosts for this loop 44071 1727204621.62244: getting the next task for host managed-node2 44071 1727204621.62254: done getting next task for host managed-node2 44071 1727204621.62257: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 44071 1727204621.62264: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204621.62271: getting variables 44071 1727204621.62273: in VariableManager get_vars() 44071 1727204621.62310: Calling all_inventory to load vars for managed-node2 44071 1727204621.62313: Calling groups_inventory to load vars for managed-node2 44071 1727204621.62317: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204621.62334: Calling all_plugins_play to load vars for managed-node2 44071 1727204621.62338: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204621.62344: Calling groups_plugins_play to load vars for managed-node2 44071 1727204621.66836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204621.71754: done with get_vars() 44071 1727204621.71800: done getting variables 44071 1727204621.72083: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204621.72222: variable 'profile' from source: play vars 44071 1727204621.72227: variable 'interface' from source: play vars 44071 1727204621.72504: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 15:03:41 -0400 (0:00:00.142) 0:00:34.041 ***** 44071 1727204621.72548: entering _queue_task() for managed-node2/command 44071 1727204621.73361: worker is 1 (out of 1 available) 44071 1727204621.73380: exiting _queue_task() for managed-node2/command 44071 1727204621.73394: done queuing things up, now waiting for results queue to drain 44071 1727204621.73396: waiting for pending results... 
44071 1727204621.74089: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-statebr 44071 1727204621.74372: in run() - task 127b8e07-fff9-c964-7471-00000000094e 44071 1727204621.74378: variable 'ansible_search_path' from source: unknown 44071 1727204621.74381: variable 'ansible_search_path' from source: unknown 44071 1727204621.74457: calling self._execute() 44071 1727204621.74683: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204621.74696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204621.74709: variable 'omit' from source: magic vars 44071 1727204621.75676: variable 'ansible_distribution_major_version' from source: facts 44071 1727204621.75701: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204621.75950: variable 'profile_stat' from source: set_fact 44071 1727204621.76015: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204621.76076: when evaluation is False, skipping this task 44071 1727204621.76088: _execute() done 44071 1727204621.76096: dumping result to json 44071 1727204621.76104: done dumping result, returning 44071 1727204621.76115: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-statebr [127b8e07-fff9-c964-7471-00000000094e] 44071 1727204621.76125: sending task result for task 127b8e07-fff9-c964-7471-00000000094e 44071 1727204621.76360: done sending task result for task 127b8e07-fff9-c964-7471-00000000094e 44071 1727204621.76366: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204621.76428: no more pending results, returning what we have 44071 1727204621.76433: results queue empty 44071 1727204621.76434: checking for any_errors_fatal 44071 1727204621.76442: done checking for any_errors_fatal 44071 1727204621.76443: checking for max_fail_percentage 44071 1727204621.76445: done checking for max_fail_percentage 44071 1727204621.76447: checking to see if all hosts have failed and the running result is not ok 44071 1727204621.76448: done checking to see if all hosts have failed 44071 1727204621.76449: getting the remaining hosts for this loop 44071 1727204621.76450: done getting the remaining hosts for this loop 44071 1727204621.76456: getting the next task for host managed-node2 44071 1727204621.76467: done getting next task for host managed-node2 44071 1727204621.76470: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 44071 1727204621.76476: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204621.76481: getting variables 44071 1727204621.76483: in VariableManager get_vars() 44071 1727204621.76520: Calling all_inventory to load vars for managed-node2 44071 1727204621.76524: Calling groups_inventory to load vars for managed-node2 44071 1727204621.76528: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204621.76546: Calling all_plugins_play to load vars for managed-node2 44071 1727204621.76549: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204621.76551: Calling groups_plugins_play to load vars for managed-node2 44071 1727204621.79762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204621.82861: done with get_vars() 44071 1727204621.82907: done getting variables 44071 1727204621.82977: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204621.83118: variable 'profile' from source: play vars 44071 1727204621.83123: variable 'interface' from source: play vars 44071 1727204621.83192: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 15:03:41 -0400 (0:00:00.106) 0:00:34.148 ***** 44071 1727204621.83233: entering _queue_task() for managed-node2/set_fact 44071 1727204621.83635: worker is 1 (out of 1 available) 44071 1727204621.83649: exiting _queue_task() for managed-node2/set_fact 44071 1727204621.83774: done queuing things up, now waiting for results queue to drain 44071 1727204621.83776: waiting for pending results... 
44071 1727204621.84089: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-statebr 44071 1727204621.84191: in run() - task 127b8e07-fff9-c964-7471-00000000094f 44071 1727204621.84217: variable 'ansible_search_path' from source: unknown 44071 1727204621.84225: variable 'ansible_search_path' from source: unknown 44071 1727204621.84275: calling self._execute() 44071 1727204621.84381: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204621.84397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204621.84472: variable 'omit' from source: magic vars 44071 1727204621.84844: variable 'ansible_distribution_major_version' from source: facts 44071 1727204621.84864: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204621.85011: variable 'profile_stat' from source: set_fact 44071 1727204621.85034: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204621.85042: when evaluation is False, skipping this task 44071 1727204621.85049: _execute() done 44071 1727204621.85057: dumping result to json 44071 1727204621.85130: done dumping result, returning 44071 1727204621.85134: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-statebr [127b8e07-fff9-c964-7471-00000000094f] 44071 1727204621.85136: sending task result for task 127b8e07-fff9-c964-7471-00000000094f 44071 1727204621.85328: done sending task result for task 127b8e07-fff9-c964-7471-00000000094f 44071 1727204621.85332: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204621.85503: no more pending results, returning what we have 44071 1727204621.85508: results queue empty 44071 1727204621.85510: checking for any_errors_fatal 44071 1727204621.85517: done checking for any_errors_fatal 44071 1727204621.85518: checking for max_fail_percentage 44071 1727204621.85520: done checking for max_fail_percentage 44071 1727204621.85521: checking to see if all hosts have failed and the running result is not ok 44071 1727204621.85522: done checking to see if all hosts have failed 44071 1727204621.85523: getting the remaining hosts for this loop 44071 1727204621.85524: done getting the remaining hosts for this loop 44071 1727204621.85530: getting the next task for host managed-node2 44071 1727204621.85542: done getting next task for host managed-node2 44071 1727204621.85548: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 44071 1727204621.85553: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204621.85558: getting variables 44071 1727204621.85560: in VariableManager get_vars() 44071 1727204621.85797: Calling all_inventory to load vars for managed-node2 44071 1727204621.85800: Calling groups_inventory to load vars for managed-node2 44071 1727204621.85804: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204621.85815: Calling all_plugins_play to load vars for managed-node2 44071 1727204621.85818: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204621.85821: Calling groups_plugins_play to load vars for managed-node2 44071 1727204621.87879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204621.90085: done with get_vars() 44071 1727204621.90205: done getting variables 44071 1727204621.90333: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204621.90721: variable 'profile' from source: play vars 44071 1727204621.90726: variable 'interface' from source: play vars 44071 1727204621.90899: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'statebr'] ************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 15:03:41 -0400 (0:00:00.077) 0:00:34.225 ***** 44071 1727204621.90936: entering _queue_task() for managed-node2/assert 44071 1727204621.91825: worker is 1 (out of 1 available) 44071 1727204621.91841: exiting _queue_task() for managed-node2/assert 44071 1727204621.91854: done queuing things up, now waiting for results queue to drain 44071 1727204621.91856: waiting for pending results... 
44071 1727204621.92418: running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'statebr' 44071 1727204621.92551: in run() - task 127b8e07-fff9-c964-7471-0000000008ae 44071 1727204621.92672: variable 'ansible_search_path' from source: unknown 44071 1727204621.92676: variable 'ansible_search_path' from source: unknown 44071 1727204621.92679: calling self._execute() 44071 1727204621.93013: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204621.93018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204621.93028: variable 'omit' from source: magic vars 44071 1727204621.94035: variable 'ansible_distribution_major_version' from source: facts 44071 1727204621.94050: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204621.94058: variable 'omit' from source: magic vars 44071 1727204621.94237: variable 'omit' from source: magic vars 44071 1727204621.94511: variable 'profile' from source: play vars 44071 1727204621.94516: variable 'interface' from source: play vars 44071 1727204621.94752: variable 'interface' from source: play vars 44071 1727204621.94757: variable 'omit' from source: magic vars 44071 1727204621.94836: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204621.94921: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204621.94970: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204621.94977: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204621.94980: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204621.95013: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204621.95016: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204621.95019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204621.95191: Set connection var ansible_connection to ssh 44071 1727204621.95195: Set connection var ansible_timeout to 10 44071 1727204621.95197: Set connection var ansible_pipelining to False 44071 1727204621.95199: Set connection var ansible_shell_type to sh 44071 1727204621.95201: Set connection var ansible_shell_executable to /bin/sh 44071 1727204621.95204: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204621.95221: variable 'ansible_shell_executable' from source: unknown 44071 1727204621.95225: variable 'ansible_connection' from source: unknown 44071 1727204621.95227: variable 'ansible_module_compression' from source: unknown 44071 1727204621.95229: variable 'ansible_shell_type' from source: unknown 44071 1727204621.95232: variable 'ansible_shell_executable' from source: unknown 44071 1727204621.95235: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204621.95271: variable 'ansible_pipelining' from source: unknown 44071 1727204621.95275: variable 'ansible_timeout' from source: unknown 44071 1727204621.95277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204621.95413: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204621.95430: variable 'omit' from source: magic vars 44071 1727204621.95436: starting attempt loop 44071 1727204621.95439: running the handler 44071 1727204621.95627: variable 'lsr_net_profile_exists' from source: set_fact 44071 1727204621.95630: Evaluated conditional (lsr_net_profile_exists): True 44071 1727204621.95633: handler run complete 44071 1727204621.95635: attempt loop complete, returning result 44071 1727204621.95637: _execute() done 44071 1727204621.95640: dumping result to json 44071 1727204621.95643: done dumping result, returning 44071 1727204621.95645: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'statebr' [127b8e07-fff9-c964-7471-0000000008ae] 44071 1727204621.95647: sending task result for task 127b8e07-fff9-c964-7471-0000000008ae 44071 1727204621.95739: done sending task result for task 127b8e07-fff9-c964-7471-0000000008ae 44071 1727204621.95742: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 44071 1727204621.96009: no more pending results, returning what we have 44071 1727204621.96012: results queue empty 44071 1727204621.96013: checking for any_errors_fatal 44071 1727204621.96019: done checking for any_errors_fatal 44071 1727204621.96020: checking for max_fail_percentage 44071 1727204621.96021: done checking for max_fail_percentage 44071 1727204621.96024: checking to see if all hosts have failed and the running result is not ok 44071 1727204621.96024: done checking to see if all hosts have failed 44071 1727204621.96025: getting the remaining hosts for this loop 44071 1727204621.96026: done getting the remaining hosts for this loop 44071 1727204621.96030: getting the next task for host managed-node2 44071 1727204621.96037: done getting next task for host managed-node2 44071 1727204621.96040: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 44071 1727204621.96043: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204621.96047: getting variables 44071 1727204621.96048: in VariableManager get_vars() 44071 1727204621.96087: Calling all_inventory to load vars for managed-node2 44071 1727204621.96089: Calling groups_inventory to load vars for managed-node2 44071 1727204621.96093: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204621.96104: Calling all_plugins_play to load vars for managed-node2 44071 1727204621.96106: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204621.96109: Calling groups_plugins_play to load vars for managed-node2 44071 1727204621.98468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204622.00527: done with get_vars() 44071 1727204622.00556: done getting variables 44071 1727204622.00611: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204622.00711: variable 'profile' from source: play vars 44071 1727204622.00715: variable 'interface' from source: play vars 44071 1727204622.00759: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'statebr'] ********* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 15:03:42 -0400 (0:00:00.098) 0:00:34.324 ***** 44071 1727204622.00793: entering _queue_task() for managed-node2/assert 44071 1727204622.01086: worker is 1 (out of 1 available) 44071 1727204622.01103: exiting _queue_task() for managed-node2/assert 44071 1727204622.01116: done queuing things up, now waiting for results queue to drain 44071 1727204622.01118: waiting for pending results... 
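The assert that just completed never opens the SSH connection it prepared: the assert action plugin evaluates the conditional lsr_net_profile_exists, a fact recorded earlier via set_fact, on the controller and returns "All assertions passed" with no remote command in the log. A minimal sketch of what that task in assert_profile_present.yml could look like, reconstructed only from the names in the log rather than taken from the collection:

    # Hypothetical reconstruction; the real task in
    # tests/network/playbooks/tasks/assert_profile_present.yml may differ.
    - name: "Assert that the profile is present - '{{ profile }}'"
      ansible.builtin.assert:
        that:
          - lsr_net_profile_exists   # fact assumed to be set by an earlier profile-inspection task
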
44071 1727204622.01341: running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'statebr' 44071 1727204622.01487: in run() - task 127b8e07-fff9-c964-7471-0000000008af 44071 1727204622.01509: variable 'ansible_search_path' from source: unknown 44071 1727204622.01515: variable 'ansible_search_path' from source: unknown 44071 1727204622.01566: calling self._execute() 44071 1727204622.01693: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204622.01710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204622.01724: variable 'omit' from source: magic vars 44071 1727204622.02374: variable 'ansible_distribution_major_version' from source: facts 44071 1727204622.02381: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204622.02385: variable 'omit' from source: magic vars 44071 1727204622.02439: variable 'omit' from source: magic vars 44071 1727204622.02571: variable 'profile' from source: play vars 44071 1727204622.02575: variable 'interface' from source: play vars 44071 1727204622.02649: variable 'interface' from source: play vars 44071 1727204622.02674: variable 'omit' from source: magic vars 44071 1727204622.02832: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204622.03048: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204622.03052: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204622.03056: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204622.03213: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204622.03258: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204622.03284: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204622.03356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204622.03407: Set connection var ansible_connection to ssh 44071 1727204622.03432: Set connection var ansible_timeout to 10 44071 1727204622.03436: Set connection var ansible_pipelining to False 44071 1727204622.03445: Set connection var ansible_shell_type to sh 44071 1727204622.03447: Set connection var ansible_shell_executable to /bin/sh 44071 1727204622.03450: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204622.03515: variable 'ansible_shell_executable' from source: unknown 44071 1727204622.03518: variable 'ansible_connection' from source: unknown 44071 1727204622.03521: variable 'ansible_module_compression' from source: unknown 44071 1727204622.03524: variable 'ansible_shell_type' from source: unknown 44071 1727204622.03526: variable 'ansible_shell_executable' from source: unknown 44071 1727204622.03533: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204622.03535: variable 'ansible_pipelining' from source: unknown 44071 1727204622.03538: variable 'ansible_timeout' from source: unknown 44071 1727204622.03543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204622.03694: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204622.03764: variable 'omit' from source: magic vars 44071 1727204622.03771: starting attempt loop 44071 1727204622.03774: running the handler 44071 1727204622.03880: variable 'lsr_net_profile_ansible_managed' from source: set_fact 44071 1727204622.03886: Evaluated conditional (lsr_net_profile_ansible_managed): True 44071 1727204622.03898: handler run complete 44071 1727204622.03934: attempt loop complete, returning result 44071 1727204622.03953: _execute() done 44071 1727204622.03957: dumping result to json 44071 1727204622.03960: done dumping result, returning 44071 1727204622.03971: done running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'statebr' [127b8e07-fff9-c964-7471-0000000008af] 44071 1727204622.03974: sending task result for task 127b8e07-fff9-c964-7471-0000000008af ok: [managed-node2] => { "changed": false } MSG: All assertions passed 44071 1727204622.04183: no more pending results, returning what we have 44071 1727204622.04186: results queue empty 44071 1727204622.04187: checking for any_errors_fatal 44071 1727204622.04193: done checking for any_errors_fatal 44071 1727204622.04199: checking for max_fail_percentage 44071 1727204622.04201: done checking for max_fail_percentage 44071 1727204622.04202: checking to see if all hosts have failed and the running result is not ok 44071 1727204622.04203: done checking to see if all hosts have failed 44071 1727204622.04204: getting the remaining hosts for this loop 44071 1727204622.04205: done getting the remaining hosts for this loop 44071 1727204622.04214: getting the next task for host managed-node2 44071 1727204622.04222: done getting next task for host managed-node2 44071 1727204622.04225: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 44071 1727204622.04230: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204622.04236: getting variables 44071 1727204622.04237: in VariableManager get_vars() 44071 1727204622.04281: Calling all_inventory to load vars for managed-node2 44071 1727204622.04285: Calling groups_inventory to load vars for managed-node2 44071 1727204622.04289: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204622.04301: Calling all_plugins_play to load vars for managed-node2 44071 1727204622.04303: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204622.04306: Calling groups_plugins_play to load vars for managed-node2 44071 1727204622.04874: done sending task result for task 127b8e07-fff9-c964-7471-0000000008af 44071 1727204622.04878: WORKER PROCESS EXITING 44071 1727204622.06680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204622.08259: done with get_vars() 44071 1727204622.08289: done getting variables 44071 1727204622.08339: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204622.08435: variable 'profile' from source: play vars 44071 1727204622.08439: variable 'interface' from source: play vars 44071 1727204622.08490: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in statebr] *************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 15:03:42 -0400 (0:00:00.077) 0:00:34.401 ***** 44071 1727204622.08518: entering _queue_task() for managed-node2/assert 44071 1727204622.08830: worker is 1 (out of 1 available) 44071 1727204622.08847: exiting _queue_task() for managed-node2/assert 44071 1727204622.08861: done queuing things up, now waiting for results queue to drain 44071 1727204622.08863: waiting for pending results... 
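The two comment checks, at assert_profile_present.yml:10 (just completed above) and :15 (queued here), follow the same controller-only pattern, each gating on a fact set earlier in the run. A sketch under the same assumption, using only the conditional names that appear in the log:

    # Hypothetical reconstruction of the remaining asserts in assert_profile_present.yml.
    - name: "Assert that the ansible managed comment is present in '{{ profile }}'"
      ansible.builtin.assert:
        that:
          - lsr_net_profile_ansible_managed   # set_fact from the profile's on-disk contents (assumed)

    - name: "Assert that the fingerprint comment is present in {{ profile }}"
      ansible.builtin.assert:
        that:
          - lsr_net_profile_fingerprint
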
44071 1727204622.09067: running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in statebr 44071 1727204622.09177: in run() - task 127b8e07-fff9-c964-7471-0000000008b0 44071 1727204622.09195: variable 'ansible_search_path' from source: unknown 44071 1727204622.09198: variable 'ansible_search_path' from source: unknown 44071 1727204622.09247: calling self._execute() 44071 1727204622.09344: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204622.09349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204622.09358: variable 'omit' from source: magic vars 44071 1727204622.09794: variable 'ansible_distribution_major_version' from source: facts 44071 1727204622.09798: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204622.09801: variable 'omit' from source: magic vars 44071 1727204622.09842: variable 'omit' from source: magic vars 44071 1727204622.10125: variable 'profile' from source: play vars 44071 1727204622.10129: variable 'interface' from source: play vars 44071 1727204622.10161: variable 'interface' from source: play vars 44071 1727204622.10192: variable 'omit' from source: magic vars 44071 1727204622.10249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204622.10462: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204622.10584: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204622.10610: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204622.10627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204622.10669: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204622.10771: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204622.10775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204622.10992: Set connection var ansible_connection to ssh 44071 1727204622.11022: Set connection var ansible_timeout to 10 44071 1727204622.11043: Set connection var ansible_pipelining to False 44071 1727204622.11055: Set connection var ansible_shell_type to sh 44071 1727204622.11102: Set connection var ansible_shell_executable to /bin/sh 44071 1727204622.11113: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204622.11195: variable 'ansible_shell_executable' from source: unknown 44071 1727204622.11199: variable 'ansible_connection' from source: unknown 44071 1727204622.11201: variable 'ansible_module_compression' from source: unknown 44071 1727204622.11203: variable 'ansible_shell_type' from source: unknown 44071 1727204622.11206: variable 'ansible_shell_executable' from source: unknown 44071 1727204622.11208: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204622.11210: variable 'ansible_pipelining' from source: unknown 44071 1727204622.11212: variable 'ansible_timeout' from source: unknown 44071 1727204622.11217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204622.11678: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204622.11683: variable 'omit' from source: magic vars 44071 1727204622.11685: starting attempt loop 44071 1727204622.11689: running the handler 44071 1727204622.12055: variable 'lsr_net_profile_fingerprint' from source: set_fact 44071 1727204622.12061: Evaluated conditional (lsr_net_profile_fingerprint): True 44071 1727204622.12176: handler run complete 44071 1727204622.12224: attempt loop complete, returning result 44071 1727204622.12228: _execute() done 44071 1727204622.12230: dumping result to json 44071 1727204622.12233: done dumping result, returning 44071 1727204622.12235: done running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in statebr [127b8e07-fff9-c964-7471-0000000008b0] 44071 1727204622.12238: sending task result for task 127b8e07-fff9-c964-7471-0000000008b0 44071 1727204622.12610: done sending task result for task 127b8e07-fff9-c964-7471-0000000008b0 44071 1727204622.12614: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 44071 1727204622.12697: no more pending results, returning what we have 44071 1727204622.12702: results queue empty 44071 1727204622.12705: checking for any_errors_fatal 44071 1727204622.12714: done checking for any_errors_fatal 44071 1727204622.12716: checking for max_fail_percentage 44071 1727204622.12718: done checking for max_fail_percentage 44071 1727204622.12719: checking to see if all hosts have failed and the running result is not ok 44071 1727204622.12720: done checking to see if all hosts have failed 44071 1727204622.12720: getting the remaining hosts for this loop 44071 1727204622.12723: done getting the remaining hosts for this loop 44071 1727204622.12729: getting the next task for host managed-node2 44071 1727204622.12747: done getting next task for host managed-node2 44071 1727204622.12754: ^ task is: TASK: Conditional asserts 44071 1727204622.12758: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204622.12764: getting variables 44071 1727204622.12768: in VariableManager get_vars() 44071 1727204622.12809: Calling all_inventory to load vars for managed-node2 44071 1727204622.12813: Calling groups_inventory to load vars for managed-node2 44071 1727204622.12817: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204622.12835: Calling all_plugins_play to load vars for managed-node2 44071 1727204622.12839: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204622.12842: Calling groups_plugins_play to load vars for managed-node2 44071 1727204622.15780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204622.19099: done with get_vars() 44071 1727204622.19134: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Tuesday 24 September 2024 15:03:42 -0400 (0:00:00.107) 0:00:34.508 ***** 44071 1727204622.19252: entering _queue_task() for managed-node2/include_tasks 44071 1727204622.19627: worker is 1 (out of 1 available) 44071 1727204622.19643: exiting _queue_task() for managed-node2/include_tasks 44071 1727204622.19658: done queuing things up, now waiting for results queue to drain 44071 1727204622.19660: waiting for pending results... 44071 1727204622.19920: running TaskExecutor() for managed-node2/TASK: Conditional asserts 44071 1727204622.20020: in run() - task 127b8e07-fff9-c964-7471-0000000005ba 44071 1727204622.20031: variable 'ansible_search_path' from source: unknown 44071 1727204622.20036: variable 'ansible_search_path' from source: unknown 44071 1727204622.20384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204622.22636: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204622.22700: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204622.22735: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204622.22767: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204622.22789: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204622.22892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204622.22915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204622.22935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204622.22969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204622.22981: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204622.23110: dumping result to json 44071 1727204622.23113: done dumping result, returning 44071 1727204622.23119: done running TaskExecutor() for managed-node2/TASK: Conditional asserts [127b8e07-fff9-c964-7471-0000000005ba] 44071 1727204622.23124: sending task result for task 127b8e07-fff9-c964-7471-0000000005ba 44071 1727204622.23238: done sending task result for task 127b8e07-fff9-c964-7471-0000000005ba 44071 1727204622.23242: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } 44071 1727204622.23308: no more pending results, returning what we have 44071 1727204622.23315: results queue empty 44071 1727204622.23317: checking for any_errors_fatal 44071 1727204622.23323: done checking for any_errors_fatal 44071 1727204622.23323: checking for max_fail_percentage 44071 1727204622.23325: done checking for max_fail_percentage 44071 1727204622.23326: checking to see if all hosts have failed and the running result is not ok 44071 1727204622.23327: done checking to see if all hosts have failed 44071 1727204622.23327: getting the remaining hosts for this loop 44071 1727204622.23329: done getting the remaining hosts for this loop 44071 1727204622.23335: getting the next task for host managed-node2 44071 1727204622.23342: done getting next task for host managed-node2 44071 1727204622.23345: ^ task is: TASK: Success in test '{{ lsr_description }}' 44071 1727204622.23348: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204622.23351: getting variables 44071 1727204622.23393: in VariableManager get_vars() 44071 1727204622.23425: Calling all_inventory to load vars for managed-node2 44071 1727204622.23428: Calling groups_inventory to load vars for managed-node2 44071 1727204622.23544: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204622.23557: Calling all_plugins_play to load vars for managed-node2 44071 1727204622.23561: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204622.23568: Calling groups_plugins_play to load vars for managed-node2 44071 1727204622.24907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204622.26470: done with get_vars() 44071 1727204622.26499: done getting variables 44071 1727204622.26548: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204622.26657: variable 'lsr_description' from source: include params TASK [Success in test 'I can create a profile without autoconnect'] ************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Tuesday 24 September 2024 15:03:42 -0400 (0:00:00.074) 0:00:34.583 ***** 44071 1727204622.26686: entering _queue_task() for managed-node2/debug 44071 1727204622.27012: worker is 1 (out of 1 available) 44071 1727204622.27029: exiting _queue_task() for managed-node2/debug 44071 1727204622.27042: done queuing things up, now waiting for results queue to drain 44071 1727204622.27044: waiting for pending results... 
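The 'Conditional asserts' include was skipped because its item list rendered empty, so the run moves straight to the success banner queued from run_test.yml:47. Judging from the MSG emitted a little further down, that task is a plain debug over the include parameter lsr_description; a minimal sketch, assuming only what the log shows:

    # Hypothetical reconstruction of the task at run_test.yml:47.
    - name: "Success in test '{{ lsr_description }}'"
      ansible.builtin.debug:
        msg: "+++++ Success in test '{{ lsr_description }}' +++++"
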
44071 1727204622.27323: running TaskExecutor() for managed-node2/TASK: Success in test 'I can create a profile without autoconnect' 44071 1727204622.27402: in run() - task 127b8e07-fff9-c964-7471-0000000005bb 44071 1727204622.27417: variable 'ansible_search_path' from source: unknown 44071 1727204622.27421: variable 'ansible_search_path' from source: unknown 44071 1727204622.27454: calling self._execute() 44071 1727204622.27547: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204622.27551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204622.27561: variable 'omit' from source: magic vars 44071 1727204622.27924: variable 'ansible_distribution_major_version' from source: facts 44071 1727204622.27937: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204622.27947: variable 'omit' from source: magic vars 44071 1727204622.27983: variable 'omit' from source: magic vars 44071 1727204622.28063: variable 'lsr_description' from source: include params 44071 1727204622.28083: variable 'omit' from source: magic vars 44071 1727204622.28119: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204622.28149: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204622.28168: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204622.28188: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204622.28198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204622.28223: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204622.28226: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204622.28229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204622.28311: Set connection var ansible_connection to ssh 44071 1727204622.28317: Set connection var ansible_timeout to 10 44071 1727204622.28322: Set connection var ansible_pipelining to False 44071 1727204622.28328: Set connection var ansible_shell_type to sh 44071 1727204622.28333: Set connection var ansible_shell_executable to /bin/sh 44071 1727204622.28342: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204622.28360: variable 'ansible_shell_executable' from source: unknown 44071 1727204622.28364: variable 'ansible_connection' from source: unknown 44071 1727204622.28368: variable 'ansible_module_compression' from source: unknown 44071 1727204622.28370: variable 'ansible_shell_type' from source: unknown 44071 1727204622.28373: variable 'ansible_shell_executable' from source: unknown 44071 1727204622.28375: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204622.28379: variable 'ansible_pipelining' from source: unknown 44071 1727204622.28385: variable 'ansible_timeout' from source: unknown 44071 1727204622.28387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204622.28529: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 44071 1727204622.28533: variable 'omit' from source: magic vars 44071 1727204622.28536: starting attempt loop 44071 1727204622.28538: running the handler 44071 1727204622.28576: handler run complete 44071 1727204622.28589: attempt loop complete, returning result 44071 1727204622.28592: _execute() done 44071 1727204622.28595: dumping result to json 44071 1727204622.28597: done dumping result, returning 44071 1727204622.28606: done running TaskExecutor() for managed-node2/TASK: Success in test 'I can create a profile without autoconnect' [127b8e07-fff9-c964-7471-0000000005bb] 44071 1727204622.28611: sending task result for task 127b8e07-fff9-c964-7471-0000000005bb 44071 1727204622.28710: done sending task result for task 127b8e07-fff9-c964-7471-0000000005bb 44071 1727204622.28713: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: +++++ Success in test 'I can create a profile without autoconnect' +++++ 44071 1727204622.28786: no more pending results, returning what we have 44071 1727204622.28790: results queue empty 44071 1727204622.28790: checking for any_errors_fatal 44071 1727204622.28799: done checking for any_errors_fatal 44071 1727204622.28800: checking for max_fail_percentage 44071 1727204622.28801: done checking for max_fail_percentage 44071 1727204622.28802: checking to see if all hosts have failed and the running result is not ok 44071 1727204622.28803: done checking to see if all hosts have failed 44071 1727204622.28804: getting the remaining hosts for this loop 44071 1727204622.28805: done getting the remaining hosts for this loop 44071 1727204622.28811: getting the next task for host managed-node2 44071 1727204622.28819: done getting next task for host managed-node2 44071 1727204622.28825: ^ task is: TASK: Cleanup 44071 1727204622.28828: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204622.28833: getting variables 44071 1727204622.28834: in VariableManager get_vars() 44071 1727204622.28871: Calling all_inventory to load vars for managed-node2 44071 1727204622.28873: Calling groups_inventory to load vars for managed-node2 44071 1727204622.28877: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204622.28889: Calling all_plugins_play to load vars for managed-node2 44071 1727204622.28891: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204622.28894: Calling groups_plugins_play to load vars for managed-node2 44071 1727204622.30332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204622.31813: done with get_vars() 44071 1727204622.31841: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Tuesday 24 September 2024 15:03:42 -0400 (0:00:00.052) 0:00:34.635 ***** 44071 1727204622.31923: entering _queue_task() for managed-node2/include_tasks 44071 1727204622.32258: worker is 1 (out of 1 available) 44071 1727204622.32277: exiting _queue_task() for managed-node2/include_tasks 44071 1727204622.32291: done queuing things up, now waiting for results queue to drain 44071 1727204622.32293: waiting for pending results... 44071 1727204622.32568: running TaskExecutor() for managed-node2/TASK: Cleanup 44071 1727204622.32705: in run() - task 127b8e07-fff9-c964-7471-0000000005bf 44071 1727204622.32723: variable 'ansible_search_path' from source: unknown 44071 1727204622.32727: variable 'ansible_search_path' from source: unknown 44071 1727204622.32796: variable 'lsr_cleanup' from source: include params 44071 1727204622.32990: variable 'lsr_cleanup' from source: include params 44071 1727204622.33088: variable 'omit' from source: magic vars 44071 1727204622.33217: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204622.33222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204622.33234: variable 'omit' from source: magic vars 44071 1727204622.33441: variable 'ansible_distribution_major_version' from source: facts 44071 1727204622.33445: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204622.33454: variable 'item' from source: unknown 44071 1727204622.33515: variable 'item' from source: unknown 44071 1727204622.33544: variable 'item' from source: unknown 44071 1727204622.33591: variable 'item' from source: unknown 44071 1727204622.33768: dumping result to json 44071 1727204622.33771: done dumping result, returning 44071 1727204622.33774: done running TaskExecutor() for managed-node2/TASK: Cleanup [127b8e07-fff9-c964-7471-0000000005bf] 44071 1727204622.33776: sending task result for task 127b8e07-fff9-c964-7471-0000000005bf 44071 1727204622.33823: done sending task result for task 127b8e07-fff9-c964-7471-0000000005bf 44071 1727204622.33826: WORKER PROCESS EXITING 44071 1727204622.33884: no more pending results, returning what we have 44071 1727204622.33889: in VariableManager get_vars() 44071 1727204622.33922: Calling all_inventory to load vars for managed-node2 44071 1727204622.33925: Calling groups_inventory to load vars for managed-node2 44071 1727204622.33927: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204622.33939: Calling all_plugins_play to load vars for 
managed-node2 44071 1727204622.33944: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204622.33947: Calling groups_plugins_play to load vars for managed-node2 44071 1727204622.35176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204622.36477: done with get_vars() 44071 1727204622.36502: variable 'ansible_search_path' from source: unknown 44071 1727204622.36503: variable 'ansible_search_path' from source: unknown 44071 1727204622.36547: we have included files to process 44071 1727204622.36548: generating all_blocks data 44071 1727204622.36550: done generating all_blocks data 44071 1727204622.36558: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 44071 1727204622.36559: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 44071 1727204622.36564: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 44071 1727204622.36814: done processing included file 44071 1727204622.36816: iterating over new_blocks loaded from include file 44071 1727204622.36818: in VariableManager get_vars() 44071 1727204622.36839: done with get_vars() 44071 1727204622.36841: filtering new block on tags 44071 1727204622.36875: done filtering new block on tags 44071 1727204622.36877: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed-node2 => (item=tasks/cleanup_profile+device.yml) 44071 1727204622.36881: extending task lists for all hosts with included blocks 44071 1727204622.38239: done extending task lists 44071 1727204622.38243: done processing included files 44071 1727204622.38244: results queue empty 44071 1727204622.38245: checking for any_errors_fatal 44071 1727204622.38251: done checking for any_errors_fatal 44071 1727204622.38251: checking for max_fail_percentage 44071 1727204622.38254: done checking for max_fail_percentage 44071 1727204622.38256: checking to see if all hosts have failed and the running result is not ok 44071 1727204622.38257: done checking to see if all hosts have failed 44071 1727204622.38257: getting the remaining hosts for this loop 44071 1727204622.38259: done getting the remaining hosts for this loop 44071 1727204622.38262: getting the next task for host managed-node2 44071 1727204622.38272: done getting next task for host managed-node2 44071 1727204622.38275: ^ task is: TASK: Cleanup profile and device 44071 1727204622.38278: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204622.38282: getting variables 44071 1727204622.38283: in VariableManager get_vars() 44071 1727204622.38298: Calling all_inventory to load vars for managed-node2 44071 1727204622.38303: Calling groups_inventory to load vars for managed-node2 44071 1727204622.38306: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204622.38315: Calling all_plugins_play to load vars for managed-node2 44071 1727204622.38317: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204622.38321: Calling groups_plugins_play to load vars for managed-node2 44071 1727204622.39634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204622.41785: done with get_vars() 44071 1727204622.41812: done getting variables 44071 1727204622.41856: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Tuesday 24 September 2024 15:03:42 -0400 (0:00:00.099) 0:00:34.735 ***** 44071 1727204622.41884: entering _queue_task() for managed-node2/shell 44071 1727204622.42181: worker is 1 (out of 1 available) 44071 1727204622.42197: exiting _queue_task() for managed-node2/shell 44071 1727204622.42211: done queuing things up, now waiting for results queue to drain 44071 1727204622.42213: waiting for pending results... 
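The always-path cleanup has included tasks/cleanup_profile+device.yml and queued its shell task. The rendered command string is visible verbatim in the module result further down, so the task body can be sketched with reasonable confidence; only the {{ interface }} templating is an assumption, since the log shows the already-rendered 'statebr' values:

    # Sketch of the shell task in cleanup_profile+device.yml, based on the
    # "cmd" field reported by the module below; the templating is assumed.
    - name: Cleanup profile and device
      ansible.builtin.shell: |
        nmcli con delete {{ interface }}
        nmcli con load /etc/sysconfig/network-scripts/ifcfg-{{ interface }}
        rm -f /etc/sysconfig/network-scripts/ifcfg-{{ interface }}
        ip link del {{ interface }}
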
44071 1727204622.42413: running TaskExecutor() for managed-node2/TASK: Cleanup profile and device 44071 1727204622.42502: in run() - task 127b8e07-fff9-c964-7471-0000000009a0 44071 1727204622.42516: variable 'ansible_search_path' from source: unknown 44071 1727204622.42519: variable 'ansible_search_path' from source: unknown 44071 1727204622.42557: calling self._execute() 44071 1727204622.42638: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204622.42644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204622.42654: variable 'omit' from source: magic vars 44071 1727204622.42967: variable 'ansible_distribution_major_version' from source: facts 44071 1727204622.42980: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204622.42987: variable 'omit' from source: magic vars 44071 1727204622.43062: variable 'omit' from source: magic vars 44071 1727204622.43278: variable 'interface' from source: play vars 44071 1727204622.43282: variable 'omit' from source: magic vars 44071 1727204622.43293: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204622.43336: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204622.43370: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204622.43405: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204622.43422: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204622.43461: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204622.43473: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204622.43481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204622.43617: Set connection var ansible_connection to ssh 44071 1727204622.43631: Set connection var ansible_timeout to 10 44071 1727204622.43645: Set connection var ansible_pipelining to False 44071 1727204622.43656: Set connection var ansible_shell_type to sh 44071 1727204622.43670: Set connection var ansible_shell_executable to /bin/sh 44071 1727204622.43683: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204622.43723: variable 'ansible_shell_executable' from source: unknown 44071 1727204622.43732: variable 'ansible_connection' from source: unknown 44071 1727204622.43738: variable 'ansible_module_compression' from source: unknown 44071 1727204622.43748: variable 'ansible_shell_type' from source: unknown 44071 1727204622.43756: variable 'ansible_shell_executable' from source: unknown 44071 1727204622.43763: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204622.43775: variable 'ansible_pipelining' from source: unknown 44071 1727204622.43782: variable 'ansible_timeout' from source: unknown 44071 1727204622.43789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204622.44148: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 
1727204622.44199: variable 'omit' from source: magic vars 44071 1727204622.44261: starting attempt loop 44071 1727204622.44264: running the handler 44071 1727204622.44278: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204622.44371: _low_level_execute_command(): starting 44071 1727204622.44375: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204622.45199: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204622.45229: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204622.45264: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204622.45292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204622.45411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204622.47220: stdout chunk (state=3): >>>/root <<< 44071 1727204622.47416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204622.47496: stderr chunk (state=3): >>><<< 44071 1727204622.47500: stdout chunk (state=3): >>><<< 44071 1727204622.47638: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 
0 44071 1727204622.47644: _low_level_execute_command(): starting 44071 1727204622.47647: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204622.4753268-46165-92168263791337 `" && echo ansible-tmp-1727204622.4753268-46165-92168263791337="` echo /root/.ansible/tmp/ansible-tmp-1727204622.4753268-46165-92168263791337 `" ) && sleep 0' 44071 1727204622.48184: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204622.48203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204622.48251: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204622.48260: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204622.48274: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204622.48352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204622.50343: stdout chunk (state=3): >>>ansible-tmp-1727204622.4753268-46165-92168263791337=/root/.ansible/tmp/ansible-tmp-1727204622.4753268-46165-92168263791337 <<< 44071 1727204622.50500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204622.50578: stderr chunk (state=3): >>><<< 44071 1727204622.50616: stdout chunk (state=3): >>><<< 44071 1727204622.50619: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204622.4753268-46165-92168263791337=/root/.ansible/tmp/ansible-tmp-1727204622.4753268-46165-92168263791337 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204622.50679: variable 'ansible_module_compression' from source: unknown 44071 1727204622.50730: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44071 1727204622.50767: variable 'ansible_facts' from source: unknown 44071 1727204622.50835: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204622.4753268-46165-92168263791337/AnsiballZ_command.py 44071 1727204622.50967: Sending initial data 44071 1727204622.50971: Sent initial data (155 bytes) 44071 1727204622.51460: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204622.51468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204622.51499: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204622.51502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204622.51551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204622.51555: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204622.51571: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204622.51649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204622.53278: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204622.53355: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204622.53438: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpwfh6f_i4 /root/.ansible/tmp/ansible-tmp-1727204622.4753268-46165-92168263791337/AnsiballZ_command.py <<< 44071 1727204622.53442: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204622.4753268-46165-92168263791337/AnsiballZ_command.py" <<< 44071 1727204622.53522: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpwfh6f_i4" to remote "/root/.ansible/tmp/ansible-tmp-1727204622.4753268-46165-92168263791337/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204622.4753268-46165-92168263791337/AnsiballZ_command.py" <<< 44071 1727204622.54439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204622.54655: stderr chunk (state=3): >>><<< 44071 1727204622.54660: stdout chunk (state=3): >>><<< 44071 1727204622.54662: done transferring module to remote 44071 1727204622.54664: _low_level_execute_command(): starting 44071 1727204622.54669: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204622.4753268-46165-92168263791337/ /root/.ansible/tmp/ansible-tmp-1727204622.4753268-46165-92168263791337/AnsiballZ_command.py && sleep 0' 44071 1727204622.55368: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204622.55443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204622.55509: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204622.55577: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204622.55649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204622.57588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204622.57617: stdout chunk (state=3): >>><<< 44071 1727204622.57621: stderr chunk (state=3): >>><<< 44071 1727204622.57729: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204622.57734: _low_level_execute_command(): starting 44071 1727204622.57736: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204622.4753268-46165-92168263791337/AnsiballZ_command.py && sleep 0' 44071 1727204622.58371: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204622.58414: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204622.58520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204622.58543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204622.58594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204622.58671: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204622.78617: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (1b005dda-915c-4416-ac36-5cc535674185) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-24 15:03:42.748925", "end": "2024-09-24 15:03:42.784764", "delta": "0:00:00.035839", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44071 1727204622.80198: stderr chunk (state=3): >>>debug2: Received exit status from master 1 
Shared connection to 10.31.47.73 closed. <<< 44071 1727204622.80260: stderr chunk (state=3): >>><<< 44071 1727204622.80264: stdout chunk (state=3): >>><<< 44071 1727204622.80282: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "Connection 'statebr' (1b005dda-915c-4416-ac36-5cc535674185) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-24 15:03:42.748925", "end": "2024-09-24 15:03:42.784764", "delta": "0:00:00.035839", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.47.73 closed. 
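For readability, the shell task that produced the rc=1 above can be reconstructed from the logged module args (_raw_params, _uses_shell) and the task name "Cleanup profile and device". This is a hedged sketch, not the literal contents of tasks/cleanup_profile+device.yml; in the source file the interface name "statebr" is most likely a templated variable rather than a literal, and the ignore_errors setting is inferred from the "...ignoring" marker printed with the result further down.

- name: Cleanup profile and device
  ansible.builtin.shell: |
    nmcli con delete statebr
    nmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr
    rm -f /etc/sysconfig/network-scripts/ifcfg-statebr
    ip link del statebr
  ignore_errors: true  # inferred: the failure (rc=1) is reported but does not stop the play

The non-zero return code is expected here: "nmcli con delete" succeeded, while "nmcli con load" and "ip link del" fail because the ifcfg file and the statebr device no longer exist, exactly as the captured stdout/stderr show.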
44071 1727204622.80320: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204622.4753268-46165-92168263791337/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204622.80329: _low_level_execute_command(): starting 44071 1727204622.80335: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204622.4753268-46165-92168263791337/ > /dev/null 2>&1 && sleep 0' 44071 1727204622.80844: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204622.80848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204622.80857: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204622.80860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204622.80912: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204622.80915: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204622.80918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204622.81015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204622.82945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204622.83017: stderr chunk (state=3): >>><<< 44071 1727204622.83021: stdout chunk (state=3): >>><<< 44071 1727204622.83035: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204622.83045: handler run complete 44071 1727204622.83071: Evaluated conditional (False): False 44071 1727204622.83104: attempt loop complete, returning result 44071 1727204622.83108: _execute() done 44071 1727204622.83110: dumping result to json 44071 1727204622.83112: done dumping result, returning 44071 1727204622.83118: done running TaskExecutor() for managed-node2/TASK: Cleanup profile and device [127b8e07-fff9-c964-7471-0000000009a0] 44071 1727204622.83123: sending task result for task 127b8e07-fff9-c964-7471-0000000009a0 44071 1727204622.83254: done sending task result for task 127b8e07-fff9-c964-7471-0000000009a0 44071 1727204622.83260: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.035839", "end": "2024-09-24 15:03:42.784764", "rc": 1, "start": "2024-09-24 15:03:42.748925" } STDOUT: Connection 'statebr' (1b005dda-915c-4416-ac36-5cc535674185) successfully deleted. STDERR: Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' Cannot find device "statebr" MSG: non-zero return code ...ignoring 44071 1727204622.83428: no more pending results, returning what we have 44071 1727204622.83431: results queue empty 44071 1727204622.83432: checking for any_errors_fatal 44071 1727204622.83433: done checking for any_errors_fatal 44071 1727204622.83434: checking for max_fail_percentage 44071 1727204622.83435: done checking for max_fail_percentage 44071 1727204622.83436: checking to see if all hosts have failed and the running result is not ok 44071 1727204622.83437: done checking to see if all hosts have failed 44071 1727204622.83437: getting the remaining hosts for this loop 44071 1727204622.83439: done getting the remaining hosts for this loop 44071 1727204622.83445: getting the next task for host managed-node2 44071 1727204622.83455: done getting next task for host managed-node2 44071 1727204622.83458: ^ task is: TASK: Include the task 'run_test.yml' 44071 1727204622.83460: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204622.83463: getting variables 44071 1727204622.83466: in VariableManager get_vars() 44071 1727204622.83497: Calling all_inventory to load vars for managed-node2 44071 1727204622.83500: Calling groups_inventory to load vars for managed-node2 44071 1727204622.83503: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204622.83514: Calling all_plugins_play to load vars for managed-node2 44071 1727204622.83516: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204622.83519: Calling groups_plugins_play to load vars for managed-node2 44071 1727204622.84814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204622.86242: done with get_vars() 44071 1727204622.86278: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:65 Tuesday 24 September 2024 15:03:42 -0400 (0:00:00.444) 0:00:35.180 ***** 44071 1727204622.86356: entering _queue_task() for managed-node2/include_tasks 44071 1727204622.86650: worker is 1 (out of 1 available) 44071 1727204622.86663: exiting _queue_task() for managed-node2/include_tasks 44071 1727204622.86680: done queuing things up, now waiting for results queue to drain 44071 1727204622.86682: waiting for pending results... 44071 1727204622.86914: running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' 44071 1727204622.86990: in run() - task 127b8e07-fff9-c964-7471-000000000011 44071 1727204622.87004: variable 'ansible_search_path' from source: unknown 44071 1727204622.87045: calling self._execute() 44071 1727204622.87123: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204622.87127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204622.87142: variable 'omit' from source: magic vars 44071 1727204622.87534: variable 'ansible_distribution_major_version' from source: facts 44071 1727204622.87560: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204622.87564: _execute() done 44071 1727204622.87573: dumping result to json 44071 1727204622.87576: done dumping result, returning 44071 1727204622.87579: done running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' [127b8e07-fff9-c964-7471-000000000011] 44071 1727204622.87582: sending task result for task 127b8e07-fff9-c964-7471-000000000011 44071 1727204622.87777: no more pending results, returning what we have 44071 1727204622.87785: in VariableManager get_vars() 44071 1727204622.87835: Calling all_inventory to load vars for managed-node2 44071 1727204622.87839: Calling groups_inventory to load vars for managed-node2 44071 1727204622.87844: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204622.87861: Calling all_plugins_play to load vars for managed-node2 44071 1727204622.87864: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204622.87871: Calling groups_plugins_play to load vars for managed-node2 44071 1727204622.88485: done sending task result for task 127b8e07-fff9-c964-7471-000000000011 44071 1727204622.88489: WORKER PROCESS EXITING 44071 1727204622.89322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204622.91820: done with get_vars() 44071 1727204622.91905: variable 
'ansible_search_path' from source: unknown 44071 1727204622.91928: we have included files to process 44071 1727204622.91930: generating all_blocks data 44071 1727204622.91989: done generating all_blocks data 44071 1727204622.91998: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 44071 1727204622.92001: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 44071 1727204622.92008: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 44071 1727204622.92514: in VariableManager get_vars() 44071 1727204622.92529: done with get_vars() 44071 1727204622.92587: in VariableManager get_vars() 44071 1727204622.92611: done with get_vars() 44071 1727204622.92663: in VariableManager get_vars() 44071 1727204622.92688: done with get_vars() 44071 1727204622.92734: in VariableManager get_vars() 44071 1727204622.92746: done with get_vars() 44071 1727204622.92776: in VariableManager get_vars() 44071 1727204622.92795: done with get_vars() 44071 1727204622.93168: in VariableManager get_vars() 44071 1727204622.93183: done with get_vars() 44071 1727204622.93193: done processing included file 44071 1727204622.93194: iterating over new_blocks loaded from include file 44071 1727204622.93195: in VariableManager get_vars() 44071 1727204622.93203: done with get_vars() 44071 1727204622.93204: filtering new block on tags 44071 1727204622.93273: done filtering new block on tags 44071 1727204622.93275: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed-node2 44071 1727204622.93281: extending task lists for all hosts with included blocks 44071 1727204622.93324: done extending task lists 44071 1727204622.93325: done processing included files 44071 1727204622.93326: results queue empty 44071 1727204622.93326: checking for any_errors_fatal 44071 1727204622.93332: done checking for any_errors_fatal 44071 1727204622.93333: checking for max_fail_percentage 44071 1727204622.93334: done checking for max_fail_percentage 44071 1727204622.93335: checking to see if all hosts have failed and the running result is not ok 44071 1727204622.93335: done checking to see if all hosts have failed 44071 1727204622.93336: getting the remaining hosts for this loop 44071 1727204622.93337: done getting the remaining hosts for this loop 44071 1727204622.93340: getting the next task for host managed-node2 44071 1727204622.93344: done getting next task for host managed-node2 44071 1727204622.93347: ^ task is: TASK: TEST: {{ lsr_description }} 44071 1727204622.93349: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204622.93351: getting variables 44071 1727204622.93352: in VariableManager get_vars() 44071 1727204622.93362: Calling all_inventory to load vars for managed-node2 44071 1727204622.93364: Calling groups_inventory to load vars for managed-node2 44071 1727204622.93368: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204622.93375: Calling all_plugins_play to load vars for managed-node2 44071 1727204622.93378: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204622.93380: Calling groups_plugins_play to load vars for managed-node2 44071 1727204622.95774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204622.97956: done with get_vars() 44071 1727204622.97999: done getting variables 44071 1727204622.98047: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204622.98185: variable 'lsr_description' from source: include params TASK [TEST: I can activate an existing profile] ******************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Tuesday 24 September 2024 15:03:42 -0400 (0:00:00.118) 0:00:35.298 ***** 44071 1727204622.98218: entering _queue_task() for managed-node2/debug 44071 1727204622.98623: worker is 1 (out of 1 available) 44071 1727204622.98637: exiting _queue_task() for managed-node2/debug 44071 1727204622.98655: done queuing things up, now waiting for results queue to drain 44071 1727204622.98657: waiting for pending results... 
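The task queued here (run_test.yml:5) is a debug task whose name is templated from lsr_description, which is why the header reads "TEST: I can activate an existing profile". Below is a minimal sketch consistent with that header and the "##########" message printed in the following output; the exact wording in run_test.yml may differ.

- name: "TEST: {{ lsr_description }}"
  ansible.builtin.debug:
    msg: "########## {{ lsr_description }} ##########"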
44071 1727204622.98988: running TaskExecutor() for managed-node2/TASK: TEST: I can activate an existing profile 44071 1727204622.99127: in run() - task 127b8e07-fff9-c964-7471-000000000a49 44071 1727204622.99154: variable 'ansible_search_path' from source: unknown 44071 1727204622.99162: variable 'ansible_search_path' from source: unknown 44071 1727204622.99212: calling self._execute() 44071 1727204622.99325: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204622.99345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204622.99360: variable 'omit' from source: magic vars 44071 1727204622.99785: variable 'ansible_distribution_major_version' from source: facts 44071 1727204622.99805: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204622.99817: variable 'omit' from source: magic vars 44071 1727204622.99858: variable 'omit' from source: magic vars 44071 1727204623.00071: variable 'lsr_description' from source: include params 44071 1727204623.00075: variable 'omit' from source: magic vars 44071 1727204623.00078: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204623.00112: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204623.00139: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204623.00163: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204623.00185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204623.00227: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204623.00236: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.00245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.00371: Set connection var ansible_connection to ssh 44071 1727204623.00384: Set connection var ansible_timeout to 10 44071 1727204623.00394: Set connection var ansible_pipelining to False 44071 1727204623.00403: Set connection var ansible_shell_type to sh 44071 1727204623.00416: Set connection var ansible_shell_executable to /bin/sh 44071 1727204623.00430: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204623.00460: variable 'ansible_shell_executable' from source: unknown 44071 1727204623.00471: variable 'ansible_connection' from source: unknown 44071 1727204623.00525: variable 'ansible_module_compression' from source: unknown 44071 1727204623.00527: variable 'ansible_shell_type' from source: unknown 44071 1727204623.00529: variable 'ansible_shell_executable' from source: unknown 44071 1727204623.00531: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.00533: variable 'ansible_pipelining' from source: unknown 44071 1727204623.00534: variable 'ansible_timeout' from source: unknown 44071 1727204623.00536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.00659: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 44071 1727204623.00681: variable 'omit' from source: magic vars 44071 1727204623.00690: starting attempt loop 44071 1727204623.00697: running the handler 44071 1727204623.00758: handler run complete 44071 1727204623.00782: attempt loop complete, returning result 44071 1727204623.00971: _execute() done 44071 1727204623.00975: dumping result to json 44071 1727204623.00977: done dumping result, returning 44071 1727204623.00979: done running TaskExecutor() for managed-node2/TASK: TEST: I can activate an existing profile [127b8e07-fff9-c964-7471-000000000a49] 44071 1727204623.00982: sending task result for task 127b8e07-fff9-c964-7471-000000000a49 44071 1727204623.01061: done sending task result for task 127b8e07-fff9-c964-7471-000000000a49 44071 1727204623.01064: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: ########## I can activate an existing profile ########## 44071 1727204623.01128: no more pending results, returning what we have 44071 1727204623.01133: results queue empty 44071 1727204623.01134: checking for any_errors_fatal 44071 1727204623.01135: done checking for any_errors_fatal 44071 1727204623.01136: checking for max_fail_percentage 44071 1727204623.01138: done checking for max_fail_percentage 44071 1727204623.01139: checking to see if all hosts have failed and the running result is not ok 44071 1727204623.01140: done checking to see if all hosts have failed 44071 1727204623.01141: getting the remaining hosts for this loop 44071 1727204623.01143: done getting the remaining hosts for this loop 44071 1727204623.01149: getting the next task for host managed-node2 44071 1727204623.01158: done getting next task for host managed-node2 44071 1727204623.01162: ^ task is: TASK: Show item 44071 1727204623.01168: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204623.01172: getting variables 44071 1727204623.01174: in VariableManager get_vars() 44071 1727204623.01216: Calling all_inventory to load vars for managed-node2 44071 1727204623.01219: Calling groups_inventory to load vars for managed-node2 44071 1727204623.01224: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204623.01240: Calling all_plugins_play to load vars for managed-node2 44071 1727204623.01244: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204623.01248: Calling groups_plugins_play to load vars for managed-node2 44071 1727204623.03185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204623.05329: done with get_vars() 44071 1727204623.05370: done getting variables 44071 1727204623.05440: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Tuesday 24 September 2024 15:03:43 -0400 (0:00:00.072) 0:00:35.371 ***** 44071 1727204623.05475: entering _queue_task() for managed-node2/debug 44071 1727204623.05872: worker is 1 (out of 1 available) 44071 1727204623.05888: exiting _queue_task() for managed-node2/debug 44071 1727204623.05903: done queuing things up, now waiting for results queue to drain 44071 1727204623.05905: waiting for pending results... 44071 1727204623.06291: running TaskExecutor() for managed-node2/TASK: Show item 44071 1727204623.06352: in run() - task 127b8e07-fff9-c964-7471-000000000a4a 44071 1727204623.06377: variable 'ansible_search_path' from source: unknown 44071 1727204623.06393: variable 'ansible_search_path' from source: unknown 44071 1727204623.06458: variable 'omit' from source: magic vars 44071 1727204623.06638: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.06658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.06678: variable 'omit' from source: magic vars 44071 1727204623.07103: variable 'ansible_distribution_major_version' from source: facts 44071 1727204623.07122: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204623.07132: variable 'omit' from source: magic vars 44071 1727204623.07181: variable 'omit' from source: magic vars 44071 1727204623.07236: variable 'item' from source: unknown 44071 1727204623.07325: variable 'item' from source: unknown 44071 1727204623.07351: variable 'omit' from source: magic vars 44071 1727204623.07472: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204623.07477: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204623.07480: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204623.07503: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204623.07522: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 44071 1727204623.07557: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204623.07569: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.07579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.07690: Set connection var ansible_connection to ssh 44071 1727204623.07705: Set connection var ansible_timeout to 10 44071 1727204623.07714: Set connection var ansible_pipelining to False 44071 1727204623.07722: Set connection var ansible_shell_type to sh 44071 1727204623.07730: Set connection var ansible_shell_executable to /bin/sh 44071 1727204623.07740: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204623.07768: variable 'ansible_shell_executable' from source: unknown 44071 1727204623.07806: variable 'ansible_connection' from source: unknown 44071 1727204623.07809: variable 'ansible_module_compression' from source: unknown 44071 1727204623.07812: variable 'ansible_shell_type' from source: unknown 44071 1727204623.07814: variable 'ansible_shell_executable' from source: unknown 44071 1727204623.07816: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.07818: variable 'ansible_pipelining' from source: unknown 44071 1727204623.07820: variable 'ansible_timeout' from source: unknown 44071 1727204623.07823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.07976: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204623.08025: variable 'omit' from source: magic vars 44071 1727204623.08028: starting attempt loop 44071 1727204623.08030: running the handler 44071 1727204623.08071: variable 'lsr_description' from source: include params 44071 1727204623.08168: variable 'lsr_description' from source: include params 44071 1727204623.08241: handler run complete 44071 1727204623.08244: attempt loop complete, returning result 44071 1727204623.08247: variable 'item' from source: unknown 44071 1727204623.08309: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can activate an existing profile" } 44071 1727204623.08694: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.08698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.08700: variable 'omit' from source: magic vars 44071 1727204623.08775: variable 'ansible_distribution_major_version' from source: facts 44071 1727204623.08787: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204623.08797: variable 'omit' from source: magic vars 44071 1727204623.08824: variable 'omit' from source: magic vars 44071 1727204623.08874: variable 'item' from source: unknown 44071 1727204623.08948: variable 'item' from source: unknown 44071 1727204623.09034: variable 'omit' from source: magic vars 44071 1727204623.09037: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204623.09040: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204623.09043: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204623.09045: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204623.09048: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.09050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.09138: Set connection var ansible_connection to ssh 44071 1727204623.09153: Set connection var ansible_timeout to 10 44071 1727204623.09161: Set connection var ansible_pipelining to False 44071 1727204623.09171: Set connection var ansible_shell_type to sh 44071 1727204623.09179: Set connection var ansible_shell_executable to /bin/sh 44071 1727204623.09188: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204623.09210: variable 'ansible_shell_executable' from source: unknown 44071 1727204623.09215: variable 'ansible_connection' from source: unknown 44071 1727204623.09221: variable 'ansible_module_compression' from source: unknown 44071 1727204623.09228: variable 'ansible_shell_type' from source: unknown 44071 1727204623.09233: variable 'ansible_shell_executable' from source: unknown 44071 1727204623.09252: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.09255: variable 'ansible_pipelining' from source: unknown 44071 1727204623.09257: variable 'ansible_timeout' from source: unknown 44071 1727204623.09258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.09569: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204623.09573: variable 'omit' from source: magic vars 44071 1727204623.09575: starting attempt loop 44071 1727204623.09578: running the handler 44071 1727204623.09579: variable 'lsr_setup' from source: include params 44071 1727204623.09581: variable 'lsr_setup' from source: include params 44071 1727204623.09583: handler run complete 44071 1727204623.09584: attempt loop complete, returning result 44071 1727204623.09586: variable 'item' from source: unknown 44071 1727204623.09651: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ "tasks/create_bridge_profile.yml" ] } 44071 1727204623.10046: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.10075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.10155: variable 'omit' from source: magic vars 44071 1727204623.10592: variable 'ansible_distribution_major_version' from source: facts 44071 1727204623.10596: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204623.10598: variable 'omit' from source: magic vars 44071 1727204623.10601: variable 'omit' from source: magic vars 44071 1727204623.10603: variable 'item' from source: unknown 44071 1727204623.10746: variable 'item' from source: unknown 44071 1727204623.10770: variable 'omit' from source: magic vars 44071 1727204623.10835: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204623.10929: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204623.10941: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204623.10961: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204623.10970: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.10977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.11183: Set connection var ansible_connection to ssh 44071 1727204623.11195: Set connection var ansible_timeout to 10 44071 1727204623.11204: Set connection var ansible_pipelining to False 44071 1727204623.11213: Set connection var ansible_shell_type to sh 44071 1727204623.11223: Set connection var ansible_shell_executable to /bin/sh 44071 1727204623.11234: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204623.11473: variable 'ansible_shell_executable' from source: unknown 44071 1727204623.11476: variable 'ansible_connection' from source: unknown 44071 1727204623.11479: variable 'ansible_module_compression' from source: unknown 44071 1727204623.11481: variable 'ansible_shell_type' from source: unknown 44071 1727204623.11483: variable 'ansible_shell_executable' from source: unknown 44071 1727204623.11485: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.11487: variable 'ansible_pipelining' from source: unknown 44071 1727204623.11489: variable 'ansible_timeout' from source: unknown 44071 1727204623.11491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.11546: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204623.11801: variable 'omit' from source: magic vars 44071 1727204623.11804: starting attempt loop 44071 1727204623.11806: running the handler 44071 1727204623.11809: variable 'lsr_test' from source: include params 44071 1727204623.11830: variable 'lsr_test' from source: include params 44071 1727204623.11855: handler run complete 44071 1727204623.11878: attempt loop complete, returning result 44071 1727204623.11900: variable 'item' from source: unknown 44071 1727204623.11979: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/activate_profile.yml" ] } 44071 1727204623.12167: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.12205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.12208: variable 'omit' from source: magic vars 44071 1727204623.12379: variable 'ansible_distribution_major_version' from source: facts 44071 1727204623.12390: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204623.12424: variable 'omit' from source: magic vars 44071 1727204623.12428: variable 'omit' from source: magic vars 44071 1727204623.12472: variable 'item' from source: unknown 44071 1727204623.12548: 
variable 'item' from source: unknown 44071 1727204623.12571: variable 'omit' from source: magic vars 44071 1727204623.12640: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204623.12644: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204623.12646: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204623.12649: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204623.12651: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.12654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.12735: Set connection var ansible_connection to ssh 44071 1727204623.12750: Set connection var ansible_timeout to 10 44071 1727204623.12761: Set connection var ansible_pipelining to False 44071 1727204623.12772: Set connection var ansible_shell_type to sh 44071 1727204623.12782: Set connection var ansible_shell_executable to /bin/sh 44071 1727204623.12857: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204623.12860: variable 'ansible_shell_executable' from source: unknown 44071 1727204623.12862: variable 'ansible_connection' from source: unknown 44071 1727204623.12863: variable 'ansible_module_compression' from source: unknown 44071 1727204623.12868: variable 'ansible_shell_type' from source: unknown 44071 1727204623.12870: variable 'ansible_shell_executable' from source: unknown 44071 1727204623.12872: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.12874: variable 'ansible_pipelining' from source: unknown 44071 1727204623.12876: variable 'ansible_timeout' from source: unknown 44071 1727204623.12878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.12962: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204623.12982: variable 'omit' from source: magic vars 44071 1727204623.12991: starting attempt loop 44071 1727204623.12999: running the handler 44071 1727204623.13024: variable 'lsr_assert' from source: include params 44071 1727204623.13112: variable 'lsr_assert' from source: include params 44071 1727204623.13138: handler run complete 44071 1727204623.13191: attempt loop complete, returning result 44071 1727204623.13195: variable 'item' from source: unknown 44071 1727204623.13259: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_device_present.yml", "tasks/assert_profile_present.yml" ] } 44071 1727204623.13572: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.13576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.13578: variable 'omit' from source: magic vars 44071 1727204623.13685: variable 'ansible_distribution_major_version' from source: facts 44071 1727204623.13703: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 
1727204623.13770: variable 'omit' from source: magic vars 44071 1727204623.13773: variable 'omit' from source: magic vars 44071 1727204623.13783: variable 'item' from source: unknown 44071 1727204623.13858: variable 'item' from source: unknown 44071 1727204623.13880: variable 'omit' from source: magic vars 44071 1727204623.13909: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204623.13925: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204623.13935: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204623.13953: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204623.13960: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.13969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.14056: Set connection var ansible_connection to ssh 44071 1727204623.14133: Set connection var ansible_timeout to 10 44071 1727204623.14136: Set connection var ansible_pipelining to False 44071 1727204623.14138: Set connection var ansible_shell_type to sh 44071 1727204623.14140: Set connection var ansible_shell_executable to /bin/sh 44071 1727204623.14143: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204623.14145: variable 'ansible_shell_executable' from source: unknown 44071 1727204623.14147: variable 'ansible_connection' from source: unknown 44071 1727204623.14149: variable 'ansible_module_compression' from source: unknown 44071 1727204623.14151: variable 'ansible_shell_type' from source: unknown 44071 1727204623.14153: variable 'ansible_shell_executable' from source: unknown 44071 1727204623.14155: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.14156: variable 'ansible_pipelining' from source: unknown 44071 1727204623.14161: variable 'ansible_timeout' from source: unknown 44071 1727204623.14172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.14282: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204623.14295: variable 'omit' from source: magic vars 44071 1727204623.14303: starting attempt loop 44071 1727204623.14310: running the handler 44071 1727204623.14436: handler run complete 44071 1727204623.14456: attempt loop complete, returning result 44071 1727204623.14671: variable 'item' from source: unknown 44071 1727204623.14674: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 44071 1727204623.14752: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.14755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.14757: variable 'omit' from source: magic vars 44071 1727204623.14971: variable 'ansible_distribution_major_version' from source: facts 44071 1727204623.14974: Evaluated conditional 
(ansible_distribution_major_version != '6'): True 44071 1727204623.14979: variable 'omit' from source: magic vars 44071 1727204623.14984: variable 'omit' from source: magic vars 44071 1727204623.14986: variable 'item' from source: unknown 44071 1727204623.15044: variable 'item' from source: unknown 44071 1727204623.15067: variable 'omit' from source: magic vars 44071 1727204623.15095: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204623.15107: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204623.15116: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204623.15131: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204623.15203: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.15206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.15231: Set connection var ansible_connection to ssh 44071 1727204623.15242: Set connection var ansible_timeout to 10 44071 1727204623.15251: Set connection var ansible_pipelining to False 44071 1727204623.15260: Set connection var ansible_shell_type to sh 44071 1727204623.15273: Set connection var ansible_shell_executable to /bin/sh 44071 1727204623.15285: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204623.15317: variable 'ansible_shell_executable' from source: unknown 44071 1727204623.15325: variable 'ansible_connection' from source: unknown 44071 1727204623.15332: variable 'ansible_module_compression' from source: unknown 44071 1727204623.15338: variable 'ansible_shell_type' from source: unknown 44071 1727204623.15345: variable 'ansible_shell_executable' from source: unknown 44071 1727204623.15352: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.15359: variable 'ansible_pipelining' from source: unknown 44071 1727204623.15368: variable 'ansible_timeout' from source: unknown 44071 1727204623.15377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.15531: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204623.15534: variable 'omit' from source: magic vars 44071 1727204623.15537: starting attempt loop 44071 1727204623.15539: running the handler 44071 1727204623.15545: variable 'lsr_fail_debug' from source: play vars 44071 1727204623.15625: variable 'lsr_fail_debug' from source: play vars 44071 1727204623.15653: handler run complete 44071 1727204623.15675: attempt loop complete, returning result 44071 1727204623.15694: variable 'item' from source: unknown 44071 1727204623.15856: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 44071 1727204623.15965: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.15968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.15979: 
variable 'omit' from source: magic vars 44071 1727204623.16145: variable 'ansible_distribution_major_version' from source: facts 44071 1727204623.16182: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204623.16186: variable 'omit' from source: magic vars 44071 1727204623.16188: variable 'omit' from source: magic vars 44071 1727204623.16235: variable 'item' from source: unknown 44071 1727204623.16310: variable 'item' from source: unknown 44071 1727204623.16399: variable 'omit' from source: magic vars 44071 1727204623.16402: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204623.16405: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204623.16407: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204623.16414: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204623.16417: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.16419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.16493: Set connection var ansible_connection to ssh 44071 1727204623.16509: Set connection var ansible_timeout to 10 44071 1727204623.16520: Set connection var ansible_pipelining to False 44071 1727204623.16530: Set connection var ansible_shell_type to sh 44071 1727204623.16540: Set connection var ansible_shell_executable to /bin/sh 44071 1727204623.16552: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204623.16580: variable 'ansible_shell_executable' from source: unknown 44071 1727204623.16588: variable 'ansible_connection' from source: unknown 44071 1727204623.16595: variable 'ansible_module_compression' from source: unknown 44071 1727204623.16602: variable 'ansible_shell_type' from source: unknown 44071 1727204623.16723: variable 'ansible_shell_executable' from source: unknown 44071 1727204623.16726: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.16729: variable 'ansible_pipelining' from source: unknown 44071 1727204623.16731: variable 'ansible_timeout' from source: unknown 44071 1727204623.16733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.16741: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204623.16753: variable 'omit' from source: magic vars 44071 1727204623.16761: starting attempt loop 44071 1727204623.16769: running the handler 44071 1727204623.16793: variable 'lsr_cleanup' from source: include params 44071 1727204623.16874: variable 'lsr_cleanup' from source: include params 44071 1727204623.16897: handler run complete 44071 1727204623.16917: attempt loop complete, returning result 44071 1727204623.16942: variable 'item' from source: unknown 44071 1727204623.17014: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 44071 1727204623.17271: dumping result to json 
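The "Show item" results above come from a single debug task looping over a fixed list of variable names; each item is resolved as a variable name, which is why the undefined lsr_assert_when prints "VARIABLE IS NOT DEFINED!" instead of failing the task. A sketch assuming the conventional var-lookup pattern; run_test.yml:9 may differ in detail.

- name: Show item
  ansible.builtin.debug:
    var: "{{ item }}"  # item is a variable *name*; debug resolves it per iteration
  loop:
    - lsr_description
    - lsr_setup
    - lsr_test
    - lsr_assert
    - lsr_assert_when
    - lsr_fail_debug
    - lsr_cleanup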
44071 1727204623.17275: done dumping result, returning 44071 1727204623.17278: done running TaskExecutor() for managed-node2/TASK: Show item [127b8e07-fff9-c964-7471-000000000a4a] 44071 1727204623.17281: sending task result for task 127b8e07-fff9-c964-7471-000000000a4a 44071 1727204623.17340: done sending task result for task 127b8e07-fff9-c964-7471-000000000a4a 44071 1727204623.17343: WORKER PROCESS EXITING 44071 1727204623.17405: no more pending results, returning what we have 44071 1727204623.17409: results queue empty 44071 1727204623.17410: checking for any_errors_fatal 44071 1727204623.17417: done checking for any_errors_fatal 44071 1727204623.17418: checking for max_fail_percentage 44071 1727204623.17419: done checking for max_fail_percentage 44071 1727204623.17420: checking to see if all hosts have failed and the running result is not ok 44071 1727204623.17421: done checking to see if all hosts have failed 44071 1727204623.17421: getting the remaining hosts for this loop 44071 1727204623.17423: done getting the remaining hosts for this loop 44071 1727204623.17429: getting the next task for host managed-node2 44071 1727204623.17436: done getting next task for host managed-node2 44071 1727204623.17439: ^ task is: TASK: Include the task 'show_interfaces.yml' 44071 1727204623.17443: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204623.17446: getting variables 44071 1727204623.17448: in VariableManager get_vars() 44071 1727204623.17486: Calling all_inventory to load vars for managed-node2 44071 1727204623.17489: Calling groups_inventory to load vars for managed-node2 44071 1727204623.17494: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204623.17509: Calling all_plugins_play to load vars for managed-node2 44071 1727204623.17512: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204623.17515: Calling groups_plugins_play to load vars for managed-node2 44071 1727204623.20759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204623.25176: done with get_vars() 44071 1727204623.25219: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Tuesday 24 September 2024 15:03:43 -0400 (0:00:00.198) 0:00:35.569 ***** 44071 1727204623.25321: entering _queue_task() for managed-node2/include_tasks 44071 1727204623.26220: worker is 1 (out of 1 available) 44071 1727204623.26238: exiting _queue_task() for managed-node2/include_tasks 44071 1727204623.26255: done queuing things up, now waiting for results queue to drain 44071 1727204623.26257: waiting for pending results... 
44071 1727204623.27608: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 44071 1727204623.27616: in run() - task 127b8e07-fff9-c964-7471-000000000a4b 44071 1727204623.27620: variable 'ansible_search_path' from source: unknown 44071 1727204623.27624: variable 'ansible_search_path' from source: unknown 44071 1727204623.27736: calling self._execute() 44071 1727204623.28147: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.28151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.28155: variable 'omit' from source: magic vars 44071 1727204623.28926: variable 'ansible_distribution_major_version' from source: facts 44071 1727204623.28950: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204623.28962: _execute() done 44071 1727204623.28977: dumping result to json 44071 1727204623.28986: done dumping result, returning 44071 1727204623.28998: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [127b8e07-fff9-c964-7471-000000000a4b] 44071 1727204623.29014: sending task result for task 127b8e07-fff9-c964-7471-000000000a4b 44071 1727204623.29306: no more pending results, returning what we have 44071 1727204623.29313: in VariableManager get_vars() 44071 1727204623.29357: Calling all_inventory to load vars for managed-node2 44071 1727204623.29361: Calling groups_inventory to load vars for managed-node2 44071 1727204623.29367: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204623.29387: Calling all_plugins_play to load vars for managed-node2 44071 1727204623.29391: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204623.29396: Calling groups_plugins_play to load vars for managed-node2 44071 1727204623.29986: done sending task result for task 127b8e07-fff9-c964-7471-000000000a4b 44071 1727204623.29990: WORKER PROCESS EXITING 44071 1727204623.40458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204623.43585: done with get_vars() 44071 1727204623.43628: variable 'ansible_search_path' from source: unknown 44071 1727204623.43630: variable 'ansible_search_path' from source: unknown 44071 1727204623.43887: we have included files to process 44071 1727204623.43889: generating all_blocks data 44071 1727204623.43890: done generating all_blocks data 44071 1727204623.43894: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44071 1727204623.43895: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44071 1727204623.43897: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44071 1727204623.44011: in VariableManager get_vars() 44071 1727204623.44035: done with get_vars() 44071 1727204623.44154: done processing included file 44071 1727204623.44156: iterating over new_blocks loaded from include file 44071 1727204623.44157: in VariableManager get_vars() 44071 1727204623.44303: done with get_vars() 44071 1727204623.44305: filtering new block on tags 44071 1727204623.44344: done filtering new block on tags 44071 1727204623.44347: done iterating over new_blocks loaded from include file included: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 44071 1727204623.44352: extending task lists for all hosts with included blocks 44071 1727204623.44873: done extending task lists 44071 1727204623.44875: done processing included files 44071 1727204623.44876: results queue empty 44071 1727204623.44877: checking for any_errors_fatal 44071 1727204623.44883: done checking for any_errors_fatal 44071 1727204623.44884: checking for max_fail_percentage 44071 1727204623.44885: done checking for max_fail_percentage 44071 1727204623.44886: checking to see if all hosts have failed and the running result is not ok 44071 1727204623.44887: done checking to see if all hosts have failed 44071 1727204623.44888: getting the remaining hosts for this loop 44071 1727204623.44890: done getting the remaining hosts for this loop 44071 1727204623.44892: getting the next task for host managed-node2 44071 1727204623.44897: done getting next task for host managed-node2 44071 1727204623.44900: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 44071 1727204623.44903: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204623.44905: getting variables 44071 1727204623.44906: in VariableManager get_vars() 44071 1727204623.44920: Calling all_inventory to load vars for managed-node2 44071 1727204623.44922: Calling groups_inventory to load vars for managed-node2 44071 1727204623.44925: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204623.44933: Calling all_plugins_play to load vars for managed-node2 44071 1727204623.44935: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204623.44939: Calling groups_plugins_play to load vars for managed-node2 44071 1727204623.46493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204623.49983: done with get_vars() 44071 1727204623.50024: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:03:43 -0400 (0:00:00.248) 0:00:35.817 ***** 44071 1727204623.50124: entering _queue_task() for managed-node2/include_tasks 44071 1727204623.50611: worker is 1 (out of 1 available) 44071 1727204623.50623: exiting _queue_task() for managed-node2/include_tasks 44071 1727204623.50637: done queuing things up, now waiting for results queue to drain 44071 1727204623.50639: waiting for pending results... 
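show_interfaces.yml itself (tasks at lines 3 and 5 of that file, per the task paths in this log) chains another include and then prints the gathered list. A hedged sketch, reconstructed only from the task names, paths, and the debug message that appears later in this run:

    # Hypothetical reconstruction of show_interfaces.yml, inferred from the
    # task paths (:3, :5) and output seen in this log, not from the source file.
    - name: Include the task 'get_current_interfaces.yml'
      include_tasks: get_current_interfaces.yml

    - name: Show current_interfaces
      debug:
        msg: "current_interfaces: {{ current_interfaces }}"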
44071 1727204623.50882: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 44071 1727204623.51031: in run() - task 127b8e07-fff9-c964-7471-000000000a72 44071 1727204623.51056: variable 'ansible_search_path' from source: unknown 44071 1727204623.51066: variable 'ansible_search_path' from source: unknown 44071 1727204623.51118: calling self._execute() 44071 1727204623.51242: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.51260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.51278: variable 'omit' from source: magic vars 44071 1727204623.51706: variable 'ansible_distribution_major_version' from source: facts 44071 1727204623.51729: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204623.51741: _execute() done 44071 1727204623.51762: dumping result to json 44071 1727204623.51767: done dumping result, returning 44071 1727204623.51972: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [127b8e07-fff9-c964-7471-000000000a72] 44071 1727204623.51976: sending task result for task 127b8e07-fff9-c964-7471-000000000a72 44071 1727204623.52064: done sending task result for task 127b8e07-fff9-c964-7471-000000000a72 44071 1727204623.52069: WORKER PROCESS EXITING 44071 1727204623.52101: no more pending results, returning what we have 44071 1727204623.52106: in VariableManager get_vars() 44071 1727204623.52149: Calling all_inventory to load vars for managed-node2 44071 1727204623.52152: Calling groups_inventory to load vars for managed-node2 44071 1727204623.52156: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204623.52176: Calling all_plugins_play to load vars for managed-node2 44071 1727204623.52180: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204623.52183: Calling groups_plugins_play to load vars for managed-node2 44071 1727204623.54161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204623.56302: done with get_vars() 44071 1727204623.56333: variable 'ansible_search_path' from source: unknown 44071 1727204623.56335: variable 'ansible_search_path' from source: unknown 44071 1727204623.56376: we have included files to process 44071 1727204623.56377: generating all_blocks data 44071 1727204623.56379: done generating all_blocks data 44071 1727204623.56380: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44071 1727204623.56381: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44071 1727204623.56384: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44071 1727204623.56663: done processing included file 44071 1727204623.56667: iterating over new_blocks loaded from include file 44071 1727204623.56668: in VariableManager get_vars() 44071 1727204623.56686: done with get_vars() 44071 1727204623.56688: filtering new block on tags 44071 1727204623.56731: done filtering new block on tags 44071 1727204623.56735: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed-node2 44071 1727204623.56741: extending task lists for all hosts with included blocks 44071 1727204623.57131: done extending task lists 44071 1727204623.57132: done processing included files 44071 1727204623.57133: results queue empty 44071 1727204623.57134: checking for any_errors_fatal 44071 1727204623.57138: done checking for any_errors_fatal 44071 1727204623.57139: checking for max_fail_percentage 44071 1727204623.57140: done checking for max_fail_percentage 44071 1727204623.57141: checking to see if all hosts have failed and the running result is not ok 44071 1727204623.57141: done checking to see if all hosts have failed 44071 1727204623.57142: getting the remaining hosts for this loop 44071 1727204623.57144: done getting the remaining hosts for this loop 44071 1727204623.57147: getting the next task for host managed-node2 44071 1727204623.57152: done getting next task for host managed-node2 44071 1727204623.57154: ^ task is: TASK: Gather current interface info 44071 1727204623.57158: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204623.57161: getting variables 44071 1727204623.57162: in VariableManager get_vars() 44071 1727204623.57176: Calling all_inventory to load vars for managed-node2 44071 1727204623.57179: Calling groups_inventory to load vars for managed-node2 44071 1727204623.57182: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204623.57189: Calling all_plugins_play to load vars for managed-node2 44071 1727204623.57191: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204623.57195: Calling groups_plugins_play to load vars for managed-node2 44071 1727204623.60379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204623.62564: done with get_vars() 44071 1727204623.62605: done getting variables 44071 1727204623.62657: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:03:43 -0400 (0:00:00.125) 0:00:35.943 ***** 44071 1727204623.62696: entering _queue_task() for managed-node2/command 44071 1727204623.63094: worker is 1 (out of 1 available) 44071 1727204623.63109: exiting _queue_task() for managed-node2/command 44071 1727204623.63124: done queuing things up, now waiting for results queue to drain 44071 1727204623.63126: waiting for pending results... 
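The 'Gather current interface info' task (get_current_interfaces.yml:3) runs the command module on the remote host. Judging from the module invocation logged further down (chdir=/sys/class/net, raw params 'ls -1') and the registered variable name '_current_interfaces' referenced by the following set_fact task, it is roughly equivalent to:

    # Hedged sketch of get_current_interfaces.yml:3; the argument values are
    # taken from the module invocation logged below, the register name from
    # the later 'Set current_interfaces' task.
    - name: Gather current interface info
      command:
        cmd: ls -1
        chdir: /sys/class/net
      register: _current_interfaces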
44071 1727204623.63589: running TaskExecutor() for managed-node2/TASK: Gather current interface info 44071 1727204623.63611: in run() - task 127b8e07-fff9-c964-7471-000000000aad 44071 1727204623.63639: variable 'ansible_search_path' from source: unknown 44071 1727204623.63647: variable 'ansible_search_path' from source: unknown 44071 1727204623.63699: calling self._execute() 44071 1727204623.63823: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.63837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.63851: variable 'omit' from source: magic vars 44071 1727204623.64315: variable 'ansible_distribution_major_version' from source: facts 44071 1727204623.64337: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204623.64350: variable 'omit' from source: magic vars 44071 1727204623.64423: variable 'omit' from source: magic vars 44071 1727204623.64584: variable 'omit' from source: magic vars 44071 1727204623.64589: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204623.64592: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204623.64609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204623.64634: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204623.64655: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204623.64700: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204623.64709: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.64717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.64842: Set connection var ansible_connection to ssh 44071 1727204623.64856: Set connection var ansible_timeout to 10 44071 1727204623.64868: Set connection var ansible_pipelining to False 44071 1727204623.64880: Set connection var ansible_shell_type to sh 44071 1727204623.64891: Set connection var ansible_shell_executable to /bin/sh 44071 1727204623.64906: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204623.64943: variable 'ansible_shell_executable' from source: unknown 44071 1727204623.64953: variable 'ansible_connection' from source: unknown 44071 1727204623.64960: variable 'ansible_module_compression' from source: unknown 44071 1727204623.64970: variable 'ansible_shell_type' from source: unknown 44071 1727204623.64977: variable 'ansible_shell_executable' from source: unknown 44071 1727204623.64983: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204623.64991: variable 'ansible_pipelining' from source: unknown 44071 1727204623.64998: variable 'ansible_timeout' from source: unknown 44071 1727204623.65006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204623.65242: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204623.65246: variable 'omit' from source: magic vars 44071 
1727204623.65248: starting attempt loop 44071 1727204623.65250: running the handler 44071 1727204623.65253: _low_level_execute_command(): starting 44071 1727204623.65255: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204623.66135: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204623.66173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204623.66195: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204623.66224: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204623.66498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204623.68742: stdout chunk (state=3): >>>/root <<< 44071 1727204623.68747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204623.68750: stdout chunk (state=3): >>><<< 44071 1727204623.68753: stderr chunk (state=3): >>><<< 44071 1727204623.68756: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204623.68759: _low_level_execute_command(): starting 44071 1727204623.68763: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204623.6864965-46205-178182449002132 `" && echo ansible-tmp-1727204623.6864965-46205-178182449002132="` echo 
/root/.ansible/tmp/ansible-tmp-1727204623.6864965-46205-178182449002132 `" ) && sleep 0' 44071 1727204623.70108: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204623.70352: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204623.70387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204623.70500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204623.72583: stdout chunk (state=3): >>>ansible-tmp-1727204623.6864965-46205-178182449002132=/root/.ansible/tmp/ansible-tmp-1727204623.6864965-46205-178182449002132 <<< 44071 1727204623.72683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204623.72719: stderr chunk (state=3): >>><<< 44071 1727204623.72785: stdout chunk (state=3): >>><<< 44071 1727204623.72812: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204623.6864965-46205-178182449002132=/root/.ansible/tmp/ansible-tmp-1727204623.6864965-46205-178182449002132 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204623.72920: variable 'ansible_module_compression' from source: unknown 44071 1727204623.73173: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44071 1727204623.73176: variable 'ansible_facts' from source: unknown 44071 1727204623.73332: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204623.6864965-46205-178182449002132/AnsiballZ_command.py 44071 1727204623.73517: Sending initial data 44071 1727204623.73526: Sent initial data (156 bytes) 44071 1727204623.74265: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204623.74413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204623.74460: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204623.76109: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204623.76176: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204623.76250: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp0713uhqt /root/.ansible/tmp/ansible-tmp-1727204623.6864965-46205-178182449002132/AnsiballZ_command.py <<< 44071 1727204623.76261: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204623.6864965-46205-178182449002132/AnsiballZ_command.py" <<< 44071 1727204623.76378: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp0713uhqt" to remote "/root/.ansible/tmp/ansible-tmp-1727204623.6864965-46205-178182449002132/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204623.6864965-46205-178182449002132/AnsiballZ_command.py" <<< 44071 1727204623.77575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204623.77715: stderr chunk (state=3): >>><<< 44071 1727204623.77728: stdout chunk (state=3): >>><<< 44071 1727204623.77768: done transferring module to remote 44071 1727204623.77792: _low_level_execute_command(): starting 44071 1727204623.77806: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204623.6864965-46205-178182449002132/ /root/.ansible/tmp/ansible-tmp-1727204623.6864965-46205-178182449002132/AnsiballZ_command.py && sleep 0' 44071 1727204623.78534: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204623.78658: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204623.78681: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204623.78704: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204623.78803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204623.80990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204623.81010: stdout chunk (state=3): >>><<< 44071 1727204623.81032: stderr chunk (state=3): >>><<< 44071 1727204623.81051: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204623.81110: _low_level_execute_command(): starting 44071 1727204623.81114: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204623.6864965-46205-178182449002132/AnsiballZ_command.py && sleep 0' 44071 1727204623.81859: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204623.81886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204623.81985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204623.82000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204623.82038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204623.82063: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204623.82082: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204623.82208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204623.99121: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:43.986210", "end": "2024-09-24 15:03:43.989743", "delta": "0:00:00.003533", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44071 1727204624.00699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204624.00767: stderr chunk (state=3): >>><<< 44071 1727204624.00772: stdout chunk (state=3): >>><<< 44071 1727204624.00791: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:43.986210", "end": "2024-09-24 15:03:43.989743", "delta": "0:00:00.003533", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
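The module's JSON return above is what ends up in the registered variable. Expressed as YAML (abridged; stdout_lines is derived by Ansible from stdout and is what the later set_fact consumes), the registered value is approximately:

    # Approximate shape of the registered result, based on the module JSON
    # above; keys such as changed/invocation/start/end/delta are omitted.
    _current_interfaces:
      rc: 0
      cmd: ["ls", "-1"]
      stdout: "bonding_masters\neth0\nlo"
      stdout_lines:
        - bonding_masters
        - eth0
        - lo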
44071 1727204624.00829: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204623.6864965-46205-178182449002132/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204624.00836: _low_level_execute_command(): starting 44071 1727204624.00842: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204623.6864965-46205-178182449002132/ > /dev/null 2>&1 && sleep 0' 44071 1727204624.01351: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204624.01355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204624.01358: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204624.01361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204624.01421: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204624.01428: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204624.01431: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204624.01497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204624.03432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204624.03481: stderr chunk (state=3): >>><<< 44071 1727204624.03484: stdout chunk (state=3): >>><<< 44071 1727204624.03501: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204624.03509: handler run complete 44071 1727204624.03531: Evaluated conditional (False): False 44071 1727204624.03540: attempt loop complete, returning result 44071 1727204624.03546: _execute() done 44071 1727204624.03549: dumping result to json 44071 1727204624.03555: done dumping result, returning 44071 1727204624.03565: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [127b8e07-fff9-c964-7471-000000000aad] 44071 1727204624.03572: sending task result for task 127b8e07-fff9-c964-7471-000000000aad 44071 1727204624.03687: done sending task result for task 127b8e07-fff9-c964-7471-000000000aad 44071 1727204624.03690: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003533", "end": "2024-09-24 15:03:43.989743", "rc": 0, "start": "2024-09-24 15:03:43.986210" } STDOUT: bonding_masters eth0 lo 44071 1727204624.03775: no more pending results, returning what we have 44071 1727204624.03779: results queue empty 44071 1727204624.03780: checking for any_errors_fatal 44071 1727204624.03782: done checking for any_errors_fatal 44071 1727204624.03782: checking for max_fail_percentage 44071 1727204624.03784: done checking for max_fail_percentage 44071 1727204624.03785: checking to see if all hosts have failed and the running result is not ok 44071 1727204624.03785: done checking to see if all hosts have failed 44071 1727204624.03786: getting the remaining hosts for this loop 44071 1727204624.03788: done getting the remaining hosts for this loop 44071 1727204624.03792: getting the next task for host managed-node2 44071 1727204624.03807: done getting next task for host managed-node2 44071 1727204624.03810: ^ task is: TASK: Set current_interfaces 44071 1727204624.03816: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204624.03819: getting variables 44071 1727204624.03821: in VariableManager get_vars() 44071 1727204624.03852: Calling all_inventory to load vars for managed-node2 44071 1727204624.03855: Calling groups_inventory to load vars for managed-node2 44071 1727204624.03859: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204624.03873: Calling all_plugins_play to load vars for managed-node2 44071 1727204624.03876: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204624.03879: Calling groups_plugins_play to load vars for managed-node2 44071 1727204624.04911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204624.06247: done with get_vars() 44071 1727204624.06272: done getting variables 44071 1727204624.06325: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:03:44 -0400 (0:00:00.436) 0:00:36.379 ***** 44071 1727204624.06352: entering _queue_task() for managed-node2/set_fact 44071 1727204624.06640: worker is 1 (out of 1 available) 44071 1727204624.06655: exiting _queue_task() for managed-node2/set_fact 44071 1727204624.06672: done queuing things up, now waiting for results queue to drain 44071 1727204624.06674: waiting for pending results... 
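The 'Set current_interfaces' task being queued here (get_current_interfaces.yml:9) only has to turn the registered command output into a fact; the value reported below (['bonding_masters', 'eth0', 'lo']) matches the command's stdout_lines, so a plausible sketch is:

    # Hedged sketch of get_current_interfaces.yml:9 -- reconstructed from the
    # fact value reported below, not from the playbook source.
    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"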
44071 1727204624.06879: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 44071 1727204624.06979: in run() - task 127b8e07-fff9-c964-7471-000000000aae 44071 1727204624.06994: variable 'ansible_search_path' from source: unknown 44071 1727204624.06998: variable 'ansible_search_path' from source: unknown 44071 1727204624.07038: calling self._execute() 44071 1727204624.07121: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204624.07127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204624.07136: variable 'omit' from source: magic vars 44071 1727204624.07464: variable 'ansible_distribution_major_version' from source: facts 44071 1727204624.07478: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204624.07485: variable 'omit' from source: magic vars 44071 1727204624.07524: variable 'omit' from source: magic vars 44071 1727204624.07619: variable '_current_interfaces' from source: set_fact 44071 1727204624.07678: variable 'omit' from source: magic vars 44071 1727204624.07713: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204624.07743: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204624.07764: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204624.07784: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204624.07794: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204624.07819: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204624.07822: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204624.07825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204624.07910: Set connection var ansible_connection to ssh 44071 1727204624.07917: Set connection var ansible_timeout to 10 44071 1727204624.07922: Set connection var ansible_pipelining to False 44071 1727204624.07928: Set connection var ansible_shell_type to sh 44071 1727204624.07934: Set connection var ansible_shell_executable to /bin/sh 44071 1727204624.07940: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204624.07963: variable 'ansible_shell_executable' from source: unknown 44071 1727204624.07969: variable 'ansible_connection' from source: unknown 44071 1727204624.07971: variable 'ansible_module_compression' from source: unknown 44071 1727204624.07974: variable 'ansible_shell_type' from source: unknown 44071 1727204624.07976: variable 'ansible_shell_executable' from source: unknown 44071 1727204624.07978: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204624.07981: variable 'ansible_pipelining' from source: unknown 44071 1727204624.07987: variable 'ansible_timeout' from source: unknown 44071 1727204624.07989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204624.08108: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 44071 1727204624.08122: variable 'omit' from source: magic vars 44071 1727204624.08127: starting attempt loop 44071 1727204624.08130: running the handler 44071 1727204624.08141: handler run complete 44071 1727204624.08153: attempt loop complete, returning result 44071 1727204624.08156: _execute() done 44071 1727204624.08159: dumping result to json 44071 1727204624.08162: done dumping result, returning 44071 1727204624.08171: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [127b8e07-fff9-c964-7471-000000000aae] 44071 1727204624.08174: sending task result for task 127b8e07-fff9-c964-7471-000000000aae 44071 1727204624.08270: done sending task result for task 127b8e07-fff9-c964-7471-000000000aae 44071 1727204624.08273: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 44071 1727204624.08332: no more pending results, returning what we have 44071 1727204624.08336: results queue empty 44071 1727204624.08336: checking for any_errors_fatal 44071 1727204624.08352: done checking for any_errors_fatal 44071 1727204624.08353: checking for max_fail_percentage 44071 1727204624.08355: done checking for max_fail_percentage 44071 1727204624.08356: checking to see if all hosts have failed and the running result is not ok 44071 1727204624.08356: done checking to see if all hosts have failed 44071 1727204624.08357: getting the remaining hosts for this loop 44071 1727204624.08359: done getting the remaining hosts for this loop 44071 1727204624.08364: getting the next task for host managed-node2 44071 1727204624.08382: done getting next task for host managed-node2 44071 1727204624.08385: ^ task is: TASK: Show current_interfaces 44071 1727204624.08390: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204624.08394: getting variables 44071 1727204624.08396: in VariableManager get_vars() 44071 1727204624.08429: Calling all_inventory to load vars for managed-node2 44071 1727204624.08432: Calling groups_inventory to load vars for managed-node2 44071 1727204624.08435: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204624.08447: Calling all_plugins_play to load vars for managed-node2 44071 1727204624.08450: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204624.08453: Calling groups_plugins_play to load vars for managed-node2 44071 1727204624.09480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204624.10707: done with get_vars() 44071 1727204624.10737: done getting variables 44071 1727204624.10794: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:03:44 -0400 (0:00:00.044) 0:00:36.424 ***** 44071 1727204624.10820: entering _queue_task() for managed-node2/debug 44071 1727204624.11117: worker is 1 (out of 1 available) 44071 1727204624.11133: exiting _queue_task() for managed-node2/debug 44071 1727204624.11147: done queuing things up, now waiting for results queue to drain 44071 1727204624.11149: waiting for pending results... 44071 1727204624.11356: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 44071 1727204624.11443: in run() - task 127b8e07-fff9-c964-7471-000000000a73 44071 1727204624.11460: variable 'ansible_search_path' from source: unknown 44071 1727204624.11463: variable 'ansible_search_path' from source: unknown 44071 1727204624.11502: calling self._execute() 44071 1727204624.11588: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204624.11592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204624.11604: variable 'omit' from source: magic vars 44071 1727204624.11931: variable 'ansible_distribution_major_version' from source: facts 44071 1727204624.11943: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204624.11953: variable 'omit' from source: magic vars 44071 1727204624.11987: variable 'omit' from source: magic vars 44071 1727204624.12071: variable 'current_interfaces' from source: set_fact 44071 1727204624.12095: variable 'omit' from source: magic vars 44071 1727204624.12131: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204624.12168: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204624.12186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204624.12201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204624.12212: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204624.12237: 
variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204624.12240: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204624.12249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204624.12328: Set connection var ansible_connection to ssh 44071 1727204624.12334: Set connection var ansible_timeout to 10 44071 1727204624.12340: Set connection var ansible_pipelining to False 44071 1727204624.12348: Set connection var ansible_shell_type to sh 44071 1727204624.12353: Set connection var ansible_shell_executable to /bin/sh 44071 1727204624.12365: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204624.12386: variable 'ansible_shell_executable' from source: unknown 44071 1727204624.12390: variable 'ansible_connection' from source: unknown 44071 1727204624.12393: variable 'ansible_module_compression' from source: unknown 44071 1727204624.12396: variable 'ansible_shell_type' from source: unknown 44071 1727204624.12399: variable 'ansible_shell_executable' from source: unknown 44071 1727204624.12402: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204624.12404: variable 'ansible_pipelining' from source: unknown 44071 1727204624.12407: variable 'ansible_timeout' from source: unknown 44071 1727204624.12410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204624.12532: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204624.12544: variable 'omit' from source: magic vars 44071 1727204624.12550: starting attempt loop 44071 1727204624.12553: running the handler 44071 1727204624.12600: handler run complete 44071 1727204624.12613: attempt loop complete, returning result 44071 1727204624.12616: _execute() done 44071 1727204624.12619: dumping result to json 44071 1727204624.12621: done dumping result, returning 44071 1727204624.12629: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [127b8e07-fff9-c964-7471-000000000a73] 44071 1727204624.12634: sending task result for task 127b8e07-fff9-c964-7471-000000000a73 44071 1727204624.12734: done sending task result for task 127b8e07-fff9-c964-7471-000000000a73 44071 1727204624.12737: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 44071 1727204624.12794: no more pending results, returning what we have 44071 1727204624.12797: results queue empty 44071 1727204624.12798: checking for any_errors_fatal 44071 1727204624.12809: done checking for any_errors_fatal 44071 1727204624.12810: checking for max_fail_percentage 44071 1727204624.12811: done checking for max_fail_percentage 44071 1727204624.12812: checking to see if all hosts have failed and the running result is not ok 44071 1727204624.12812: done checking to see if all hosts have failed 44071 1727204624.12813: getting the remaining hosts for this loop 44071 1727204624.12815: done getting the remaining hosts for this loop 44071 1727204624.12820: getting the next task for host managed-node2 44071 1727204624.12831: done getting next task for host managed-node2 44071 1727204624.12834: ^ task is: TASK: Setup 44071 1727204624.12838: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204624.12842: getting variables 44071 1727204624.12843: in VariableManager get_vars() 44071 1727204624.12888: Calling all_inventory to load vars for managed-node2 44071 1727204624.12891: Calling groups_inventory to load vars for managed-node2 44071 1727204624.12895: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204624.12907: Calling all_plugins_play to load vars for managed-node2 44071 1727204624.12909: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204624.12912: Calling groups_plugins_play to load vars for managed-node2 44071 1727204624.14113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204624.15353: done with get_vars() 44071 1727204624.15395: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Tuesday 24 September 2024 15:03:44 -0400 (0:00:00.046) 0:00:36.471 ***** 44071 1727204624.15497: entering _queue_task() for managed-node2/include_tasks 44071 1727204624.15895: worker is 1 (out of 1 available) 44071 1727204624.15911: exiting _queue_task() for managed-node2/include_tasks 44071 1727204624.15926: done queuing things up, now waiting for results queue to drain 44071 1727204624.15927: waiting for pending results... 
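The 'Setup' task (run_test.yml:24) is another include_tasks, this time looping over the lsr_setup include parameter; the per-item processing below shows it resolving to tasks/create_bridge_profile.yml for this test. A hedged sketch of that pattern:

    # Hypothetical reconstruction of run_test.yml:24; the loop variable name
    # comes from the "variable 'lsr_setup' from source: include params"
    # entries below, the item value from the include result.
    - name: Setup
      include_tasks: "{{ item }}"
      loop: "{{ lsr_setup }}"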
44071 1727204624.16392: running TaskExecutor() for managed-node2/TASK: Setup 44071 1727204624.16398: in run() - task 127b8e07-fff9-c964-7471-000000000a4c 44071 1727204624.16401: variable 'ansible_search_path' from source: unknown 44071 1727204624.16410: variable 'ansible_search_path' from source: unknown 44071 1727204624.16472: variable 'lsr_setup' from source: include params 44071 1727204624.16711: variable 'lsr_setup' from source: include params 44071 1727204624.16775: variable 'omit' from source: magic vars 44071 1727204624.16900: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204624.16910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204624.16923: variable 'omit' from source: magic vars 44071 1727204624.17123: variable 'ansible_distribution_major_version' from source: facts 44071 1727204624.17133: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204624.17138: variable 'item' from source: unknown 44071 1727204624.17196: variable 'item' from source: unknown 44071 1727204624.17226: variable 'item' from source: unknown 44071 1727204624.17277: variable 'item' from source: unknown 44071 1727204624.17421: dumping result to json 44071 1727204624.17424: done dumping result, returning 44071 1727204624.17426: done running TaskExecutor() for managed-node2/TASK: Setup [127b8e07-fff9-c964-7471-000000000a4c] 44071 1727204624.17428: sending task result for task 127b8e07-fff9-c964-7471-000000000a4c 44071 1727204624.17501: no more pending results, returning what we have 44071 1727204624.17507: in VariableManager get_vars() 44071 1727204624.17550: Calling all_inventory to load vars for managed-node2 44071 1727204624.17553: Calling groups_inventory to load vars for managed-node2 44071 1727204624.17556: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204624.17571: Calling all_plugins_play to load vars for managed-node2 44071 1727204624.17574: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204624.17584: Calling groups_plugins_play to load vars for managed-node2 44071 1727204624.18186: done sending task result for task 127b8e07-fff9-c964-7471-000000000a4c 44071 1727204624.18191: WORKER PROCESS EXITING 44071 1727204624.18716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204624.21249: done with get_vars() 44071 1727204624.21287: variable 'ansible_search_path' from source: unknown 44071 1727204624.21289: variable 'ansible_search_path' from source: unknown 44071 1727204624.21338: we have included files to process 44071 1727204624.21342: generating all_blocks data 44071 1727204624.21344: done generating all_blocks data 44071 1727204624.21349: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 44071 1727204624.21350: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 44071 1727204624.21354: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 44071 1727204624.21634: done processing included file 44071 1727204624.21636: iterating over new_blocks loaded from include file 44071 1727204624.21638: in VariableManager get_vars() 44071 1727204624.21658: done with get_vars() 44071 1727204624.21661: filtering new block on 
tags 44071 1727204624.21702: done filtering new block on tags 44071 1727204624.21704: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed-node2 => (item=tasks/create_bridge_profile.yml) 44071 1727204624.21710: extending task lists for all hosts with included blocks 44071 1727204624.22387: done extending task lists 44071 1727204624.22390: done processing included files 44071 1727204624.22390: results queue empty 44071 1727204624.22391: checking for any_errors_fatal 44071 1727204624.22396: done checking for any_errors_fatal 44071 1727204624.22397: checking for max_fail_percentage 44071 1727204624.22398: done checking for max_fail_percentage 44071 1727204624.22399: checking to see if all hosts have failed and the running result is not ok 44071 1727204624.22399: done checking to see if all hosts have failed 44071 1727204624.22400: getting the remaining hosts for this loop 44071 1727204624.22402: done getting the remaining hosts for this loop 44071 1727204624.22405: getting the next task for host managed-node2 44071 1727204624.22410: done getting next task for host managed-node2 44071 1727204624.22412: ^ task is: TASK: Include network role 44071 1727204624.22415: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204624.22418: getting variables 44071 1727204624.22419: in VariableManager get_vars() 44071 1727204624.22434: Calling all_inventory to load vars for managed-node2 44071 1727204624.22437: Calling groups_inventory to load vars for managed-node2 44071 1727204624.22441: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204624.22450: Calling all_plugins_play to load vars for managed-node2 44071 1727204624.22452: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204624.22455: Calling groups_plugins_play to load vars for managed-node2 44071 1727204624.24080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204624.26504: done with get_vars() 44071 1727204624.26548: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Tuesday 24 September 2024 15:03:44 -0400 (0:00:00.111) 0:00:36.582 ***** 44071 1727204624.26637: entering _queue_task() for managed-node2/include_role 44071 1727204624.27037: worker is 1 (out of 1 available) 44071 1727204624.27055: exiting _queue_task() for managed-node2/include_role 44071 1727204624.27075: done queuing things up, now waiting for results queue to drain 44071 1727204624.27077: waiting for pending results... 44071 1727204624.27887: running TaskExecutor() for managed-node2/TASK: Include network role 44071 1727204624.28054: in run() - task 127b8e07-fff9-c964-7471-000000000ad1 44071 1727204624.28142: variable 'ansible_search_path' from source: unknown 44071 1727204624.28154: variable 'ansible_search_path' from source: unknown 44071 1727204624.28206: calling self._execute() 44071 1727204624.28500: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204624.28513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204624.28528: variable 'omit' from source: magic vars 44071 1727204624.29647: variable 'ansible_distribution_major_version' from source: facts 44071 1727204624.29652: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204624.29654: _execute() done 44071 1727204624.29657: dumping result to json 44071 1727204624.29660: done dumping result, returning 44071 1727204624.29663: done running TaskExecutor() for managed-node2/TASK: Include network role [127b8e07-fff9-c964-7471-000000000ad1] 44071 1727204624.29666: sending task result for task 127b8e07-fff9-c964-7471-000000000ad1 44071 1727204624.29994: done sending task result for task 127b8e07-fff9-c964-7471-000000000ad1 44071 1727204624.29998: WORKER PROCESS EXITING 44071 1727204624.30036: no more pending results, returning what we have 44071 1727204624.30042: in VariableManager get_vars() 44071 1727204624.30092: Calling all_inventory to load vars for managed-node2 44071 1727204624.30095: Calling groups_inventory to load vars for managed-node2 44071 1727204624.30099: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204624.30179: Calling all_plugins_play to load vars for managed-node2 44071 1727204624.30184: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204624.30188: Calling groups_plugins_play to load vars for managed-node2 44071 1727204624.32464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 
1727204624.35958: done with get_vars() 44071 1727204624.35996: variable 'ansible_search_path' from source: unknown 44071 1727204624.35998: variable 'ansible_search_path' from source: unknown 44071 1727204624.36232: variable 'omit' from source: magic vars 44071 1727204624.36284: variable 'omit' from source: magic vars 44071 1727204624.36303: variable 'omit' from source: magic vars 44071 1727204624.36307: we have included files to process 44071 1727204624.36308: generating all_blocks data 44071 1727204624.36310: done generating all_blocks data 44071 1727204624.36311: processing included file: fedora.linux_system_roles.network 44071 1727204624.36334: in VariableManager get_vars() 44071 1727204624.36352: done with get_vars() 44071 1727204624.36390: in VariableManager get_vars() 44071 1727204624.36411: done with get_vars() 44071 1727204624.36456: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 44071 1727204624.36602: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 44071 1727204624.36698: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 44071 1727204624.37251: in VariableManager get_vars() 44071 1727204624.37277: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204624.38781: iterating over new_blocks loaded from include file 44071 1727204624.38783: in VariableManager get_vars() 44071 1727204624.38799: done with get_vars() 44071 1727204624.38800: filtering new block on tags 44071 1727204624.38992: done filtering new block on tags 44071 1727204624.38996: in VariableManager get_vars() 44071 1727204624.39007: done with get_vars() 44071 1727204624.39008: filtering new block on tags 44071 1727204624.39020: done filtering new block on tags 44071 1727204624.39021: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 44071 1727204624.39025: extending task lists for all hosts with included blocks 44071 1727204624.39139: done extending task lists 44071 1727204624.39140: done processing included files 44071 1727204624.39140: results queue empty 44071 1727204624.39141: checking for any_errors_fatal 44071 1727204624.39145: done checking for any_errors_fatal 44071 1727204624.39145: checking for max_fail_percentage 44071 1727204624.39146: done checking for max_fail_percentage 44071 1727204624.39147: checking to see if all hosts have failed and the running result is not ok 44071 1727204624.39148: done checking to see if all hosts have failed 44071 1727204624.39148: getting the remaining hosts for this loop 44071 1727204624.39149: done getting the remaining hosts for this loop 44071 1727204624.39151: getting the next task for host managed-node2 44071 1727204624.39154: done getting next task for host managed-node2 44071 1727204624.39156: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204624.39159: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204624.39169: getting variables 44071 1727204624.39170: in VariableManager get_vars() 44071 1727204624.39180: Calling all_inventory to load vars for managed-node2 44071 1727204624.39182: Calling groups_inventory to load vars for managed-node2 44071 1727204624.39183: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204624.39188: Calling all_plugins_play to load vars for managed-node2 44071 1727204624.39190: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204624.39192: Calling groups_plugins_play to load vars for managed-node2 44071 1727204624.40724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204624.41960: done with get_vars() 44071 1727204624.41993: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:03:44 -0400 (0:00:00.154) 0:00:36.737 ***** 44071 1727204624.42068: entering _queue_task() for managed-node2/include_tasks 44071 1727204624.42381: worker is 1 (out of 1 available) 44071 1727204624.42396: exiting _queue_task() for managed-node2/include_tasks 44071 1727204624.42410: done queuing things up, now waiting for results queue to drain 44071 1727204624.42412: waiting for pending results... 
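Almost every task in this excerpt is gated on the same when-condition, and the repeated "Evaluated conditional (ansible_distribution_major_version != '6'): True" lines show the TaskExecutor templating that expression against the gathered facts before deciding whether to run or skip the task. Ansible evaluates it with its own Templar, which layers collection filters and tests on top of Jinja2; the snippet below is a plain-jinja2 approximation with a made-up fact value, just to show what that log line amounts to.

    import jinja2

    facts = {"ansible_distribution_major_version": "40"}   # illustrative value, not read from this run

    env = jinja2.Environment()
    check = env.compile_expression("ansible_distribution_major_version != '6'")

    print(check(**facts))                                   # True  -> the task is allowed to run
    print(check(ansible_distribution_major_version="6"))    # False -> the task would be skipped

A False result is what produces the "when evaluation is False, skipping this task" lines seen further down.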
44071 1727204624.42611: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204624.42721: in run() - task 127b8e07-fff9-c964-7471-000000000b33 44071 1727204624.42736: variable 'ansible_search_path' from source: unknown 44071 1727204624.42743: variable 'ansible_search_path' from source: unknown 44071 1727204624.42789: calling self._execute() 44071 1727204624.43072: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204624.43076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204624.43079: variable 'omit' from source: magic vars 44071 1727204624.43330: variable 'ansible_distribution_major_version' from source: facts 44071 1727204624.43352: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204624.43364: _execute() done 44071 1727204624.43376: dumping result to json 44071 1727204624.43385: done dumping result, returning 44071 1727204624.43397: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-c964-7471-000000000b33] 44071 1727204624.43407: sending task result for task 127b8e07-fff9-c964-7471-000000000b33 44071 1727204624.43521: done sending task result for task 127b8e07-fff9-c964-7471-000000000b33 44071 1727204624.43588: no more pending results, returning what we have 44071 1727204624.43596: in VariableManager get_vars() 44071 1727204624.43767: Calling all_inventory to load vars for managed-node2 44071 1727204624.43778: Calling groups_inventory to load vars for managed-node2 44071 1727204624.43781: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204624.43800: Calling all_plugins_play to load vars for managed-node2 44071 1727204624.43804: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204624.43809: Calling groups_plugins_play to load vars for managed-node2 44071 1727204624.44351: WORKER PROCESS EXITING 44071 1727204624.45413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204624.46886: done with get_vars() 44071 1727204624.46927: variable 'ansible_search_path' from source: unknown 44071 1727204624.46928: variable 'ansible_search_path' from source: unknown 44071 1727204624.46977: we have included files to process 44071 1727204624.46979: generating all_blocks data 44071 1727204624.46981: done generating all_blocks data 44071 1727204624.46985: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204624.46986: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204624.46990: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204624.47589: done processing included file 44071 1727204624.47591: iterating over new_blocks loaded from include file 44071 1727204624.47593: in VariableManager get_vars() 44071 1727204624.47623: done with get_vars() 44071 1727204624.47625: filtering new block on tags 44071 1727204624.47659: done filtering new block on tags 44071 1727204624.47663: in VariableManager get_vars() 44071 1727204624.47690: done with get_vars() 44071 1727204624.47693: filtering new block on tags 44071 1727204624.47742: done filtering new block on tags 44071 1727204624.47745: in 
VariableManager get_vars() 44071 1727204624.47772: done with get_vars() 44071 1727204624.47774: filtering new block on tags 44071 1727204624.47822: done filtering new block on tags 44071 1727204624.47825: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 44071 1727204624.47831: extending task lists for all hosts with included blocks 44071 1727204624.49343: done extending task lists 44071 1727204624.49345: done processing included files 44071 1727204624.49346: results queue empty 44071 1727204624.49347: checking for any_errors_fatal 44071 1727204624.49349: done checking for any_errors_fatal 44071 1727204624.49350: checking for max_fail_percentage 44071 1727204624.49351: done checking for max_fail_percentage 44071 1727204624.49351: checking to see if all hosts have failed and the running result is not ok 44071 1727204624.49352: done checking to see if all hosts have failed 44071 1727204624.49352: getting the remaining hosts for this loop 44071 1727204624.49353: done getting the remaining hosts for this loop 44071 1727204624.49355: getting the next task for host managed-node2 44071 1727204624.49359: done getting next task for host managed-node2 44071 1727204624.49362: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204624.49366: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204624.49376: getting variables 44071 1727204624.49377: in VariableManager get_vars() 44071 1727204624.49391: Calling all_inventory to load vars for managed-node2 44071 1727204624.49392: Calling groups_inventory to load vars for managed-node2 44071 1727204624.49394: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204624.49398: Calling all_plugins_play to load vars for managed-node2 44071 1727204624.49400: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204624.49406: Calling groups_plugins_play to load vars for managed-node2 44071 1727204624.50285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204624.52027: done with get_vars() 44071 1727204624.52058: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:03:44 -0400 (0:00:00.100) 0:00:36.837 ***** 44071 1727204624.52128: entering _queue_task() for managed-node2/setup 44071 1727204624.52437: worker is 1 (out of 1 available) 44071 1727204624.52453: exiting _queue_task() for managed-node2/setup 44071 1727204624.52470: done queuing things up, now waiting for results queue to drain 44071 1727204624.52472: waiting for pending results... 44071 1727204624.52682: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204624.52795: in run() - task 127b8e07-fff9-c964-7471-000000000b90 44071 1727204624.52810: variable 'ansible_search_path' from source: unknown 44071 1727204624.52816: variable 'ansible_search_path' from source: unknown 44071 1727204624.52855: calling self._execute() 44071 1727204624.52934: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204624.52941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204624.52953: variable 'omit' from source: magic vars 44071 1727204624.53277: variable 'ansible_distribution_major_version' from source: facts 44071 1727204624.53288: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204624.53468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204624.55325: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204624.55380: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204624.55410: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204624.55442: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204624.55467: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204624.55537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204624.55561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 44071 1727204624.55584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204624.55613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204624.55625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204624.55680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204624.55698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204624.55716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204624.55743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204624.55759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204624.55893: variable '__network_required_facts' from source: role '' defaults 44071 1727204624.55902: variable 'ansible_facts' from source: unknown 44071 1727204624.56505: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 44071 1727204624.56510: when evaluation is False, skipping this task 44071 1727204624.56513: _execute() done 44071 1727204624.56516: dumping result to json 44071 1727204624.56521: done dumping result, returning 44071 1727204624.56524: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-c964-7471-000000000b90] 44071 1727204624.56527: sending task result for task 127b8e07-fff9-c964-7471-000000000b90 44071 1727204624.56630: done sending task result for task 127b8e07-fff9-c964-7471-000000000b90 44071 1727204624.56634: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204624.56692: no more pending results, returning what we have 44071 1727204624.56697: results queue empty 44071 1727204624.56698: checking for any_errors_fatal 44071 1727204624.56699: done checking for any_errors_fatal 44071 1727204624.56700: checking for max_fail_percentage 44071 1727204624.56702: done checking for max_fail_percentage 44071 1727204624.56703: checking to see if all hosts have failed and the running result is not ok 44071 1727204624.56703: done checking to see if all hosts have failed 44071 1727204624.56704: getting the remaining hosts for this loop 44071 1727204624.56706: done getting the remaining hosts for 
this loop 44071 1727204624.56711: getting the next task for host managed-node2 44071 1727204624.56723: done getting next task for host managed-node2 44071 1727204624.56727: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204624.56734: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204624.56762: getting variables 44071 1727204624.56764: in VariableManager get_vars() 44071 1727204624.56805: Calling all_inventory to load vars for managed-node2 44071 1727204624.56808: Calling groups_inventory to load vars for managed-node2 44071 1727204624.56810: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204624.56822: Calling all_plugins_play to load vars for managed-node2 44071 1727204624.56826: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204624.56835: Calling groups_plugins_play to load vars for managed-node2 44071 1727204624.58094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204624.59326: done with get_vars() 44071 1727204624.59359: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:03:44 -0400 (0:00:00.073) 0:00:36.910 ***** 44071 1727204624.59451: entering _queue_task() for managed-node2/stat 44071 1727204624.59752: worker is 1 (out of 1 available) 44071 1727204624.59769: exiting _queue_task() for managed-node2/stat 44071 1727204624.59784: done queuing things up, now waiting for results queue to drain 44071 1727204624.59786: waiting for pending results... 
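The skip just logged for "Ensure ansible_facts used by role are present" comes from the role's guard, __network_required_facts | difference(ansible_facts.keys() | list) | length > 0, evaluating to False: every fact the role declares it needs is already present in the cached facts for managed-node2, so there is nothing to gather. In plain Python the same test is a set difference; the fact names below are illustrative stand-ins, not the role's actual __network_required_facts default.

    # Illustrative stand-ins; the real list comes from the role's defaults
    required_facts = ["distribution", "distribution_major_version", "os_family"]

    # Keys already present in the cached ansible_facts for the host (also illustrative)
    ansible_facts = {
        "distribution": "Fedora",
        "distribution_major_version": "40",
        "os_family": "RedHat",
        "default_ipv4": {},
    }

    missing = set(required_facts) - set(ansible_facts)   # the difference() filter, in effect
    run_setup = len(missing) > 0                         # ... | length > 0
    print(run_setup)   # False -> "when evaluation is False, skipping this task"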
44071 1727204624.59987: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204624.60111: in run() - task 127b8e07-fff9-c964-7471-000000000b92 44071 1727204624.60129: variable 'ansible_search_path' from source: unknown 44071 1727204624.60133: variable 'ansible_search_path' from source: unknown 44071 1727204624.60167: calling self._execute() 44071 1727204624.60252: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204624.60471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204624.60476: variable 'omit' from source: magic vars 44071 1727204624.60736: variable 'ansible_distribution_major_version' from source: facts 44071 1727204624.60759: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204624.60978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204624.61319: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204624.61388: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204624.61430: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204624.61483: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204624.61597: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204624.61631: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204624.61671: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204624.61715: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204624.61838: variable '__network_is_ostree' from source: set_fact 44071 1727204624.61855: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204624.61863: when evaluation is False, skipping this task 44071 1727204624.61874: _execute() done 44071 1727204624.61883: dumping result to json 44071 1727204624.61891: done dumping result, returning 44071 1727204624.61914: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-c964-7471-000000000b92] 44071 1727204624.61925: sending task result for task 127b8e07-fff9-c964-7471-000000000b92 44071 1727204624.62085: done sending task result for task 127b8e07-fff9-c964-7471-000000000b92 44071 1727204624.62090: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204624.62153: no more pending results, returning what we have 44071 1727204624.62159: results queue empty 44071 1727204624.62160: checking for any_errors_fatal 44071 1727204624.62172: done checking for any_errors_fatal 44071 1727204624.62173: checking for 
max_fail_percentage 44071 1727204624.62175: done checking for max_fail_percentage 44071 1727204624.62175: checking to see if all hosts have failed and the running result is not ok 44071 1727204624.62176: done checking to see if all hosts have failed 44071 1727204624.62177: getting the remaining hosts for this loop 44071 1727204624.62179: done getting the remaining hosts for this loop 44071 1727204624.62183: getting the next task for host managed-node2 44071 1727204624.62192: done getting next task for host managed-node2 44071 1727204624.62196: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204624.62203: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204624.62223: getting variables 44071 1727204624.62225: in VariableManager get_vars() 44071 1727204624.62271: Calling all_inventory to load vars for managed-node2 44071 1727204624.62274: Calling groups_inventory to load vars for managed-node2 44071 1727204624.62276: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204624.62287: Calling all_plugins_play to load vars for managed-node2 44071 1727204624.62289: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204624.62292: Calling groups_plugins_play to load vars for managed-node2 44071 1727204624.63418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204624.65243: done with get_vars() 44071 1727204624.65292: done getting variables 44071 1727204624.65368: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:03:44 -0400 (0:00:00.059) 0:00:36.970 ***** 44071 1727204624.65415: entering _queue_task() for managed-node2/set_fact 44071 1727204624.65831: worker is 1 (out of 1 available) 44071 1727204624.65849: exiting _queue_task() for managed-node2/set_fact 44071 1727204624.65870: done queuing things up, now waiting for results queue to drain 44071 1727204624.65872: waiting for pending results... 44071 1727204624.66383: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204624.66440: in run() - task 127b8e07-fff9-c964-7471-000000000b93 44071 1727204624.66468: variable 'ansible_search_path' from source: unknown 44071 1727204624.66478: variable 'ansible_search_path' from source: unknown 44071 1727204624.66525: calling self._execute() 44071 1727204624.66628: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204624.66641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204624.66772: variable 'omit' from source: magic vars 44071 1727204624.67063: variable 'ansible_distribution_major_version' from source: facts 44071 1727204624.67086: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204624.67273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204624.67567: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204624.67625: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204624.67669: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204624.67710: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204624.67809: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204624.67840: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204624.67875: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204624.67908: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204624.68016: variable '__network_is_ostree' from source: set_fact 44071 1727204624.68030: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204624.68038: when evaluation is False, skipping this task 44071 1727204624.68045: _execute() done 44071 1727204624.68053: dumping result to json 44071 1727204624.68060: done dumping result, returning 44071 1727204624.68074: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-c964-7471-000000000b93] 44071 1727204624.68084: sending task result for task 127b8e07-fff9-c964-7471-000000000b93 skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204624.68260: no more pending results, returning what we have 44071 1727204624.68265: results queue empty 44071 1727204624.68268: checking for any_errors_fatal 44071 1727204624.68277: done checking for any_errors_fatal 44071 1727204624.68277: checking for max_fail_percentage 44071 1727204624.68279: done checking for max_fail_percentage 44071 1727204624.68280: checking to see if all hosts have failed and the running result is not ok 44071 1727204624.68280: done checking to see if all hosts have failed 44071 1727204624.68281: getting the remaining hosts for this loop 44071 1727204624.68283: done getting the remaining hosts for this loop 44071 1727204624.68288: getting the next task for host managed-node2 44071 1727204624.68300: done getting next task for host managed-node2 44071 1727204624.68304: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204624.68311: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204624.68333: getting variables 44071 1727204624.68335: in VariableManager get_vars() 44071 1727204624.68585: Calling all_inventory to load vars for managed-node2 44071 1727204624.68588: Calling groups_inventory to load vars for managed-node2 44071 1727204624.68591: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204624.68599: done sending task result for task 127b8e07-fff9-c964-7471-000000000b93 44071 1727204624.68602: WORKER PROCESS EXITING 44071 1727204624.68613: Calling all_plugins_play to load vars for managed-node2 44071 1727204624.68617: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204624.68620: Calling groups_plugins_play to load vars for managed-node2 44071 1727204624.70313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204624.72483: done with get_vars() 44071 1727204624.72523: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:03:44 -0400 (0:00:00.072) 0:00:37.042 ***** 44071 1727204624.72635: entering _queue_task() for managed-node2/service_facts 44071 1727204624.73043: worker is 1 (out of 1 available) 44071 1727204624.73057: exiting _queue_task() for managed-node2/service_facts 44071 1727204624.73078: done queuing things up, now waiting for results queue to drain 44071 1727204624.73079: waiting for pending results... 
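The "Check which services are running" task below is the first in this excerpt that actually touches the managed host, so the log switches to _low_level_execute_command(): Ansible creates a per-task temp directory on the remote side, copies the AnsiballZ-wrapped service_facts module over SFTP, marks it executable, runs it with the remote /usr/bin/python3.12, and reuses one multiplexed OpenSSH connection throughout (the "auto-mux: Trying existing master at '/root/.ansible/cp/...'" lines). The sketch below reproduces that sequence with plain subprocess as a rough equivalent, not what the ssh connection plugin literally does: the root user is inferred from the /root home directory in the output, the address is taken from the SSH debug lines, and piping the payload through cat stands in for the SFTP transfer.

    import pathlib
    import subprocess
    import time

    host = "root@10.31.47.73"        # user inferred; address from the debug output below
    control = ["-o", "ControlMaster=auto",
               "-o", "ControlPersist=60s",
               "-o", "ControlPath=~/.ansible/cp/%C"]   # one shared, multiplexed connection

    def ssh(cmd: str, stdin: bytes | None = None) -> str:
        """Run a single remote shell command over the shared ControlMaster socket."""
        done = subprocess.run(["ssh", *control, host, cmd],
                              input=stdin, capture_output=True, check=True)
        return done.stdout.decode()

    # 1. Per-task remote temp dir (same shell snippet as in the log below)
    tmp = f"/root/.ansible/tmp/ansible-tmp-{time.time()}-sketch"
    ssh(f"umask 77 && mkdir -p {tmp}")

    # 2. Ship the AnsiballZ payload (Ansible uses SFTP here; cat is a stand-in)
    payload = pathlib.Path("AnsiballZ_service_facts.py").read_bytes()
    ssh(f"cat > {tmp}/AnsiballZ_service_facts.py", stdin=payload)

    # 3. chmod, execute with the remote interpreter, then clean up
    ssh(f"chmod u+x {tmp}/ {tmp}/AnsiballZ_service_facts.py")
    out = ssh(f"/usr/bin/python3.12 {tmp}/AnsiballZ_service_facts.py; rm -rf {tmp}")
    print(out[:80])   # the module prints a JSON blob: {"ansible_facts": {"services": ...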
44071 1727204624.73425: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204624.73628: in run() - task 127b8e07-fff9-c964-7471-000000000b95 44071 1727204624.73737: variable 'ansible_search_path' from source: unknown 44071 1727204624.73784: variable 'ansible_search_path' from source: unknown 44071 1727204624.73885: calling self._execute() 44071 1727204624.74136: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204624.74152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204624.74174: variable 'omit' from source: magic vars 44071 1727204624.75052: variable 'ansible_distribution_major_version' from source: facts 44071 1727204624.75080: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204624.75122: variable 'omit' from source: magic vars 44071 1727204624.75234: variable 'omit' from source: magic vars 44071 1727204624.75287: variable 'omit' from source: magic vars 44071 1727204624.75350: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204624.75396: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204624.75425: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204624.75451: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204624.75479: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204624.75519: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204624.75528: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204624.75538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204624.75648: Set connection var ansible_connection to ssh 44071 1727204624.75683: Set connection var ansible_timeout to 10 44071 1727204624.75686: Set connection var ansible_pipelining to False 44071 1727204624.75689: Set connection var ansible_shell_type to sh 44071 1727204624.75692: Set connection var ansible_shell_executable to /bin/sh 44071 1727204624.75791: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204624.75794: variable 'ansible_shell_executable' from source: unknown 44071 1727204624.75797: variable 'ansible_connection' from source: unknown 44071 1727204624.75800: variable 'ansible_module_compression' from source: unknown 44071 1727204624.75803: variable 'ansible_shell_type' from source: unknown 44071 1727204624.75805: variable 'ansible_shell_executable' from source: unknown 44071 1727204624.75806: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204624.75809: variable 'ansible_pipelining' from source: unknown 44071 1727204624.75810: variable 'ansible_timeout' from source: unknown 44071 1727204624.75812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204624.75995: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204624.76018: variable 'omit' from source: magic vars 44071 
1727204624.76028: starting attempt loop 44071 1727204624.76036: running the handler 44071 1727204624.76056: _low_level_execute_command(): starting 44071 1727204624.76073: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204624.76911: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204624.76958: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204624.76988: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204624.77100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204624.77322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204624.79034: stdout chunk (state=3): >>>/root <<< 44071 1727204624.79147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204624.79338: stderr chunk (state=3): >>><<< 44071 1727204624.79476: stdout chunk (state=3): >>><<< 44071 1727204624.79495: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204624.79515: _low_level_execute_command(): starting 44071 1727204624.79526: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204624.795013-46255-261308447204059 `" && echo ansible-tmp-1727204624.795013-46255-261308447204059="` echo 
/root/.ansible/tmp/ansible-tmp-1727204624.795013-46255-261308447204059 `" ) && sleep 0' 44071 1727204624.80736: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204624.80759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204624.80904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204624.80907: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204624.80918: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204624.80920: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204624.80974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204624.80978: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204624.81110: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204624.81201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204624.83199: stdout chunk (state=3): >>>ansible-tmp-1727204624.795013-46255-261308447204059=/root/.ansible/tmp/ansible-tmp-1727204624.795013-46255-261308447204059 <<< 44071 1727204624.83407: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204624.83772: stderr chunk (state=3): >>><<< 44071 1727204624.83776: stdout chunk (state=3): >>><<< 44071 1727204624.83778: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204624.795013-46255-261308447204059=/root/.ansible/tmp/ansible-tmp-1727204624.795013-46255-261308447204059 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204624.83780: variable 
'ansible_module_compression' from source: unknown 44071 1727204624.83782: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 44071 1727204624.83784: variable 'ansible_facts' from source: unknown 44071 1727204624.84136: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204624.795013-46255-261308447204059/AnsiballZ_service_facts.py 44071 1727204624.84396: Sending initial data 44071 1727204624.84399: Sent initial data (161 bytes) 44071 1727204624.85721: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204624.85743: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204624.85959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204624.87585: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204624.87636: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204624.87724: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp7pv4nvzm /root/.ansible/tmp/ansible-tmp-1727204624.795013-46255-261308447204059/AnsiballZ_service_facts.py <<< 44071 1727204624.87728: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204624.795013-46255-261308447204059/AnsiballZ_service_facts.py" <<< 44071 1727204624.87810: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp7pv4nvzm" to remote "/root/.ansible/tmp/ansible-tmp-1727204624.795013-46255-261308447204059/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204624.795013-46255-261308447204059/AnsiballZ_service_facts.py" <<< 44071 1727204624.89392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204624.89612: stderr chunk (state=3): >>><<< 44071 1727204624.89692: stdout chunk (state=3): >>><<< 44071 1727204624.89712: done transferring module to remote 44071 1727204624.89726: _low_level_execute_command(): starting 44071 1727204624.89774: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204624.795013-46255-261308447204059/ /root/.ansible/tmp/ansible-tmp-1727204624.795013-46255-261308447204059/AnsiballZ_service_facts.py && sleep 0' 44071 1727204624.91219: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204624.91335: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204624.91342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204624.91608: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204624.91749: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204624.93554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204624.93563: stderr chunk (state=3): >>><<< 44071 1727204624.93566: stdout chunk (state=3): >>><<< 44071 1727204624.93583: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204624.93586: _low_level_execute_command(): starting 44071 1727204624.93591: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204624.795013-46255-261308447204059/AnsiballZ_service_facts.py && sleep 0' 44071 1727204624.95275: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204624.95280: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204624.95369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204624.95386: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204624.95398: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204624.95483: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204624.95646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204627.17497: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "s<<< 44071 1727204627.17557: stdout chunk (state=3): >>>topped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": 
"rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": 
"systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", <<< 44071 1727204627.17576: stdout chunk (state=3): >>>"status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt<<< 44071 1727204627.17579: stdout chunk (state=3): >>>.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": 
"systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 44071 1727204627.19277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204627.19297: stderr chunk (state=3): >>><<< 44071 1727204627.19306: stdout chunk (state=3): >>><<< 44071 1727204627.19338: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": 
"disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", 
"state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, 
"systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": 
"dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": 
"systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
44071 1727204627.28323: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204624.795013-46255-261308447204059/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204627.28327: _low_level_execute_command(): starting 44071 1727204627.28330: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204624.795013-46255-261308447204059/ > /dev/null 2>&1 && sleep 0' 44071 1727204627.29174: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204627.29179: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204627.29182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204627.29184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204627.29187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204627.29189: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204627.29191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204627.29309: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204627.29601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204627.29801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204627.31815: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204627.31820: stdout chunk (state=3): >>><<< 44071 1727204627.31822: stderr chunk (state=3): >>><<< 44071 1727204627.31850: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204627.31854: handler run complete 44071 1727204627.32477: variable 'ansible_facts' from source: unknown 44071 1727204627.32829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204627.33711: variable 'ansible_facts' from source: unknown 44071 1727204627.33896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204627.34163: attempt loop complete, returning result 44071 1727204627.34247: _execute() done 44071 1727204627.34251: dumping result to json 44071 1727204627.34253: done dumping result, returning 44071 1727204627.34256: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-c964-7471-000000000b95] 44071 1727204627.34258: sending task result for task 127b8e07-fff9-c964-7471-000000000b95 44071 1727204627.42151: done sending task result for task 127b8e07-fff9-c964-7471-000000000b95 44071 1727204627.42155: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204627.42250: no more pending results, returning what we have 44071 1727204627.42253: results queue empty 44071 1727204627.42254: checking for any_errors_fatal 44071 1727204627.42257: done checking for any_errors_fatal 44071 1727204627.42258: checking for max_fail_percentage 44071 1727204627.42259: done checking for max_fail_percentage 44071 1727204627.42260: checking to see if all hosts have failed and the running result is not ok 44071 1727204627.42261: done checking to see if all hosts have failed 44071 1727204627.42262: getting the remaining hosts for this loop 44071 1727204627.42263: done getting the remaining hosts for this loop 44071 1727204627.42270: getting the next task for host managed-node2 44071 1727204627.42276: done getting next task for host managed-node2 44071 1727204627.42279: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204627.42289: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204627.42302: getting variables 44071 1727204627.42303: in VariableManager get_vars() 44071 1727204627.42324: Calling all_inventory to load vars for managed-node2 44071 1727204627.42327: Calling groups_inventory to load vars for managed-node2 44071 1727204627.42329: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204627.42336: Calling all_plugins_play to load vars for managed-node2 44071 1727204627.42347: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204627.42351: Calling groups_plugins_play to load vars for managed-node2 44071 1727204627.44132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204627.46657: done with get_vars() 44071 1727204627.46802: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:03:47 -0400 (0:00:02.743) 0:00:39.786 ***** 44071 1727204627.47015: entering _queue_task() for managed-node2/package_facts 44071 1727204627.48011: worker is 1 (out of 1 available) 44071 1727204627.48026: exiting _queue_task() for managed-node2/package_facts 44071 1727204627.48039: done queuing things up, now waiting for results queue to drain 44071 1727204627.48043: waiting for pending results... 
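[Note: the following sketch is an editorial illustration, not part of the playbook output.] The task queued above ("Check which packages are installed", set_facts.yml:26) runs package_facts, whose result later in this log exposes ansible_facts["packages"] as a mapping from package name to a list of installed instances (version, release, epoch, arch, source). A minimal sketch of reading such a mapping, assuming the key layout shown in the JSON further below; the package_versions helper and the sample entry are hypothetical, for illustration only.

```python
# Minimal sketch (assumption, not the role's code): read a package_facts-style
# mapping such as the ansible_facts["packages"] dict dumped later in this log.
def package_versions(packages: dict, name: str) -> list[str]:
    """Return 'version-release.arch' strings for each installed instance of name."""
    return [
        f"{p['version']}-{p['release']}.{p['arch']}"
        for p in packages.get(name, [])
    ]


if __name__ == "__main__":
    sample = {
        "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40",
                  "epoch": None, "arch": "x86_64", "source": "rpm"}],
    }
    print(package_versions(sample, "bash"))            # ['5.2.26-3.fc40.x86_64']
    print(package_versions(sample, "NetworkManager"))  # [] when not installed
```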
44071 1727204627.48487: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204627.48924: in run() - task 127b8e07-fff9-c964-7471-000000000b96 44071 1727204627.48960: variable 'ansible_search_path' from source: unknown 44071 1727204627.49049: variable 'ansible_search_path' from source: unknown 44071 1727204627.49053: calling self._execute() 44071 1727204627.49138: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204627.49158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204627.49176: variable 'omit' from source: magic vars 44071 1727204627.50031: variable 'ansible_distribution_major_version' from source: facts 44071 1727204627.50059: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204627.50171: variable 'omit' from source: magic vars 44071 1727204627.50452: variable 'omit' from source: magic vars 44071 1727204627.50455: variable 'omit' from source: magic vars 44071 1727204627.50576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204627.50624: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204627.50701: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204627.50873: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204627.50878: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204627.50881: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204627.50883: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204627.50886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204627.51273: Set connection var ansible_connection to ssh 44071 1727204627.51278: Set connection var ansible_timeout to 10 44071 1727204627.51281: Set connection var ansible_pipelining to False 44071 1727204627.51283: Set connection var ansible_shell_type to sh 44071 1727204627.51286: Set connection var ansible_shell_executable to /bin/sh 44071 1727204627.51288: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204627.51344: variable 'ansible_shell_executable' from source: unknown 44071 1727204627.51550: variable 'ansible_connection' from source: unknown 44071 1727204627.51554: variable 'ansible_module_compression' from source: unknown 44071 1727204627.51557: variable 'ansible_shell_type' from source: unknown 44071 1727204627.51559: variable 'ansible_shell_executable' from source: unknown 44071 1727204627.51561: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204627.51564: variable 'ansible_pipelining' from source: unknown 44071 1727204627.51568: variable 'ansible_timeout' from source: unknown 44071 1727204627.51571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204627.51938: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204627.52092: variable 'omit' from source: magic vars 44071 
1727204627.52097: starting attempt loop 44071 1727204627.52099: running the handler 44071 1727204627.52102: _low_level_execute_command(): starting 44071 1727204627.52104: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204627.53324: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204627.53451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204627.53514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204627.55236: stdout chunk (state=3): >>>/root <<< 44071 1727204627.55389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204627.55675: stderr chunk (state=3): >>><<< 44071 1727204627.55680: stdout chunk (state=3): >>><<< 44071 1727204627.55684: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204627.55689: _low_level_execute_command(): starting 44071 1727204627.55692: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204627.555197-46423-180529380998837 `" && echo ansible-tmp-1727204627.555197-46423-180529380998837="` echo /root/.ansible/tmp/ansible-tmp-1727204627.555197-46423-180529380998837 `" ) && sleep 0' 44071 1727204627.56695: stderr chunk (state=2): 
>>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204627.56710: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204627.56712: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204627.56727: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204627.56875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204627.58834: stdout chunk (state=3): >>>ansible-tmp-1727204627.555197-46423-180529380998837=/root/.ansible/tmp/ansible-tmp-1727204627.555197-46423-180529380998837 <<< 44071 1727204627.59078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204627.59082: stdout chunk (state=3): >>><<< 44071 1727204627.59089: stderr chunk (state=3): >>><<< 44071 1727204627.59191: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204627.555197-46423-180529380998837=/root/.ansible/tmp/ansible-tmp-1727204627.555197-46423-180529380998837 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204627.59248: variable 'ansible_module_compression' from source: unknown 44071 1727204627.59311: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 44071 1727204627.59637: variable 'ansible_facts' from source: unknown 44071 1727204627.59751: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204627.555197-46423-180529380998837/AnsiballZ_package_facts.py 44071 1727204627.60403: Sending initial data 44071 1727204627.60407: Sent initial data (161 bytes) 44071 1727204627.61845: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204627.61897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204627.61904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204627.62047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204627.62051: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204627.62054: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204627.62238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204627.62308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204627.64032: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204627.64343: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204627.64348: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204627.555197-46423-180529380998837/AnsiballZ_package_facts.py" <<< 44071 1727204627.64486: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp9b4azhfp /root/.ansible/tmp/ansible-tmp-1727204627.555197-46423-180529380998837/AnsiballZ_package_facts.py <<< 44071 1727204627.64646: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp9b4azhfp" to remote "/root/.ansible/tmp/ansible-tmp-1727204627.555197-46423-180529380998837/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204627.555197-46423-180529380998837/AnsiballZ_package_facts.py" <<< 44071 1727204627.70015: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204627.70112: stderr chunk (state=3): >>><<< 44071 1727204627.70373: stdout chunk (state=3): >>><<< 44071 1727204627.70377: done transferring module to remote 44071 1727204627.70381: _low_level_execute_command(): starting 44071 1727204627.70384: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204627.555197-46423-180529380998837/ /root/.ansible/tmp/ansible-tmp-1727204627.555197-46423-180529380998837/AnsiballZ_package_facts.py && sleep 0' 44071 1727204627.71546: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204627.71564: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204627.71585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204627.71712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204627.71733: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204627.71805: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204627.71955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204627.73882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204627.73887: stdout chunk (state=3): >>><<< 44071 1727204627.73899: stderr chunk (state=3): >>><<< 44071 1727204627.74185: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204627.74190: _low_level_execute_command(): starting 44071 1727204627.74197: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204627.555197-46423-180529380998837/AnsiballZ_package_facts.py && sleep 0' 44071 1727204627.75574: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204627.75797: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204627.75820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204627.76285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204628.38911: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "nss-util": [{"na<<< 44071 1727204628.38933: stdout chunk (state=3): >>>me": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 44071 1727204628.39050: stdout chunk (state=3): >>>systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": 
[{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lib<<< 44071 1727204628.39179: stdout chunk (state=3): >>>xmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": 
"python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": 
"perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": 
"perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": 
"elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": 
"2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoc<<< 44071 1727204628.39199: stdout chunk (state=3): >>>h": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": 
"python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", 
"version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 44071 1727204628.41193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204628.41197: stdout chunk (state=3): >>><<< 44071 1727204628.41200: stderr chunk (state=3): >>><<< 44071 1727204628.41250: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": 
"libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", 
"release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": 
[{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": 
"3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", 
"version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", 
"version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": 
"1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": 
"wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": 
"python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204628.48277: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204627.555197-46423-180529380998837/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204628.48282: _low_level_execute_command(): starting 44071 1727204628.48284: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204627.555197-46423-180529380998837/ > /dev/null 2>&1 && sleep 0' 44071 1727204628.49496: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204628.49500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 
44071 1727204628.49592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204628.49670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204628.51677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204628.51815: stderr chunk (state=3): >>><<< 44071 1727204628.51826: stdout chunk (state=3): >>><<< 44071 1727204628.51887: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204628.51902: handler run complete 44071 1727204628.54347: variable 'ansible_facts' from source: unknown 44071 1727204628.55922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204628.61557: variable 'ansible_facts' from source: unknown 44071 1727204628.62324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204628.63513: attempt loop complete, returning result 44071 1727204628.63539: _execute() done 44071 1727204628.63545: dumping result to json 44071 1727204628.64019: done dumping result, returning 44071 1727204628.64031: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-c964-7471-000000000b96] 44071 1727204628.64037: sending task result for task 127b8e07-fff9-c964-7471-000000000b96 44071 1727204628.67855: done sending task result for task 127b8e07-fff9-c964-7471-000000000b96 44071 1727204628.67861: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204628.67999: no more pending results, returning what we have 44071 1727204628.68002: results queue empty 44071 1727204628.68003: checking for any_errors_fatal 44071 1727204628.68009: done checking for any_errors_fatal 44071 1727204628.68010: checking for max_fail_percentage 44071 1727204628.68011: done checking for max_fail_percentage 44071 1727204628.68012: checking to see if all hosts have failed and the running result is not ok 44071 1727204628.68013: done checking to see if all hosts have failed 44071 1727204628.68014: getting the remaining hosts for this loop 44071 1727204628.68015: done getting the remaining hosts for this loop 
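The ok: [managed-node2] result above is censored because the task set no_log: true (also visible as '_ansible_no_log': True in the _execute_module call). A hedged sketch of how no_log interacts with a task result; the register name here is hypothetical and only for illustration:

- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto
  no_log: true          # callbacks print only the "censored" placeholder seen above
  register: pkg_facts   # hypothetical register; the gathered facts remain usable
                        # later via ansible_facts.packages even though output is hidden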
44071 1727204628.68019: getting the next task for host managed-node2 44071 1727204628.68028: done getting next task for host managed-node2 44071 1727204628.68032: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204628.68038: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204628.68055: getting variables 44071 1727204628.68056: in VariableManager get_vars() 44071 1727204628.68093: Calling all_inventory to load vars for managed-node2 44071 1727204628.68096: Calling groups_inventory to load vars for managed-node2 44071 1727204628.68098: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204628.68109: Calling all_plugins_play to load vars for managed-node2 44071 1727204628.68111: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204628.68114: Calling groups_plugins_play to load vars for managed-node2 44071 1727204628.71986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204628.76736: done with get_vars() 44071 1727204628.76785: done getting variables 44071 1727204628.76859: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:03:48 -0400 (0:00:01.300) 0:00:41.087 ***** 44071 1727204628.77111: entering _queue_task() for managed-node2/debug 44071 1727204628.77919: worker is 1 (out of 1 available) 44071 1727204628.77933: exiting _queue_task() for managed-node2/debug 44071 1727204628.77951: done queuing things up, now waiting for results queue to drain 44071 1727204628.77953: waiting for pending results... 
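The task queued above sits at roles/network/tasks/main.yml:7 in the collection. From the variables the executor resolves (network_provider from set_fact) and the message printed a few lines further down, "Using network provider: nm", it is an ordinary debug task. An approximate reconstruction, not the role's verbatim source:

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"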
44071 1727204628.78417: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204628.78840: in run() - task 127b8e07-fff9-c964-7471-000000000b34 44071 1727204628.78845: variable 'ansible_search_path' from source: unknown 44071 1727204628.78848: variable 'ansible_search_path' from source: unknown 44071 1727204628.78885: calling self._execute() 44071 1727204628.79167: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204628.79184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204628.79198: variable 'omit' from source: magic vars 44071 1727204628.80162: variable 'ansible_distribution_major_version' from source: facts 44071 1727204628.80185: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204628.80267: variable 'omit' from source: magic vars 44071 1727204628.80346: variable 'omit' from source: magic vars 44071 1727204628.80702: variable 'network_provider' from source: set_fact 44071 1727204628.80729: variable 'omit' from source: magic vars 44071 1727204628.80915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204628.80949: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204628.80979: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204628.81004: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204628.81035: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204628.81170: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204628.81180: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204628.81188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204628.81450: Set connection var ansible_connection to ssh 44071 1727204628.81457: Set connection var ansible_timeout to 10 44071 1727204628.81460: Set connection var ansible_pipelining to False 44071 1727204628.81462: Set connection var ansible_shell_type to sh 44071 1727204628.81563: Set connection var ansible_shell_executable to /bin/sh 44071 1727204628.81574: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204628.81603: variable 'ansible_shell_executable' from source: unknown 44071 1727204628.81607: variable 'ansible_connection' from source: unknown 44071 1727204628.81610: variable 'ansible_module_compression' from source: unknown 44071 1727204628.81613: variable 'ansible_shell_type' from source: unknown 44071 1727204628.81616: variable 'ansible_shell_executable' from source: unknown 44071 1727204628.81618: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204628.81621: variable 'ansible_pipelining' from source: unknown 44071 1727204628.81626: variable 'ansible_timeout' from source: unknown 44071 1727204628.81629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204628.82011: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 44071 1727204628.82020: variable 'omit' from source: magic vars 44071 1727204628.82027: starting attempt loop 44071 1727204628.82030: running the handler 44071 1727204628.82226: handler run complete 44071 1727204628.82238: attempt loop complete, returning result 44071 1727204628.82243: _execute() done 44071 1727204628.82247: dumping result to json 44071 1727204628.82249: done dumping result, returning 44071 1727204628.82256: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-c964-7471-000000000b34] 44071 1727204628.82261: sending task result for task 127b8e07-fff9-c964-7471-000000000b34 44071 1727204628.82522: done sending task result for task 127b8e07-fff9-c964-7471-000000000b34 44071 1727204628.82526: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 44071 1727204628.82603: no more pending results, returning what we have 44071 1727204628.82607: results queue empty 44071 1727204628.82608: checking for any_errors_fatal 44071 1727204628.82619: done checking for any_errors_fatal 44071 1727204628.82620: checking for max_fail_percentage 44071 1727204628.82622: done checking for max_fail_percentage 44071 1727204628.82622: checking to see if all hosts have failed and the running result is not ok 44071 1727204628.82623: done checking to see if all hosts have failed 44071 1727204628.82624: getting the remaining hosts for this loop 44071 1727204628.82625: done getting the remaining hosts for this loop 44071 1727204628.82630: getting the next task for host managed-node2 44071 1727204628.82638: done getting next task for host managed-node2 44071 1727204628.82646: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204628.82652: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204628.82668: getting variables 44071 1727204628.82670: in VariableManager get_vars() 44071 1727204628.82711: Calling all_inventory to load vars for managed-node2 44071 1727204628.82714: Calling groups_inventory to load vars for managed-node2 44071 1727204628.82716: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204628.82728: Calling all_plugins_play to load vars for managed-node2 44071 1727204628.82731: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204628.82734: Calling groups_plugins_play to load vars for managed-node2 44071 1727204628.86593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204628.91116: done with get_vars() 44071 1727204628.91163: done getting variables 44071 1727204628.91437: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:03:48 -0400 (0:00:00.143) 0:00:41.231 ***** 44071 1727204628.91495: entering _queue_task() for managed-node2/fail 44071 1727204628.92312: worker is 1 (out of 1 available) 44071 1727204628.92328: exiting _queue_task() for managed-node2/fail 44071 1727204628.92345: done queuing things up, now waiting for results queue to drain 44071 1727204628.92347: waiting for pending results... 
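The guard queued above (main.yml:11) is implemented with the fail action, as the ActionModule load shows. The skip record that follows reports `network_state != {}` as the false condition; the provider check is inferred from the task name, so this is a hedged reconstruction rather than the role's actual source:

- name: >-
    Abort applying the network state configuration if using the
    `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported with the initscripts provider.
  when:
    - network_state != {}                  # logged as the false condition, so the task is skipped
    - network_provider == "initscripts"    # assumed from the task name, not visible in the log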
44071 1727204628.92788: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204628.92918: in run() - task 127b8e07-fff9-c964-7471-000000000b35 44071 1727204628.92944: variable 'ansible_search_path' from source: unknown 44071 1727204628.92954: variable 'ansible_search_path' from source: unknown 44071 1727204628.93016: calling self._execute() 44071 1727204628.93129: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204628.93145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204628.93160: variable 'omit' from source: magic vars 44071 1727204628.93612: variable 'ansible_distribution_major_version' from source: facts 44071 1727204628.93642: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204628.93858: variable 'network_state' from source: role '' defaults 44071 1727204628.93862: Evaluated conditional (network_state != {}): False 44071 1727204628.93866: when evaluation is False, skipping this task 44071 1727204628.93869: _execute() done 44071 1727204628.93872: dumping result to json 44071 1727204628.93874: done dumping result, returning 44071 1727204628.93878: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-c964-7471-000000000b35] 44071 1727204628.93881: sending task result for task 127b8e07-fff9-c964-7471-000000000b35 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204628.94020: no more pending results, returning what we have 44071 1727204628.94025: results queue empty 44071 1727204628.94026: checking for any_errors_fatal 44071 1727204628.94036: done checking for any_errors_fatal 44071 1727204628.94036: checking for max_fail_percentage 44071 1727204628.94038: done checking for max_fail_percentage 44071 1727204628.94040: checking to see if all hosts have failed and the running result is not ok 44071 1727204628.94043: done checking to see if all hosts have failed 44071 1727204628.94043: getting the remaining hosts for this loop 44071 1727204628.94045: done getting the remaining hosts for this loop 44071 1727204628.94050: getting the next task for host managed-node2 44071 1727204628.94060: done getting next task for host managed-node2 44071 1727204628.94064: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204628.94071: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204628.94100: getting variables 44071 1727204628.94103: in VariableManager get_vars() 44071 1727204628.94475: Calling all_inventory to load vars for managed-node2 44071 1727204628.94479: Calling groups_inventory to load vars for managed-node2 44071 1727204628.94482: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204628.94494: Calling all_plugins_play to load vars for managed-node2 44071 1727204628.94497: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204628.94500: Calling groups_plugins_play to load vars for managed-node2 44071 1727204628.95219: done sending task result for task 127b8e07-fff9-c964-7471-000000000b35 44071 1727204628.95224: WORKER PROCESS EXITING 44071 1727204628.96488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204628.98732: done with get_vars() 44071 1727204628.98778: done getting variables 44071 1727204628.98850: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:03:48 -0400 (0:00:00.073) 0:00:41.305 ***** 44071 1727204628.98896: entering _queue_task() for managed-node2/fail 44071 1727204628.99295: worker is 1 (out of 1 available) 44071 1727204628.99310: exiting _queue_task() for managed-node2/fail 44071 1727204628.99326: done queuing things up, now waiting for results queue to drain 44071 1727204628.99327: waiting for pending results... 
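The version guard at main.yml:18 follows the same pattern. Only `network_state != {}` appears in the log because when-lists are evaluated in order and the first False ends evaluation; the version comparison below is assumed from the task name:

- name: >-
    Abort applying the network state configuration if the system version
    of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying network_state requires a managed host at major version 8 or later.
  when:
    - network_state != {}                             # first (and only logged) condition
    - ansible_distribution_major_version | int < 8    # assumed from the task name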
44071 1727204628.99684: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204628.99883: in run() - task 127b8e07-fff9-c964-7471-000000000b36 44071 1727204628.99914: variable 'ansible_search_path' from source: unknown 44071 1727204628.99923: variable 'ansible_search_path' from source: unknown 44071 1727204628.99976: calling self._execute() 44071 1727204629.00092: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204629.00107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204629.00126: variable 'omit' from source: magic vars 44071 1727204629.00586: variable 'ansible_distribution_major_version' from source: facts 44071 1727204629.00606: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204629.00754: variable 'network_state' from source: role '' defaults 44071 1727204629.00777: Evaluated conditional (network_state != {}): False 44071 1727204629.00786: when evaluation is False, skipping this task 44071 1727204629.00793: _execute() done 44071 1727204629.00801: dumping result to json 44071 1727204629.00808: done dumping result, returning 44071 1727204629.00820: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-c964-7471-000000000b36] 44071 1727204629.00831: sending task result for task 127b8e07-fff9-c964-7471-000000000b36 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204629.01012: no more pending results, returning what we have 44071 1727204629.01018: results queue empty 44071 1727204629.01019: checking for any_errors_fatal 44071 1727204629.01029: done checking for any_errors_fatal 44071 1727204629.01030: checking for max_fail_percentage 44071 1727204629.01032: done checking for max_fail_percentage 44071 1727204629.01033: checking to see if all hosts have failed and the running result is not ok 44071 1727204629.01033: done checking to see if all hosts have failed 44071 1727204629.01034: getting the remaining hosts for this loop 44071 1727204629.01036: done getting the remaining hosts for this loop 44071 1727204629.01044: getting the next task for host managed-node2 44071 1727204629.01055: done getting next task for host managed-node2 44071 1727204629.01059: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204629.01068: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204629.01094: getting variables 44071 1727204629.01096: in VariableManager get_vars() 44071 1727204629.01139: Calling all_inventory to load vars for managed-node2 44071 1727204629.01145: Calling groups_inventory to load vars for managed-node2 44071 1727204629.01147: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204629.01163: Calling all_plugins_play to load vars for managed-node2 44071 1727204629.01271: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204629.01278: Calling groups_plugins_play to load vars for managed-node2 44071 1727204629.02186: done sending task result for task 127b8e07-fff9-c964-7471-000000000b36 44071 1727204629.02191: WORKER PROCESS EXITING 44071 1727204629.03448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204629.05704: done with get_vars() 44071 1727204629.05751: done getting variables 44071 1727204629.05820: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:03:49 -0400 (0:00:00.069) 0:00:41.375 ***** 44071 1727204629.05862: entering _queue_task() for managed-node2/fail 44071 1727204629.06264: worker is 1 (out of 1 available) 44071 1727204629.06282: exiting _queue_task() for managed-node2/fail 44071 1727204629.06298: done queuing things up, now waiting for results queue to drain 44071 1727204629.06300: waiting for pending results... 
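For the EL10 teaming guard the log shows two evaluated conditions: the major-version test is True, then the distro test is False on Fedora, so the task is skipped before any team-specific condition is reached. A hedged sketch consistent with that trace (the final condition is an assumption based on the task name):

- name: >-
    Abort applying teaming configuration if the system version of the
    managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later.
  when:
    - ansible_distribution_major_version | int > 9   # evaluated True in the log
    - ansible_distribution in __network_rh_distros   # evaluated False on Fedora, so skipped
    - __network_team_connections_defined             # assumed; never reached in this run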
44071 1727204629.06652: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204629.06850: in run() - task 127b8e07-fff9-c964-7471-000000000b37 44071 1727204629.06880: variable 'ansible_search_path' from source: unknown 44071 1727204629.06891: variable 'ansible_search_path' from source: unknown 44071 1727204629.06937: calling self._execute() 44071 1727204629.07054: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204629.07069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204629.07083: variable 'omit' from source: magic vars 44071 1727204629.07520: variable 'ansible_distribution_major_version' from source: facts 44071 1727204629.07546: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204629.07760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204629.10375: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204629.10462: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204629.10514: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204629.10559: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204629.10599: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204629.10699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204629.10756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204629.10792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204629.10849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204629.10872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204629.10991: variable 'ansible_distribution_major_version' from source: facts 44071 1727204629.11014: Evaluated conditional (ansible_distribution_major_version | int > 9): True 44071 1727204629.11169: variable 'ansible_distribution' from source: facts 44071 1727204629.11246: variable '__network_rh_distros' from source: role '' defaults 44071 1727204629.11249: Evaluated conditional (ansible_distribution in __network_rh_distros): False 44071 1727204629.11252: when evaluation is False, skipping this task 44071 1727204629.11254: _execute() done 44071 1727204629.11256: dumping result to json 44071 1727204629.11259: done dumping result, returning 44071 1727204629.11262: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-c964-7471-000000000b37] 44071 1727204629.11264: sending task result for task 127b8e07-fff9-c964-7471-000000000b37 44071 1727204629.11595: done sending task result for task 127b8e07-fff9-c964-7471-000000000b37 44071 1727204629.11599: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 44071 1727204629.11652: no more pending results, returning what we have 44071 1727204629.11656: results queue empty 44071 1727204629.11657: checking for any_errors_fatal 44071 1727204629.11663: done checking for any_errors_fatal 44071 1727204629.11664: checking for max_fail_percentage 44071 1727204629.11668: done checking for max_fail_percentage 44071 1727204629.11668: checking to see if all hosts have failed and the running result is not ok 44071 1727204629.11669: done checking to see if all hosts have failed 44071 1727204629.11670: getting the remaining hosts for this loop 44071 1727204629.11672: done getting the remaining hosts for this loop 44071 1727204629.11677: getting the next task for host managed-node2 44071 1727204629.11685: done getting next task for host managed-node2 44071 1727204629.11690: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204629.11696: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204629.11717: getting variables 44071 1727204629.11718: in VariableManager get_vars() 44071 1727204629.11761: Calling all_inventory to load vars for managed-node2 44071 1727204629.11764: Calling groups_inventory to load vars for managed-node2 44071 1727204629.11925: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204629.11937: Calling all_plugins_play to load vars for managed-node2 44071 1727204629.11940: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204629.11946: Calling groups_plugins_play to load vars for managed-node2 44071 1727204629.13697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204629.15948: done with get_vars() 44071 1727204629.15993: done getting variables 44071 1727204629.16061: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:03:49 -0400 (0:00:00.102) 0:00:41.477 ***** 44071 1727204629.16104: entering _queue_task() for managed-node2/dnf 44071 1727204629.16509: worker is 1 (out of 1 available) 44071 1727204629.16524: exiting _queue_task() for managed-node2/dnf 44071 1727204629.16539: done queuing things up, now waiting for results queue to drain 44071 1727204629.16544: waiting for pending results... 
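The log loads the dnf action plugin for the task queued above (main.yml:36) and later reports the skip condition `__network_wireless_connections_defined or __network_team_connections_defined`. A hedged sketch of a task consistent with that; the package names and check_mode are assumptions for illustration, not taken from the role:

- name: >-
    Check if updates for network packages are available through the DNF
    package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name:
      - NetworkManager-wifi     # assumed example packages, not shown in the log
      - NetworkManager-team
    state: latest
  check_mode: true              # assumption: "check if updates are available" suggests a dry run
  when: __network_wireless_connections_defined or __network_team_connections_defined

On this run both flags are False for the play's single test interface, so the task is skipped, as the record that follows shows.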
44071 1727204629.16883: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204629.17068: in run() - task 127b8e07-fff9-c964-7471-000000000b38 44071 1727204629.17091: variable 'ansible_search_path' from source: unknown 44071 1727204629.17099: variable 'ansible_search_path' from source: unknown 44071 1727204629.17150: calling self._execute() 44071 1727204629.17271: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204629.17285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204629.17301: variable 'omit' from source: magic vars 44071 1727204629.17764: variable 'ansible_distribution_major_version' from source: facts 44071 1727204629.17789: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204629.18018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204629.22230: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204629.22449: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204629.22454: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204629.22666: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204629.22671: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204629.22798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204629.22838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204629.22878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204629.23071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204629.23074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204629.23429: variable 'ansible_distribution' from source: facts 44071 1727204629.23432: variable 'ansible_distribution_major_version' from source: facts 44071 1727204629.23435: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 44071 1727204629.23676: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204629.23844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204629.23885: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204629.23917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204629.23972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204629.23993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204629.24048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204629.24085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204629.24117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204629.24168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204629.24194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204629.24246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204629.24279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204629.24316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204629.24367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204629.24388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204629.24584: variable 'network_connections' from source: include params 44071 1727204629.24601: variable 'interface' from source: play vars 44071 1727204629.24737: variable 'interface' from source: play vars 44071 1727204629.24801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204629.25007: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204629.25054: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204629.25101: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204629.25139: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204629.25203: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204629.25232: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204629.25371: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204629.25375: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204629.25401: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204629.25702: variable 'network_connections' from source: include params 44071 1727204629.25720: variable 'interface' from source: play vars 44071 1727204629.25798: variable 'interface' from source: play vars 44071 1727204629.25852: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204629.25862: when evaluation is False, skipping this task 44071 1727204629.25874: _execute() done 44071 1727204629.25882: dumping result to json 44071 1727204629.25889: done dumping result, returning 44071 1727204629.25901: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000000b38] 44071 1727204629.25910: sending task result for task 127b8e07-fff9-c964-7471-000000000b38 44071 1727204629.26357: done sending task result for task 127b8e07-fff9-c964-7471-000000000b38 44071 1727204629.26361: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204629.26431: no more pending results, returning what we have 44071 1727204629.26435: results queue empty 44071 1727204629.26437: checking for any_errors_fatal 44071 1727204629.26448: done checking for any_errors_fatal 44071 1727204629.26449: checking for max_fail_percentage 44071 1727204629.26451: done checking for max_fail_percentage 44071 1727204629.26452: checking to see if all hosts have failed and the running result is not ok 44071 1727204629.26453: done checking to see if all hosts have failed 44071 1727204629.26454: getting the remaining hosts for this loop 44071 1727204629.26456: done getting the remaining hosts for this loop 44071 1727204629.26462: getting the next task for host managed-node2 44071 1727204629.26578: done getting next task for host managed-node2 44071 1727204629.26583: ^ task is: TASK: fedora.linux_system_roles.network : Check if 
updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204629.26589: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204629.26612: getting variables 44071 1727204629.26614: in VariableManager get_vars() 44071 1727204629.26658: Calling all_inventory to load vars for managed-node2 44071 1727204629.26661: Calling groups_inventory to load vars for managed-node2 44071 1727204629.26664: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204629.26980: Calling all_plugins_play to load vars for managed-node2 44071 1727204629.26985: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204629.26988: Calling groups_plugins_play to load vars for managed-node2 44071 1727204629.29429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204629.32748: done with get_vars() 44071 1727204629.32794: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204629.32891: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:03:49 -0400 (0:00:00.168) 0:00:41.645 ***** 44071 1727204629.32930: entering _queue_task() for managed-node2/yum 44071 1727204629.33375: worker is 1 (out of 1 available) 44071 1727204629.33391: exiting _queue_task() for managed-node2/yum 44071 1727204629.33407: done queuing things up, now waiting for results queue to drain 44071 1727204629.33409: waiting for pending results... 
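[Annotation] Before this task runs, the log shows ansible.builtin.yum being redirected to ansible.builtin.dnf, and the result further down shows the task skipped because 'ansible_distribution_major_version | int < 8' is False on this host. A hedged sketch follows; the module arguments are assumptions, and the second condition is inferred from the task name rather than shown for this task in this excerpt:

    # Sketch only: arguments are hypothetical.
    - name: >-
        Check if updates for network packages are available through the YUM
        package manager due to wireless or team interfaces
      ansible.builtin.yum:               # redirected to ansible.builtin.dnf at runtime, per the log
        name: "{{ network_packages }}"   # assumed argument
        state: latest                    # assumed argument
      check_mode: true                   # assumed
      when:
        - ansible_distribution_major_version | int < 8
        - __network_wireless_connections_defined or __network_team_connections_defined  # assumed from the task name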
44071 1727204629.33750: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204629.34012: in run() - task 127b8e07-fff9-c964-7471-000000000b39 44071 1727204629.34016: variable 'ansible_search_path' from source: unknown 44071 1727204629.34019: variable 'ansible_search_path' from source: unknown 44071 1727204629.34057: calling self._execute() 44071 1727204629.34228: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204629.34232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204629.34236: variable 'omit' from source: magic vars 44071 1727204629.34729: variable 'ansible_distribution_major_version' from source: facts 44071 1727204629.34871: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204629.34992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204629.38909: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204629.39248: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204629.39252: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204629.39275: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204629.39585: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204629.39590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204629.39733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204629.39775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204629.39829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204629.39852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204629.39999: variable 'ansible_distribution_major_version' from source: facts 44071 1727204629.40029: Evaluated conditional (ansible_distribution_major_version | int < 8): False 44071 1727204629.40037: when evaluation is False, skipping this task 44071 1727204629.40046: _execute() done 44071 1727204629.40054: dumping result to json 44071 1727204629.40061: done dumping result, returning 44071 1727204629.40076: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000000b39] 44071 
1727204629.40087: sending task result for task 127b8e07-fff9-c964-7471-000000000b39 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 44071 1727204629.40272: no more pending results, returning what we have 44071 1727204629.40277: results queue empty 44071 1727204629.40278: checking for any_errors_fatal 44071 1727204629.40287: done checking for any_errors_fatal 44071 1727204629.40288: checking for max_fail_percentage 44071 1727204629.40290: done checking for max_fail_percentage 44071 1727204629.40291: checking to see if all hosts have failed and the running result is not ok 44071 1727204629.40292: done checking to see if all hosts have failed 44071 1727204629.40292: getting the remaining hosts for this loop 44071 1727204629.40294: done getting the remaining hosts for this loop 44071 1727204629.40300: getting the next task for host managed-node2 44071 1727204629.40309: done getting next task for host managed-node2 44071 1727204629.40314: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204629.40319: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204629.40346: getting variables 44071 1727204629.40348: in VariableManager get_vars() 44071 1727204629.40495: Calling all_inventory to load vars for managed-node2 44071 1727204629.40499: Calling groups_inventory to load vars for managed-node2 44071 1727204629.40501: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204629.40514: Calling all_plugins_play to load vars for managed-node2 44071 1727204629.40518: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204629.40522: Calling groups_plugins_play to load vars for managed-node2 44071 1727204629.41231: done sending task result for task 127b8e07-fff9-c964-7471-000000000b39 44071 1727204629.41236: WORKER PROCESS EXITING 44071 1727204629.44076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204629.46922: done with get_vars() 44071 1727204629.46967: done getting variables 44071 1727204629.47034: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:03:49 -0400 (0:00:00.141) 0:00:41.787 ***** 44071 1727204629.47084: entering _queue_task() for managed-node2/fail 44071 1727204629.47483: worker is 1 (out of 1 available) 44071 1727204629.47499: exiting _queue_task() for managed-node2/fail 44071 1727204629.47513: done queuing things up, now waiting for results queue to drain 44071 1727204629.47515: waiting for pending results... 
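[Annotation] The task queued here (tasks/main.yml:60) uses the fail action and, in this run, is skipped because neither wireless nor team connections are defined. A minimal sketch of such a consent gate is shown below; the message text and the network_allow_restart variable are purely illustrative assumptions, only the module and the skip condition come from the log:

    # Sketch only: the condition is taken from the log; the message and the
    # network_allow_restart variable are illustrative assumptions.
    - name: >-
        Ask user's consent to restart NetworkManager due to wireless or team
        interfaces
      ansible.builtin.fail:
        msg: >-
          Wireless or team connections require restarting NetworkManager;
          confirm by setting network_allow_restart: true.
      when:
        - __network_wireless_connections_defined or __network_team_connections_defined
        - not network_allow_restart | default(false)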
44071 1727204629.47860: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204629.48046: in run() - task 127b8e07-fff9-c964-7471-000000000b3a 44071 1727204629.48072: variable 'ansible_search_path' from source: unknown 44071 1727204629.48081: variable 'ansible_search_path' from source: unknown 44071 1727204629.48130: calling self._execute() 44071 1727204629.48245: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204629.48259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204629.48277: variable 'omit' from source: magic vars 44071 1727204629.48805: variable 'ansible_distribution_major_version' from source: facts 44071 1727204629.48826: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204629.49201: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204629.49376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204629.52246: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204629.52335: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204629.52388: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204629.52430: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204629.52470: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204629.52573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204629.52628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204629.52772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204629.52775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204629.52778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204629.52796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204629.52826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204629.52860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204629.52911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204629.52930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204629.52986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204629.53017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204629.53051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204629.53103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204629.53122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204629.53557: variable 'network_connections' from source: include params 44071 1727204629.53592: variable 'interface' from source: play vars 44071 1727204629.53818: variable 'interface' from source: play vars 44071 1727204629.54000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204629.54316: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204629.54368: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204629.54414: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204629.54456: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204629.54513: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204629.54772: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204629.54776: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204629.54779: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204629.54781: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204629.55297: variable 'network_connections' 
from source: include params 44071 1727204629.55310: variable 'interface' from source: play vars 44071 1727204629.55877: variable 'interface' from source: play vars 44071 1727204629.55882: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204629.55887: when evaluation is False, skipping this task 44071 1727204629.55890: _execute() done 44071 1727204629.55892: dumping result to json 44071 1727204629.55894: done dumping result, returning 44071 1727204629.55897: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000000b3a] 44071 1727204629.55899: sending task result for task 127b8e07-fff9-c964-7471-000000000b3a skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204629.56446: no more pending results, returning what we have 44071 1727204629.56451: results queue empty 44071 1727204629.56452: checking for any_errors_fatal 44071 1727204629.56469: done checking for any_errors_fatal 44071 1727204629.56470: checking for max_fail_percentage 44071 1727204629.56472: done checking for max_fail_percentage 44071 1727204629.56473: checking to see if all hosts have failed and the running result is not ok 44071 1727204629.56474: done checking to see if all hosts have failed 44071 1727204629.56475: getting the remaining hosts for this loop 44071 1727204629.56570: done getting the remaining hosts for this loop 44071 1727204629.56577: getting the next task for host managed-node2 44071 1727204629.56595: done getting next task for host managed-node2 44071 1727204629.56602: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 44071 1727204629.56608: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204629.56628: done sending task result for task 127b8e07-fff9-c964-7471-000000000b3a 44071 1727204629.56632: WORKER PROCESS EXITING 44071 1727204629.56655: getting variables 44071 1727204629.56657: in VariableManager get_vars() 44071 1727204629.56895: Calling all_inventory to load vars for managed-node2 44071 1727204629.56902: Calling groups_inventory to load vars for managed-node2 44071 1727204629.56906: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204629.56922: Calling all_plugins_play to load vars for managed-node2 44071 1727204629.56926: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204629.56930: Calling groups_plugins_play to load vars for managed-node2 44071 1727204629.60010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204629.63220: done with get_vars() 44071 1727204629.63324: done getting variables 44071 1727204629.63475: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:03:49 -0400 (0:00:00.164) 0:00:41.952 ***** 44071 1727204629.63590: entering _queue_task() for managed-node2/package 44071 1727204629.64213: worker is 1 (out of 1 available) 44071 1727204629.64234: exiting _queue_task() for managed-node2/package 44071 1727204629.64251: done queuing things up, now waiting for results queue to drain 44071 1727204629.64253: waiting for pending results... 
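[Annotation] The Install packages task queued above (tasks/main.yml:73) resolves network_packages from the role defaults and is then skipped because 'not network_packages is subset(ansible_facts.packages.keys())' is False, i.e. every required package already appears in the gathered package facts. The subset test comes from the mathstuff test plugin the log shows being loaded. A sketch under the assumption that the task simply installs the computed list (the state and argument names are assumed):

    # Sketch only: the when-expression is taken from the log; module arguments are assumed.
    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())

For that condition to be meaningful, package facts must have been gathered earlier in the run, for example with ansible.builtin.package_facts, so that ansible_facts.packages is populated.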
44071 1727204629.64738: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 44071 1727204629.64988: in run() - task 127b8e07-fff9-c964-7471-000000000b3b 44071 1727204629.65103: variable 'ansible_search_path' from source: unknown 44071 1727204629.65107: variable 'ansible_search_path' from source: unknown 44071 1727204629.65111: calling self._execute() 44071 1727204629.65213: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204629.65231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204629.65252: variable 'omit' from source: magic vars 44071 1727204629.65754: variable 'ansible_distribution_major_version' from source: facts 44071 1727204629.65782: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204629.66056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204629.66598: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204629.66603: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204629.66607: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204629.67340: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204629.67509: variable 'network_packages' from source: role '' defaults 44071 1727204629.67655: variable '__network_provider_setup' from source: role '' defaults 44071 1727204629.67684: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204629.67764: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204629.67781: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204629.67875: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204629.68109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204629.71642: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204629.71878: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204629.71882: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204629.71891: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204629.71971: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204629.72074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204629.72104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204629.72168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204629.72248: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204629.72259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204629.72313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204629.72346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204629.72369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204629.72432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204629.72452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204629.72813: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204629.73066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204629.73138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204629.73180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204629.73273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204629.73325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204629.73497: variable 'ansible_python' from source: facts 44071 1727204629.73524: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204629.73691: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204629.73832: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204629.74211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204629.74216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 44071 1727204629.74219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204629.74289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204629.74312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204629.74412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204629.74446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204629.74530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204629.74558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204629.74580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204629.74857: variable 'network_connections' from source: include params 44071 1727204629.74863: variable 'interface' from source: play vars 44071 1727204629.74931: variable 'interface' from source: play vars 44071 1727204629.75021: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204629.75050: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204629.75109: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204629.75120: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204629.75185: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204629.75627: variable 'network_connections' from source: include params 44071 1727204629.75631: variable 'interface' from source: play vars 44071 1727204629.75697: variable 'interface' from source: play vars 44071 1727204629.75804: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204629.75981: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204629.76579: variable 'network_connections' from source: include params 44071 
1727204629.76586: variable 'interface' from source: play vars 44071 1727204629.76682: variable 'interface' from source: play vars 44071 1727204629.76744: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204629.76882: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204629.77332: variable 'network_connections' from source: include params 44071 1727204629.77336: variable 'interface' from source: play vars 44071 1727204629.77458: variable 'interface' from source: play vars 44071 1727204629.77619: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204629.77636: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204629.77645: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204629.77709: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204629.78016: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204629.78660: variable 'network_connections' from source: include params 44071 1727204629.78714: variable 'interface' from source: play vars 44071 1727204629.78893: variable 'interface' from source: play vars 44071 1727204629.78897: variable 'ansible_distribution' from source: facts 44071 1727204629.78899: variable '__network_rh_distros' from source: role '' defaults 44071 1727204629.78901: variable 'ansible_distribution_major_version' from source: facts 44071 1727204629.78903: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204629.79095: variable 'ansible_distribution' from source: facts 44071 1727204629.79099: variable '__network_rh_distros' from source: role '' defaults 44071 1727204629.79110: variable 'ansible_distribution_major_version' from source: facts 44071 1727204629.79112: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204629.79349: variable 'ansible_distribution' from source: facts 44071 1727204629.79357: variable '__network_rh_distros' from source: role '' defaults 44071 1727204629.79364: variable 'ansible_distribution_major_version' from source: facts 44071 1727204629.79410: variable 'network_provider' from source: set_fact 44071 1727204629.79435: variable 'ansible_facts' from source: unknown 44071 1727204629.80670: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 44071 1727204629.80675: when evaluation is False, skipping this task 44071 1727204629.80678: _execute() done 44071 1727204629.80680: dumping result to json 44071 1727204629.80683: done dumping result, returning 44071 1727204629.80736: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-c964-7471-000000000b3b] 44071 1727204629.80744: sending task result for task 127b8e07-fff9-c964-7471-000000000b3b 44071 1727204629.80826: done sending task result for task 127b8e07-fff9-c964-7471-000000000b3b 44071 1727204629.80829: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 44071 1727204629.80890: no more pending results, returning what we have 44071 1727204629.80894: results queue empty 44071 1727204629.80895: checking for any_errors_fatal 44071 1727204629.80903: done checking for 
any_errors_fatal 44071 1727204629.80904: checking for max_fail_percentage 44071 1727204629.80905: done checking for max_fail_percentage 44071 1727204629.80906: checking to see if all hosts have failed and the running result is not ok 44071 1727204629.80907: done checking to see if all hosts have failed 44071 1727204629.80907: getting the remaining hosts for this loop 44071 1727204629.80914: done getting the remaining hosts for this loop 44071 1727204629.80919: getting the next task for host managed-node2 44071 1727204629.80931: done getting next task for host managed-node2 44071 1727204629.80936: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204629.80943: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204629.80967: getting variables 44071 1727204629.80969: in VariableManager get_vars() 44071 1727204629.81011: Calling all_inventory to load vars for managed-node2 44071 1727204629.81014: Calling groups_inventory to load vars for managed-node2 44071 1727204629.81016: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204629.81026: Calling all_plugins_play to load vars for managed-node2 44071 1727204629.81029: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204629.81032: Calling groups_plugins_play to load vars for managed-node2 44071 1727204629.83443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204629.86084: done with get_vars() 44071 1727204629.86114: done getting variables 44071 1727204629.86201: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:03:49 -0400 (0:00:00.226) 0:00:42.178 ***** 44071 1727204629.86230: entering _queue_task() for managed-node2/package 44071 1727204629.86577: worker is 1 (out of 1 available) 44071 1727204629.86594: exiting _queue_task() for managed-node2/package 44071 1727204629.86609: done queuing things up, now waiting for results queue to drain 44071 1727204629.86611: waiting for pending results... 
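[Annotation] The task queued above (tasks/main.yml:85) only applies when the role is driven by the network_state variable; in this run network_state is empty, so 'network_state != {}' is False and the task is skipped. A sketch with the package list assumed from the task name:

    # Sketch only: the condition is from the log; the package names are assumed
    # from the task name, not read from the role source.
    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager
          - nmstate
        state: present
      when: network_state != {}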
44071 1727204629.86853: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204629.86985: in run() - task 127b8e07-fff9-c964-7471-000000000b3c 44071 1727204629.87020: variable 'ansible_search_path' from source: unknown 44071 1727204629.87024: variable 'ansible_search_path' from source: unknown 44071 1727204629.87057: calling self._execute() 44071 1727204629.87145: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204629.87153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204629.87162: variable 'omit' from source: magic vars 44071 1727204629.87518: variable 'ansible_distribution_major_version' from source: facts 44071 1727204629.87521: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204629.87630: variable 'network_state' from source: role '' defaults 44071 1727204629.87642: Evaluated conditional (network_state != {}): False 44071 1727204629.87652: when evaluation is False, skipping this task 44071 1727204629.87655: _execute() done 44071 1727204629.87658: dumping result to json 44071 1727204629.87660: done dumping result, returning 44071 1727204629.87669: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-c964-7471-000000000b3c] 44071 1727204629.87672: sending task result for task 127b8e07-fff9-c964-7471-000000000b3c 44071 1727204629.87797: done sending task result for task 127b8e07-fff9-c964-7471-000000000b3c 44071 1727204629.87799: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204629.87854: no more pending results, returning what we have 44071 1727204629.87858: results queue empty 44071 1727204629.87859: checking for any_errors_fatal 44071 1727204629.87868: done checking for any_errors_fatal 44071 1727204629.87869: checking for max_fail_percentage 44071 1727204629.87871: done checking for max_fail_percentage 44071 1727204629.87871: checking to see if all hosts have failed and the running result is not ok 44071 1727204629.87872: done checking to see if all hosts have failed 44071 1727204629.87873: getting the remaining hosts for this loop 44071 1727204629.87874: done getting the remaining hosts for this loop 44071 1727204629.87879: getting the next task for host managed-node2 44071 1727204629.87888: done getting next task for host managed-node2 44071 1727204629.87893: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204629.87899: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204629.87924: getting variables 44071 1727204629.87925: in VariableManager get_vars() 44071 1727204629.87962: Calling all_inventory to load vars for managed-node2 44071 1727204629.87974: Calling groups_inventory to load vars for managed-node2 44071 1727204629.87977: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204629.87987: Calling all_plugins_play to load vars for managed-node2 44071 1727204629.87990: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204629.87992: Calling groups_plugins_play to load vars for managed-node2 44071 1727204629.89362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204629.90855: done with get_vars() 44071 1727204629.90887: done getting variables 44071 1727204629.90942: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:03:49 -0400 (0:00:00.047) 0:00:42.226 ***** 44071 1727204629.90974: entering _queue_task() for managed-node2/package 44071 1727204629.91263: worker is 1 (out of 1 available) 44071 1727204629.91281: exiting _queue_task() for managed-node2/package 44071 1727204629.91297: done queuing things up, now waiting for results queue to drain 44071 1727204629.91298: waiting for pending results... 
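[Annotation] The python3-libnmstate task (tasks/main.yml:96) is gated by the same 'network_state != {}' condition and is skipped here for the same reason. For context, a play that supplies a non-empty network_state would flip both of these conditionals to True; the variable contents below are purely illustrative (an nmstate-style interface description), and the interface name is an assumption:

    # Illustrative only: the interface name and the network_state contents are
    # assumptions, not taken from this run.
    - hosts: managed-node2
      roles:
        - fedora.linux_system_roles.network
      vars:
        network_state:
          interfaces:
            - name: eth1
              type: ethernet
              state: up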
44071 1727204629.91520: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204629.91629: in run() - task 127b8e07-fff9-c964-7471-000000000b3d 44071 1727204629.91646: variable 'ansible_search_path' from source: unknown 44071 1727204629.91649: variable 'ansible_search_path' from source: unknown 44071 1727204629.91684: calling self._execute() 44071 1727204629.91768: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204629.91774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204629.91783: variable 'omit' from source: magic vars 44071 1727204629.92104: variable 'ansible_distribution_major_version' from source: facts 44071 1727204629.92114: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204629.92209: variable 'network_state' from source: role '' defaults 44071 1727204629.92220: Evaluated conditional (network_state != {}): False 44071 1727204629.92224: when evaluation is False, skipping this task 44071 1727204629.92227: _execute() done 44071 1727204629.92230: dumping result to json 44071 1727204629.92234: done dumping result, returning 44071 1727204629.92246: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-c964-7471-000000000b3d] 44071 1727204629.92249: sending task result for task 127b8e07-fff9-c964-7471-000000000b3d 44071 1727204629.92379: done sending task result for task 127b8e07-fff9-c964-7471-000000000b3d 44071 1727204629.92381: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204629.92447: no more pending results, returning what we have 44071 1727204629.92451: results queue empty 44071 1727204629.92453: checking for any_errors_fatal 44071 1727204629.92462: done checking for any_errors_fatal 44071 1727204629.92463: checking for max_fail_percentage 44071 1727204629.92464: done checking for max_fail_percentage 44071 1727204629.92467: checking to see if all hosts have failed and the running result is not ok 44071 1727204629.92468: done checking to see if all hosts have failed 44071 1727204629.92468: getting the remaining hosts for this loop 44071 1727204629.92470: done getting the remaining hosts for this loop 44071 1727204629.92475: getting the next task for host managed-node2 44071 1727204629.92484: done getting next task for host managed-node2 44071 1727204629.92489: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204629.92494: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204629.92516: getting variables 44071 1727204629.92518: in VariableManager get_vars() 44071 1727204629.92562: Calling all_inventory to load vars for managed-node2 44071 1727204629.92576: Calling groups_inventory to load vars for managed-node2 44071 1727204629.92580: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204629.92595: Calling all_plugins_play to load vars for managed-node2 44071 1727204629.92598: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204629.92600: Calling groups_plugins_play to load vars for managed-node2 44071 1727204629.94277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204629.96483: done with get_vars() 44071 1727204629.96527: done getting variables 44071 1727204629.96602: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:03:49 -0400 (0:00:00.056) 0:00:42.282 ***** 44071 1727204629.96643: entering _queue_task() for managed-node2/service 44071 1727204629.97038: worker is 1 (out of 1 available) 44071 1727204629.97053: exiting _queue_task() for managed-node2/service 44071 1727204629.97271: done queuing things up, now waiting for results queue to drain 44071 1727204629.97274: waiting for pending results... 
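The task queued here is the conditional NetworkManager restart. Judging only from the service action plugin and the false_condition logged a little further on (__network_wireless_connections_defined or __network_team_connections_defined), the task at roles/network/tasks/main.yml:109 is roughly the following; the service name and state are assumptions taken from the task title:

  # rough sketch; the real role likely parameterizes the service name
  - name: Restart NetworkManager due to wireless or team interfaces
    service:
      name: NetworkManager
      state: restarted
    when: __network_wireless_connections_defined or __network_team_connections_defined

Neither flag evaluates true for the network_connections used in this run, so this task is also skipped once the filter and test plugins below have been loaded to evaluate the conditionals.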
44071 1727204629.97625: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204629.97632: in run() - task 127b8e07-fff9-c964-7471-000000000b3e 44071 1727204629.97635: variable 'ansible_search_path' from source: unknown 44071 1727204629.97638: variable 'ansible_search_path' from source: unknown 44071 1727204629.97664: calling self._execute() 44071 1727204629.97781: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204629.97795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204629.97812: variable 'omit' from source: magic vars 44071 1727204629.98255: variable 'ansible_distribution_major_version' from source: facts 44071 1727204629.98281: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204629.98417: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204629.98651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204630.01799: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204630.01968: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204630.01975: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204630.01983: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204630.02015: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204630.02111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204630.02147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204630.02181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204630.02232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204630.02251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204630.02308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204630.02340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204630.02374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 44071 1727204630.02570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204630.02574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204630.02577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204630.02579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204630.02581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204630.02593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204630.02612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204630.02809: variable 'network_connections' from source: include params 44071 1727204630.02828: variable 'interface' from source: play vars 44071 1727204630.02908: variable 'interface' from source: play vars 44071 1727204630.02998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204630.03243: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204630.03278: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204630.03351: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204630.03354: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204630.03405: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204630.03432: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204630.03471: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204630.03504: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204630.03580: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204630.03892: variable 'network_connections' from source: include params 44071 1727204630.03895: variable 'interface' 
from source: play vars 44071 1727204630.03955: variable 'interface' from source: play vars 44071 1727204630.04002: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204630.04070: when evaluation is False, skipping this task 44071 1727204630.04073: _execute() done 44071 1727204630.04076: dumping result to json 44071 1727204630.04078: done dumping result, returning 44071 1727204630.04081: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000000b3e] 44071 1727204630.04083: sending task result for task 127b8e07-fff9-c964-7471-000000000b3e 44071 1727204630.04376: done sending task result for task 127b8e07-fff9-c964-7471-000000000b3e 44071 1727204630.04387: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204630.04439: no more pending results, returning what we have 44071 1727204630.04443: results queue empty 44071 1727204630.04444: checking for any_errors_fatal 44071 1727204630.04452: done checking for any_errors_fatal 44071 1727204630.04452: checking for max_fail_percentage 44071 1727204630.04454: done checking for max_fail_percentage 44071 1727204630.04455: checking to see if all hosts have failed and the running result is not ok 44071 1727204630.04456: done checking to see if all hosts have failed 44071 1727204630.04456: getting the remaining hosts for this loop 44071 1727204630.04458: done getting the remaining hosts for this loop 44071 1727204630.04463: getting the next task for host managed-node2 44071 1727204630.04474: done getting next task for host managed-node2 44071 1727204630.04479: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204630.04484: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204630.04506: getting variables 44071 1727204630.04508: in VariableManager get_vars() 44071 1727204630.04549: Calling all_inventory to load vars for managed-node2 44071 1727204630.04552: Calling groups_inventory to load vars for managed-node2 44071 1727204630.04554: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204630.04771: Calling all_plugins_play to load vars for managed-node2 44071 1727204630.04776: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204630.04781: Calling groups_plugins_play to load vars for managed-node2 44071 1727204630.06598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204630.08785: done with get_vars() 44071 1727204630.08826: done getting variables 44071 1727204630.08894: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:03:50 -0400 (0:00:00.122) 0:00:42.405 ***** 44071 1727204630.08936: entering _queue_task() for managed-node2/service 44071 1727204630.09328: worker is 1 (out of 1 available) 44071 1727204630.09341: exiting _queue_task() for managed-node2/service 44071 1727204630.09357: done queuing things up, now waiting for results queue to drain 44071 1727204630.09359: waiting for pending results... 44071 1727204630.09708: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204630.09880: in run() - task 127b8e07-fff9-c964-7471-000000000b3f 44071 1727204630.09905: variable 'ansible_search_path' from source: unknown 44071 1727204630.09914: variable 'ansible_search_path' from source: unknown 44071 1727204630.09961: calling self._execute() 44071 1727204630.10088: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204630.10113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204630.10124: variable 'omit' from source: magic vars 44071 1727204630.10494: variable 'ansible_distribution_major_version' from source: facts 44071 1727204630.10508: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204630.10639: variable 'network_provider' from source: set_fact 44071 1727204630.10643: variable 'network_state' from source: role '' defaults 44071 1727204630.10656: Evaluated conditional (network_provider == "nm" or network_state != {}): True 44071 1727204630.10662: variable 'omit' from source: magic vars 44071 1727204630.10713: variable 'omit' from source: magic vars 44071 1727204630.10736: variable 'network_service_name' from source: role '' defaults 44071 1727204630.10794: variable 'network_service_name' from source: role '' defaults 44071 1727204630.10878: variable '__network_provider_setup' from source: role '' defaults 44071 1727204630.10883: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204630.10935: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204630.10942: variable '__network_packages_default_nm' from source: role '' defaults 
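Unlike the two previous tasks, this one runs: the conditional network_provider == "nm" or network_state != {} evaluates to True. Combining that conditional with the systemd module_args recorded in the result further down ({"name": "NetworkManager", "state": "started", "enabled": true}), the task at roles/network/tasks/main.yml:122 behaves roughly like:

  # sketch based on the evaluated conditional and the module_args in the result below
  - name: Enable and start NetworkManager
    service:
      name: "{{ network_service_name }}"   # resolves to NetworkManager in this run
      state: started
      enabled: true
    when: network_provider == "nm" or network_state != {}

The entries that follow continue resolving the role's provider defaults (packages, wpa_supplicant, initscripts), set up the ssh connection vars, and finally transfer and execute AnsiballZ_systemd.py on managed-node2.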
44071 1727204630.10992: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204630.11169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204630.13274: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204630.13278: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204630.13281: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204630.13307: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204630.13341: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204630.13432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204630.13471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204630.13491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204630.13519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204630.13539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204630.13575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204630.13593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204630.13611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204630.13642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204630.13662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204630.13851: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204630.13946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204630.13963: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204630.13987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204630.14014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204630.14026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204630.14100: variable 'ansible_python' from source: facts 44071 1727204630.14114: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204630.14177: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204630.14238: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204630.14333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204630.14353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204630.14372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204630.14399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204630.14415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204630.14454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204630.14476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204630.14493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204630.14522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204630.14535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204630.14644: variable 'network_connections' from 
source: include params 44071 1727204630.14648: variable 'interface' from source: play vars 44071 1727204630.14702: variable 'interface' from source: play vars 44071 1727204630.14789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204630.14928: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204630.14971: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204630.15006: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204630.15037: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204630.15255: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204630.15283: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204630.15310: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204630.15335: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204630.15377: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204630.15592: variable 'network_connections' from source: include params 44071 1727204630.15598: variable 'interface' from source: play vars 44071 1727204630.15662: variable 'interface' from source: play vars 44071 1727204630.15703: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204630.15771: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204630.15977: variable 'network_connections' from source: include params 44071 1727204630.15981: variable 'interface' from source: play vars 44071 1727204630.16033: variable 'interface' from source: play vars 44071 1727204630.16057: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204630.16115: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204630.16322: variable 'network_connections' from source: include params 44071 1727204630.16325: variable 'interface' from source: play vars 44071 1727204630.16382: variable 'interface' from source: play vars 44071 1727204630.16431: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204630.16477: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204630.16484: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204630.16530: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204630.16682: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204630.17024: variable 'network_connections' from source: include params 44071 1727204630.17029: variable 'interface' from source: play vars 44071 1727204630.17077: variable 'interface' from 
source: play vars 44071 1727204630.17084: variable 'ansible_distribution' from source: facts 44071 1727204630.17087: variable '__network_rh_distros' from source: role '' defaults 44071 1727204630.17094: variable 'ansible_distribution_major_version' from source: facts 44071 1727204630.17113: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204630.17237: variable 'ansible_distribution' from source: facts 44071 1727204630.17243: variable '__network_rh_distros' from source: role '' defaults 44071 1727204630.17246: variable 'ansible_distribution_major_version' from source: facts 44071 1727204630.17256: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204630.17388: variable 'ansible_distribution' from source: facts 44071 1727204630.17392: variable '__network_rh_distros' from source: role '' defaults 44071 1727204630.17397: variable 'ansible_distribution_major_version' from source: facts 44071 1727204630.17424: variable 'network_provider' from source: set_fact 44071 1727204630.17446: variable 'omit' from source: magic vars 44071 1727204630.17474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204630.17496: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204630.17513: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204630.17527: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204630.17536: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204630.17563: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204630.17567: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204630.17571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204630.17645: Set connection var ansible_connection to ssh 44071 1727204630.17656: Set connection var ansible_timeout to 10 44071 1727204630.17660: Set connection var ansible_pipelining to False 44071 1727204630.17670: Set connection var ansible_shell_type to sh 44071 1727204630.17673: Set connection var ansible_shell_executable to /bin/sh 44071 1727204630.17688: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204630.17775: variable 'ansible_shell_executable' from source: unknown 44071 1727204630.17778: variable 'ansible_connection' from source: unknown 44071 1727204630.17780: variable 'ansible_module_compression' from source: unknown 44071 1727204630.17782: variable 'ansible_shell_type' from source: unknown 44071 1727204630.17784: variable 'ansible_shell_executable' from source: unknown 44071 1727204630.17785: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204630.17787: variable 'ansible_pipelining' from source: unknown 44071 1727204630.17789: variable 'ansible_timeout' from source: unknown 44071 1727204630.17791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204630.17913: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204630.17935: variable 'omit' from source: magic vars 44071 1727204630.17945: starting attempt loop 44071 1727204630.17952: running the handler 44071 1727204630.18040: variable 'ansible_facts' from source: unknown 44071 1727204630.24885: _low_level_execute_command(): starting 44071 1727204630.24890: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204630.25428: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204630.25434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204630.25438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204630.25489: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204630.25492: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204630.25497: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204630.25580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204630.27343: stdout chunk (state=3): >>>/root <<< 44071 1727204630.27453: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204630.27525: stderr chunk (state=3): >>><<< 44071 1727204630.27528: stdout chunk (state=3): >>><<< 44071 1727204630.27543: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 44071 1727204630.27555: _low_level_execute_command(): starting 44071 1727204630.27562: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204630.2754521-46551-144214778094294 `" && echo ansible-tmp-1727204630.2754521-46551-144214778094294="` echo /root/.ansible/tmp/ansible-tmp-1727204630.2754521-46551-144214778094294 `" ) && sleep 0' 44071 1727204630.28170: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204630.28247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204630.28250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204630.28333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204630.30293: stdout chunk (state=3): >>>ansible-tmp-1727204630.2754521-46551-144214778094294=/root/.ansible/tmp/ansible-tmp-1727204630.2754521-46551-144214778094294 <<< 44071 1727204630.30411: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204630.30480: stderr chunk (state=3): >>><<< 44071 1727204630.30483: stdout chunk (state=3): >>><<< 44071 1727204630.30503: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204630.2754521-46551-144214778094294=/root/.ansible/tmp/ansible-tmp-1727204630.2754521-46551-144214778094294 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 44071 1727204630.30544: variable 'ansible_module_compression' from source: unknown 44071 1727204630.30585: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 44071 1727204630.30637: variable 'ansible_facts' from source: unknown 44071 1727204630.30778: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204630.2754521-46551-144214778094294/AnsiballZ_systemd.py 44071 1727204630.30901: Sending initial data 44071 1727204630.30904: Sent initial data (156 bytes) 44071 1727204630.31618: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204630.31688: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204630.31705: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204630.31730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204630.31944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204630.33457: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204630.33524: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204630.33593: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpznj5ylbe /root/.ansible/tmp/ansible-tmp-1727204630.2754521-46551-144214778094294/AnsiballZ_systemd.py <<< 44071 1727204630.33602: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204630.2754521-46551-144214778094294/AnsiballZ_systemd.py" <<< 44071 1727204630.33661: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpznj5ylbe" to remote "/root/.ansible/tmp/ansible-tmp-1727204630.2754521-46551-144214778094294/AnsiballZ_systemd.py" <<< 44071 1727204630.33671: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204630.2754521-46551-144214778094294/AnsiballZ_systemd.py" <<< 44071 1727204630.34950: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204630.35028: stderr chunk (state=3): >>><<< 44071 1727204630.35032: stdout chunk (state=3): >>><<< 44071 1727204630.35054: done transferring module to remote 44071 1727204630.35064: _low_level_execute_command(): starting 44071 1727204630.35072: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204630.2754521-46551-144214778094294/ /root/.ansible/tmp/ansible-tmp-1727204630.2754521-46551-144214778094294/AnsiballZ_systemd.py && sleep 0' 44071 1727204630.35538: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204630.35542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204630.35577: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204630.35580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204630.35583: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204630.35585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204630.35648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204630.35651: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204630.35653: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204630.35763: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204630.37800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204630.37805: stdout chunk (state=3): >>><<< 44071 1727204630.37807: stderr chunk (state=3): >>><<< 44071 1727204630.37809: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204630.37811: _low_level_execute_command(): starting 44071 1727204630.37813: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204630.2754521-46551-144214778094294/AnsiballZ_systemd.py && sleep 0' 44071 1727204630.38368: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204630.38389: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204630.38405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204630.38426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204630.38446: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204630.38459: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204630.38553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204630.38583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204630.38690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204630.70539: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", 
"RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4509696", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3504476160", "CPUUsageNSec": "1491566000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": 
"infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", 
"BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 44071 1727204630.72678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204630.72683: stdout chunk (state=3): >>><<< 44071 1727204630.72686: stderr chunk (state=3): >>><<< 44071 1727204630.72690: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4509696", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3504476160", "CPUUsageNSec": "1491566000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": 
"infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204630.73074: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204630.2754521-46551-144214778094294/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204630.73078: _low_level_execute_command(): starting 44071 1727204630.73081: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204630.2754521-46551-144214778094294/ > /dev/null 2>&1 && sleep 0' 44071 1727204630.75011: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204630.75022: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204630.75029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204630.75047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204630.75106: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204630.75123: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204630.75336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 
1727204630.75456: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204630.75576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204630.77481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204630.77550: stderr chunk (state=3): >>><<< 44071 1727204630.77554: stdout chunk (state=3): >>><<< 44071 1727204630.77569: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204630.77577: handler run complete 44071 1727204630.77660: attempt loop complete, returning result 44071 1727204630.77667: _execute() done 44071 1727204630.77670: dumping result to json 44071 1727204630.77716: done dumping result, returning 44071 1727204630.77719: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-c964-7471-000000000b3f] 44071 1727204630.77722: sending task result for task 127b8e07-fff9-c964-7471-000000000b3f 44071 1727204630.77957: done sending task result for task 127b8e07-fff9-c964-7471-000000000b3f 44071 1727204630.77960: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204630.78025: no more pending results, returning what we have 44071 1727204630.78028: results queue empty 44071 1727204630.78029: checking for any_errors_fatal 44071 1727204630.78035: done checking for any_errors_fatal 44071 1727204630.78036: checking for max_fail_percentage 44071 1727204630.78038: done checking for max_fail_percentage 44071 1727204630.78038: checking to see if all hosts have failed and the running result is not ok 44071 1727204630.78039: done checking to see if all hosts have failed 44071 1727204630.78042: getting the remaining hosts for this loop 44071 1727204630.78044: done getting the remaining hosts for this loop 44071 1727204630.78048: getting the next task for host managed-node2 44071 1727204630.78056: done getting next task for host managed-node2 44071 1727204630.78059: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204630.78064: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204630.78081: getting variables 44071 1727204630.78082: in VariableManager get_vars() 44071 1727204630.78118: Calling all_inventory to load vars for managed-node2 44071 1727204630.78121: Calling groups_inventory to load vars for managed-node2 44071 1727204630.78161: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204630.78206: Calling all_plugins_play to load vars for managed-node2 44071 1727204630.78235: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204630.78242: Calling groups_plugins_play to load vars for managed-node2 44071 1727204630.85819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204630.87396: done with get_vars() 44071 1727204630.87427: done getting variables 44071 1727204630.87472: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:03:50 -0400 (0:00:00.785) 0:00:43.191 ***** 44071 1727204630.87501: entering _queue_task() for managed-node2/service 44071 1727204630.87817: worker is 1 (out of 1 available) 44071 1727204630.87834: exiting _queue_task() for managed-node2/service 44071 1727204630.87849: done queuing things up, now waiting for results queue to drain 44071 1727204630.87853: waiting for pending results... 
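Reading aid: the task result reported just above ("Enable and start NetworkManager", ok but censored because no_log: true was in effect) came from a systemd module call, and the module_args echoed in the stdout chunk show exactly what was passed. A minimal standalone task that would issue the same call is sketched below. This is a reconstruction from the logged arguments, not the literal task in roles/network/tasks/main.yml; the task name and the ansible.builtin.systemd FQCN (the log resolves the call through the ansible.legacy.systemd alias) are the only details not taken verbatim from the log.

    # Sketch reconstructed from the logged module_args; illustrative only,
    # not the actual role source (the role derives these values from its vars).
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true
        daemon_reload: false
        daemon_reexec: false
        scope: system
        no_block: false
      no_log: true   # why the on-screen result appears as "censored"

The large "status" dictionary returned alongside "changed": false is the property list systemd reports for NetworkManager.service (ActiveState, UnitFileState, FragmentPath, and so on), comparable to what systemctl show NetworkManager.service prints.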
44071 1727204630.88080: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204630.88379: in run() - task 127b8e07-fff9-c964-7471-000000000b40 44071 1727204630.88385: variable 'ansible_search_path' from source: unknown 44071 1727204630.88391: variable 'ansible_search_path' from source: unknown 44071 1727204630.88441: calling self._execute() 44071 1727204630.88573: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204630.88633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204630.88637: variable 'omit' from source: magic vars 44071 1727204630.89081: variable 'ansible_distribution_major_version' from source: facts 44071 1727204630.89092: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204630.89231: variable 'network_provider' from source: set_fact 44071 1727204630.89250: Evaluated conditional (network_provider == "nm"): True 44071 1727204630.89388: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204630.89476: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204630.89648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204630.92468: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204630.92473: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204630.92476: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204630.92479: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204630.92496: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204630.92616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204630.92643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204630.92668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204630.92697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204630.92708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204630.92747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204630.92771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 44071 1727204630.92794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204630.92821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204630.92833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204630.92867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204630.92891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204630.92909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204630.92935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204630.92948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204630.93068: variable 'network_connections' from source: include params 44071 1727204630.93089: variable 'interface' from source: play vars 44071 1727204630.93164: variable 'interface' from source: play vars 44071 1727204630.93242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204630.93419: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204630.93470: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204630.93492: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204630.93514: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204630.93552: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204630.93574: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204630.93605: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204630.93626: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
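Reading aid: the records that follow decide whether wpa_supplicant needs to be enabled. The role has already established network_provider == "nm" and now loads __network_wpa_supplicant_required together with the network_connections/interface variables. A rough sketch of a service task guarded this way is shown below; the variable names are taken from the log, while the module choice (ansible.builtin.service, matching the 'service' action plugin loaded earlier) and the exact when: composition are illustrative assumptions, not the literal content of tasks/main.yml:133.

    # Illustrative shape of a conditionally guarded service task; not the
    # literal role source. Variable names come from the log records above.
    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant
        state: started
        enabled: true
      when:
        - network_provider == "nm"
        - __network_wpa_supplicant_required

In this run the requirement evaluates to False, so no module is executed for this task, as the "Evaluated conditional (__network_wpa_supplicant_required): False" and skipping: [managed-node2] records that follow show.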
44071 1727204630.93675: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204630.93957: variable 'network_connections' from source: include params 44071 1727204630.93961: variable 'interface' from source: play vars 44071 1727204630.94020: variable 'interface' from source: play vars 44071 1727204630.94063: Evaluated conditional (__network_wpa_supplicant_required): False 44071 1727204630.94068: when evaluation is False, skipping this task 44071 1727204630.94071: _execute() done 44071 1727204630.94073: dumping result to json 44071 1727204630.94075: done dumping result, returning 44071 1727204630.94084: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-c964-7471-000000000b40] 44071 1727204630.94100: sending task result for task 127b8e07-fff9-c964-7471-000000000b40 44071 1727204630.94200: done sending task result for task 127b8e07-fff9-c964-7471-000000000b40 44071 1727204630.94203: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 44071 1727204630.94276: no more pending results, returning what we have 44071 1727204630.94279: results queue empty 44071 1727204630.94280: checking for any_errors_fatal 44071 1727204630.94301: done checking for any_errors_fatal 44071 1727204630.94301: checking for max_fail_percentage 44071 1727204630.94303: done checking for max_fail_percentage 44071 1727204630.94304: checking to see if all hosts have failed and the running result is not ok 44071 1727204630.94305: done checking to see if all hosts have failed 44071 1727204630.94305: getting the remaining hosts for this loop 44071 1727204630.94307: done getting the remaining hosts for this loop 44071 1727204630.94313: getting the next task for host managed-node2 44071 1727204630.94322: done getting next task for host managed-node2 44071 1727204630.94329: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204630.94334: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204630.94356: getting variables 44071 1727204630.94358: in VariableManager get_vars() 44071 1727204630.94395: Calling all_inventory to load vars for managed-node2 44071 1727204630.94398: Calling groups_inventory to load vars for managed-node2 44071 1727204630.94400: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204630.94412: Calling all_plugins_play to load vars for managed-node2 44071 1727204630.94415: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204630.94418: Calling groups_plugins_play to load vars for managed-node2 44071 1727204630.96199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204630.98495: done with get_vars() 44071 1727204630.98539: done getting variables 44071 1727204630.98623: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:03:50 -0400 (0:00:00.111) 0:00:43.303 ***** 44071 1727204630.98664: entering _queue_task() for managed-node2/service 44071 1727204630.99299: worker is 1 (out of 1 available) 44071 1727204630.99315: exiting _queue_task() for managed-node2/service 44071 1727204630.99327: done queuing things up, now waiting for results queue to drain 44071 1727204630.99329: waiting for pending results... 44071 1727204630.99569: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204630.99631: in run() - task 127b8e07-fff9-c964-7471-000000000b41 44071 1727204630.99656: variable 'ansible_search_path' from source: unknown 44071 1727204630.99675: variable 'ansible_search_path' from source: unknown 44071 1727204630.99721: calling self._execute() 44071 1727204630.99835: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204630.99883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204630.99887: variable 'omit' from source: magic vars 44071 1727204631.00289: variable 'ansible_distribution_major_version' from source: facts 44071 1727204631.00313: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204631.00452: variable 'network_provider' from source: set_fact 44071 1727204631.00540: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204631.00543: when evaluation is False, skipping this task 44071 1727204631.00546: _execute() done 44071 1727204631.00549: dumping result to json 44071 1727204631.00551: done dumping result, returning 44071 1727204631.00554: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-c964-7471-000000000b41] 44071 1727204631.00557: sending task result for task 127b8e07-fff9-c964-7471-000000000b41 44071 1727204631.00635: done sending task result for task 127b8e07-fff9-c964-7471-000000000b41 44071 1727204631.00639: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 
1727204631.00696: no more pending results, returning what we have 44071 1727204631.00701: results queue empty 44071 1727204631.00702: checking for any_errors_fatal 44071 1727204631.00711: done checking for any_errors_fatal 44071 1727204631.00711: checking for max_fail_percentage 44071 1727204631.00713: done checking for max_fail_percentage 44071 1727204631.00714: checking to see if all hosts have failed and the running result is not ok 44071 1727204631.00715: done checking to see if all hosts have failed 44071 1727204631.00715: getting the remaining hosts for this loop 44071 1727204631.00717: done getting the remaining hosts for this loop 44071 1727204631.00723: getting the next task for host managed-node2 44071 1727204631.00732: done getting next task for host managed-node2 44071 1727204631.00736: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204631.00743: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204631.00774: getting variables 44071 1727204631.00777: in VariableManager get_vars() 44071 1727204631.00820: Calling all_inventory to load vars for managed-node2 44071 1727204631.00825: Calling groups_inventory to load vars for managed-node2 44071 1727204631.00829: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204631.00845: Calling all_plugins_play to load vars for managed-node2 44071 1727204631.00849: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204631.00852: Calling groups_plugins_play to load vars for managed-node2 44071 1727204631.02967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204631.05211: done with get_vars() 44071 1727204631.05253: done getting variables 44071 1727204631.05324: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:03:51 -0400 (0:00:00.066) 0:00:43.370 ***** 44071 1727204631.05367: entering _queue_task() for managed-node2/copy 44071 1727204631.06003: worker is 1 (out of 1 available) 44071 1727204631.06018: exiting _queue_task() for managed-node2/copy 44071 1727204631.06035: done queuing things up, now waiting for results queue to drain 44071 1727204631.06037: waiting for pending results... 44071 1727204631.06284: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204631.06371: in run() - task 127b8e07-fff9-c964-7471-000000000b42 44071 1727204631.06401: variable 'ansible_search_path' from source: unknown 44071 1727204631.06412: variable 'ansible_search_path' from source: unknown 44071 1727204631.06468: calling self._execute() 44071 1727204631.06595: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204631.06613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204631.06628: variable 'omit' from source: magic vars 44071 1727204631.07099: variable 'ansible_distribution_major_version' from source: facts 44071 1727204631.07128: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204631.07275: variable 'network_provider' from source: set_fact 44071 1727204631.07286: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204631.07363: when evaluation is False, skipping this task 44071 1727204631.07369: _execute() done 44071 1727204631.07372: dumping result to json 44071 1727204631.07375: done dumping result, returning 44071 1727204631.07378: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-c964-7471-000000000b42] 44071 1727204631.07381: sending task result for task 127b8e07-fff9-c964-7471-000000000b42 skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 44071 1727204631.07643: no more pending results, returning what we have 44071 
1727204631.07647: results queue empty 44071 1727204631.07648: checking for any_errors_fatal 44071 1727204631.07656: done checking for any_errors_fatal 44071 1727204631.07657: checking for max_fail_percentage 44071 1727204631.07658: done checking for max_fail_percentage 44071 1727204631.07659: checking to see if all hosts have failed and the running result is not ok 44071 1727204631.07660: done checking to see if all hosts have failed 44071 1727204631.07661: getting the remaining hosts for this loop 44071 1727204631.07662: done getting the remaining hosts for this loop 44071 1727204631.07669: getting the next task for host managed-node2 44071 1727204631.07679: done getting next task for host managed-node2 44071 1727204631.07684: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204631.07690: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204631.07715: getting variables 44071 1727204631.07717: in VariableManager get_vars() 44071 1727204631.07758: Calling all_inventory to load vars for managed-node2 44071 1727204631.07761: Calling groups_inventory to load vars for managed-node2 44071 1727204631.07764: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204631.07982: Calling all_plugins_play to load vars for managed-node2 44071 1727204631.07986: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204631.07991: Calling groups_plugins_play to load vars for managed-node2 44071 1727204631.08684: done sending task result for task 127b8e07-fff9-c964-7471-000000000b42 44071 1727204631.08688: WORKER PROCESS EXITING 44071 1727204631.09922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204631.12068: done with get_vars() 44071 1727204631.12101: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:03:51 -0400 (0:00:00.068) 0:00:43.438 ***** 44071 1727204631.12177: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204631.12470: worker is 1 (out of 1 available) 44071 1727204631.12488: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204631.12502: done queuing things up, now waiting for results queue to drain 44071 1727204631.12504: waiting for pending results... 44071 1727204631.12824: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204631.12941: in run() - task 127b8e07-fff9-c964-7471-000000000b43 44071 1727204631.12958: variable 'ansible_search_path' from source: unknown 44071 1727204631.12961: variable 'ansible_search_path' from source: unknown 44071 1727204631.13123: calling self._execute() 44071 1727204631.13156: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204631.13162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204631.13172: variable 'omit' from source: magic vars 44071 1727204631.13526: variable 'ansible_distribution_major_version' from source: facts 44071 1727204631.13537: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204631.13545: variable 'omit' from source: magic vars 44071 1727204631.13596: variable 'omit' from source: magic vars 44071 1727204631.13740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204631.16877: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204631.16881: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204631.16884: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204631.16886: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204631.16889: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204631.16892: variable 'network_provider' from source: set_fact 44071 1727204631.17024: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204631.17053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204631.17081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204631.17131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204631.17149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204631.17252: variable 'omit' from source: magic vars 44071 1727204631.17383: variable 'omit' from source: magic vars 44071 1727204631.17505: variable 'network_connections' from source: include params 44071 1727204631.17518: variable 'interface' from source: play vars 44071 1727204631.17594: variable 'interface' from source: play vars 44071 1727204631.17789: variable 'omit' from source: magic vars 44071 1727204631.17803: variable '__lsr_ansible_managed' from source: task vars 44071 1727204631.17864: variable '__lsr_ansible_managed' from source: task vars 44071 1727204631.18481: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 44071 1727204631.18897: Loaded config def from plugin (lookup/template) 44071 1727204631.18935: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 44071 1727204631.18955: File lookup term: get_ansible_managed.j2 44071 1727204631.18963: variable 'ansible_search_path' from source: unknown 44071 1727204631.19043: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 44071 1727204631.19050: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 44071 1727204631.19053: variable 'ansible_search_path' from source: unknown 44071 1727204631.29437: variable 'ansible_managed' from source: unknown 44071 1727204631.30075: variable 
'omit' from source: magic vars 44071 1727204631.30079: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204631.30082: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204631.30202: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204631.30213: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204631.30228: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204631.30263: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204631.30316: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204631.30326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204631.30556: Set connection var ansible_connection to ssh 44071 1727204631.30572: Set connection var ansible_timeout to 10 44071 1727204631.30602: Set connection var ansible_pipelining to False 44071 1727204631.30646: Set connection var ansible_shell_type to sh 44071 1727204631.30746: Set connection var ansible_shell_executable to /bin/sh 44071 1727204631.30749: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204631.30752: variable 'ansible_shell_executable' from source: unknown 44071 1727204631.30754: variable 'ansible_connection' from source: unknown 44071 1727204631.30756: variable 'ansible_module_compression' from source: unknown 44071 1727204631.30758: variable 'ansible_shell_type' from source: unknown 44071 1727204631.30760: variable 'ansible_shell_executable' from source: unknown 44071 1727204631.30763: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204631.30768: variable 'ansible_pipelining' from source: unknown 44071 1727204631.30770: variable 'ansible_timeout' from source: unknown 44071 1727204631.30772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204631.31105: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204631.31307: variable 'omit' from source: magic vars 44071 1727204631.31310: starting attempt loop 44071 1727204631.31313: running the handler 44071 1727204631.31316: _low_level_execute_command(): starting 44071 1727204631.31318: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204631.32841: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204631.33021: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204631.33069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204631.33228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204631.33283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204631.33317: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204631.33462: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204631.35257: stdout chunk (state=3): >>>/root <<< 44071 1727204631.35367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204631.35673: stderr chunk (state=3): >>><<< 44071 1727204631.35677: stdout chunk (state=3): >>><<< 44071 1727204631.35680: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204631.35683: _low_level_execute_command(): starting 44071 1727204631.35686: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204631.3561397-46594-265140441177056 `" && echo ansible-tmp-1727204631.3561397-46594-265140441177056="` echo /root/.ansible/tmp/ansible-tmp-1727204631.3561397-46594-265140441177056 `" ) && sleep 0' 44071 1727204631.37083: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204631.37264: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204631.37272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204631.37275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204631.37328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204631.37405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204631.39410: stdout chunk (state=3): >>>ansible-tmp-1727204631.3561397-46594-265140441177056=/root/.ansible/tmp/ansible-tmp-1727204631.3561397-46594-265140441177056 <<< 44071 1727204631.39558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204631.39972: stderr chunk (state=3): >>><<< 44071 1727204631.39978: stdout chunk (state=3): >>><<< 44071 1727204631.39982: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204631.3561397-46594-265140441177056=/root/.ansible/tmp/ansible-tmp-1727204631.3561397-46594-265140441177056 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204631.39986: variable 'ansible_module_compression' from source: unknown 44071 1727204631.39988: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 44071 1727204631.39991: variable 'ansible_facts' from source: unknown 44071 1727204631.40173: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204631.3561397-46594-265140441177056/AnsiballZ_network_connections.py 44071 1727204631.40292: Sending initial data 44071 1727204631.40296: Sent initial data (168 bytes) 44071 1727204631.41097: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204631.41130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204631.41143: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 44071 1727204631.41219: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204631.41224: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44071 1727204631.41226: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204631.41237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204631.41456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204631.41714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204631.41791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204631.43470: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 44071 1727204631.43476: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 44071 1727204631.43484: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 44071 1727204631.43491: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 44071 1727204631.43498: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 44071 1727204631.43508: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 44071 1727204631.43511: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 44071 1727204631.43519: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 44071 1727204631.43525: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44071 1727204631.43540: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204631.43862: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204631.43868: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpvh0e_qmq /root/.ansible/tmp/ansible-tmp-1727204631.3561397-46594-265140441177056/AnsiballZ_network_connections.py <<< 44071 1727204631.43871: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204631.3561397-46594-265140441177056/AnsiballZ_network_connections.py" <<< 44071 1727204631.44025: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpvh0e_qmq" to remote "/root/.ansible/tmp/ansible-tmp-1727204631.3561397-46594-265140441177056/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204631.3561397-46594-265140441177056/AnsiballZ_network_connections.py" <<< 44071 1727204631.47076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204631.47082: stderr chunk (state=3): >>><<< 44071 1727204631.47084: stdout chunk (state=3): >>><<< 44071 1727204631.47086: done transferring module to remote 44071 1727204631.47093: _low_level_execute_command(): starting 44071 1727204631.47096: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204631.3561397-46594-265140441177056/ /root/.ansible/tmp/ansible-tmp-1727204631.3561397-46594-265140441177056/AnsiballZ_network_connections.py && sleep 0' 44071 1727204631.48376: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204631.48464: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204631.48479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204631.48598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204631.48601: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204631.48604: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204631.48606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204631.48608: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204631.48610: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204631.48612: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44071 1727204631.48657: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204631.48803: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204631.48950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204631.50846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204631.51116: stderr chunk (state=3): >>><<< 44071 1727204631.51120: stdout chunk (state=3): >>><<< 44071 1727204631.51123: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204631.51125: _low_level_execute_command(): starting 44071 1727204631.51128: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204631.3561397-46594-265140441177056/AnsiballZ_network_connections.py && sleep 0' 44071 1727204631.52457: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204631.52462: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204631.52467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204631.52675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204631.52775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204631.53012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204631.82526: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 8a139112-7ef3-44ae-a404-065d84fc2b3c\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": 
[{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}}<<< 44071 1727204631.82584: stdout chunk (state=3): >>> <<< 44071 1727204631.86187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204631.86191: stderr chunk (state=3): >>>Shared connection to 10.31.47.73 closed. <<< 44071 1727204631.86261: stderr chunk (state=3): >>><<< 44071 1727204631.86267: stdout chunk (state=3): >>><<< 44071 1727204631.86276: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 8a139112-7ef3-44ae-a404-065d84fc2b3c\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
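
The module run that just completed maps back to role input. A minimal sketch, assuming a play targeting managed-node2, of the network_connections variable that would produce the module_args recorded in the _low_level_execute_command() result above; the play wrapper is assumed, the connection profile is copied from the logged invocation:

    # Sketch only: play/host wrapper assumed; the profile mirrors the module_args above.
    - hosts: managed-node2
      vars:
        network_connections:
          - name: statebr
            persistent_state: present
            type: bridge
            ip:
              dhcp4: false
              auto6: false
      roles:
        - fedora.linux_system_roles.network

The role renders these profiles into the fedora.linux_system_roles.network_connections call seen above; the double-underscore arguments (__header, __debug_flags) and the provider selection are role-internal and are not supplied by the caller.
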
44071 1727204631.86310: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204631.3561397-46594-265140441177056/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204631.86317: _low_level_execute_command(): starting 44071 1727204631.86323: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204631.3561397-46594-265140441177056/ > /dev/null 2>&1 && sleep 0' 44071 1727204631.86819: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204631.86824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204631.86827: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204631.86829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204631.86892: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204631.86899: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204631.86901: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204631.87096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204631.89913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204631.89977: stderr chunk (state=3): >>><<< 44071 1727204631.89981: stdout chunk (state=3): >>><<< 44071 1727204631.89997: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204631.90005: handler run complete 44071 1727204631.90034: attempt loop complete, returning result 44071 1727204631.90037: _execute() done 44071 1727204631.90040: dumping result to json 44071 1727204631.90045: done dumping result, returning 44071 1727204631.90054: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-c964-7471-000000000b43] 44071 1727204631.90061: sending task result for task 127b8e07-fff9-c964-7471-000000000b43 44071 1727204631.90177: done sending task result for task 127b8e07-fff9-c964-7471-000000000b43 44071 1727204631.90180: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 8a139112-7ef3-44ae-a404-065d84fc2b3c 44071 1727204631.90317: no more pending results, returning what we have 44071 1727204631.90320: results queue empty 44071 1727204631.90321: checking for any_errors_fatal 44071 1727204631.90326: done checking for any_errors_fatal 44071 1727204631.90327: checking for max_fail_percentage 44071 1727204631.90328: done checking for max_fail_percentage 44071 1727204631.90329: checking to see if all hosts have failed and the running result is not ok 44071 1727204631.90330: done checking to see if all hosts have failed 44071 1727204631.90330: getting the remaining hosts for this loop 44071 1727204631.90332: done getting the remaining hosts for this loop 44071 1727204631.90336: getting the next task for host managed-node2 44071 1727204631.90346: done getting next task for host managed-node2 44071 1727204631.90350: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204631.90355: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204631.90377: getting variables 44071 1727204631.90379: in VariableManager get_vars() 44071 1727204631.90414: Calling all_inventory to load vars for managed-node2 44071 1727204631.90417: Calling groups_inventory to load vars for managed-node2 44071 1727204631.90419: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204631.90429: Calling all_plugins_play to load vars for managed-node2 44071 1727204631.90432: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204631.90435: Calling groups_plugins_play to load vars for managed-node2 44071 1727204631.91736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204631.93324: done with get_vars() 44071 1727204631.93344: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:03:51 -0400 (0:00:00.812) 0:00:44.250 ***** 44071 1727204631.93444: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204631.93793: worker is 1 (out of 1 available) 44071 1727204631.93809: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204631.93824: done queuing things up, now waiting for results queue to drain 44071 1727204631.93826: waiting for pending results... 
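
The task queued here, "Configure networking state" (main.yml:171), is skipped a few entries below: network_state comes from the role defaults as an empty dict, so the guard evaluates to False. A minimal sketch of that guard pattern; only the task name, the action (fedora.linux_system_roles.network_state) and the when expression are taken from the log, and the parameter name is an assumption:

    # Sketch of the guard reported as "Evaluated conditional (network_state != {}): False".
    - name: Configure networking state
      fedora.linux_system_roles.network_state:
        desired_state: "{{ network_state }}"   # parameter name assumed
      when: network_state != {}
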
44071 1727204631.94125: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204631.94275: in run() - task 127b8e07-fff9-c964-7471-000000000b44 44071 1727204631.94289: variable 'ansible_search_path' from source: unknown 44071 1727204631.94292: variable 'ansible_search_path' from source: unknown 44071 1727204631.94329: calling self._execute() 44071 1727204631.94425: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204631.94432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204631.94478: variable 'omit' from source: magic vars 44071 1727204631.94856: variable 'ansible_distribution_major_version' from source: facts 44071 1727204631.94870: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204631.94989: variable 'network_state' from source: role '' defaults 44071 1727204631.94995: Evaluated conditional (network_state != {}): False 44071 1727204631.94999: when evaluation is False, skipping this task 44071 1727204631.95002: _execute() done 44071 1727204631.95004: dumping result to json 44071 1727204631.95007: done dumping result, returning 44071 1727204631.95017: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-c964-7471-000000000b44] 44071 1727204631.95020: sending task result for task 127b8e07-fff9-c964-7471-000000000b44 44071 1727204631.95146: done sending task result for task 127b8e07-fff9-c964-7471-000000000b44 44071 1727204631.95149: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204631.95234: no more pending results, returning what we have 44071 1727204631.95237: results queue empty 44071 1727204631.95238: checking for any_errors_fatal 44071 1727204631.95250: done checking for any_errors_fatal 44071 1727204631.95251: checking for max_fail_percentage 44071 1727204631.95253: done checking for max_fail_percentage 44071 1727204631.95253: checking to see if all hosts have failed and the running result is not ok 44071 1727204631.95254: done checking to see if all hosts have failed 44071 1727204631.95255: getting the remaining hosts for this loop 44071 1727204631.95256: done getting the remaining hosts for this loop 44071 1727204631.95265: getting the next task for host managed-node2 44071 1727204631.95275: done getting next task for host managed-node2 44071 1727204631.95280: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204631.95285: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204631.95309: getting variables 44071 1727204631.95311: in VariableManager get_vars() 44071 1727204631.95347: Calling all_inventory to load vars for managed-node2 44071 1727204631.95350: Calling groups_inventory to load vars for managed-node2 44071 1727204631.95352: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204631.95362: Calling all_plugins_play to load vars for managed-node2 44071 1727204631.95365: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204631.95427: Calling groups_plugins_play to load vars for managed-node2 44071 1727204631.96705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204631.98340: done with get_vars() 44071 1727204631.98374: done getting variables 44071 1727204631.98440: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:03:51 -0400 (0:00:00.050) 0:00:44.301 ***** 44071 1727204631.98473: entering _queue_task() for managed-node2/debug 44071 1727204631.98769: worker is 1 (out of 1 available) 44071 1727204631.98790: exiting _queue_task() for managed-node2/debug 44071 1727204631.98804: done queuing things up, now waiting for results queue to drain 44071 1727204631.98805: waiting for pending results... 
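
The task queued here (main.yml:177) only echoes the stderr captured from the earlier network_connections run; its result, printed further below, is the single "[002] ... add connection statebr" line. A minimal sketch of such a debug task, where the variable name is taken from the log output and everything else is assumed:

    # Sketch of the "Show stderr messages" step whose ok: result appears below.
    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines
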
44071 1727204631.99145: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204631.99214: in run() - task 127b8e07-fff9-c964-7471-000000000b45 44071 1727204631.99471: variable 'ansible_search_path' from source: unknown 44071 1727204631.99474: variable 'ansible_search_path' from source: unknown 44071 1727204631.99479: calling self._execute() 44071 1727204631.99484: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204631.99488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204631.99494: variable 'omit' from source: magic vars 44071 1727204631.99915: variable 'ansible_distribution_major_version' from source: facts 44071 1727204631.99919: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204631.99922: variable 'omit' from source: magic vars 44071 1727204632.00073: variable 'omit' from source: magic vars 44071 1727204632.00095: variable 'omit' from source: magic vars 44071 1727204632.00173: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204632.00224: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204632.00271: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204632.00429: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204632.00432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204632.00435: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204632.00438: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204632.00440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204632.00493: Set connection var ansible_connection to ssh 44071 1727204632.00499: Set connection var ansible_timeout to 10 44071 1727204632.00505: Set connection var ansible_pipelining to False 44071 1727204632.00510: Set connection var ansible_shell_type to sh 44071 1727204632.00516: Set connection var ansible_shell_executable to /bin/sh 44071 1727204632.00671: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204632.00677: variable 'ansible_shell_executable' from source: unknown 44071 1727204632.00683: variable 'ansible_connection' from source: unknown 44071 1727204632.00686: variable 'ansible_module_compression' from source: unknown 44071 1727204632.00689: variable 'ansible_shell_type' from source: unknown 44071 1727204632.00691: variable 'ansible_shell_executable' from source: unknown 44071 1727204632.00694: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204632.00696: variable 'ansible_pipelining' from source: unknown 44071 1727204632.00698: variable 'ansible_timeout' from source: unknown 44071 1727204632.00701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204632.00835: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 
1727204632.00859: variable 'omit' from source: magic vars 44071 1727204632.00881: starting attempt loop 44071 1727204632.00891: running the handler 44071 1727204632.01044: variable '__network_connections_result' from source: set_fact 44071 1727204632.01095: handler run complete 44071 1727204632.01109: attempt loop complete, returning result 44071 1727204632.01113: _execute() done 44071 1727204632.01115: dumping result to json 44071 1727204632.01118: done dumping result, returning 44071 1727204632.01128: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-c964-7471-000000000b45] 44071 1727204632.01133: sending task result for task 127b8e07-fff9-c964-7471-000000000b45 44071 1727204632.01232: done sending task result for task 127b8e07-fff9-c964-7471-000000000b45 44071 1727204632.01235: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 8a139112-7ef3-44ae-a404-065d84fc2b3c" ] } 44071 1727204632.01311: no more pending results, returning what we have 44071 1727204632.01314: results queue empty 44071 1727204632.01315: checking for any_errors_fatal 44071 1727204632.01321: done checking for any_errors_fatal 44071 1727204632.01322: checking for max_fail_percentage 44071 1727204632.01323: done checking for max_fail_percentage 44071 1727204632.01324: checking to see if all hosts have failed and the running result is not ok 44071 1727204632.01325: done checking to see if all hosts have failed 44071 1727204632.01325: getting the remaining hosts for this loop 44071 1727204632.01327: done getting the remaining hosts for this loop 44071 1727204632.01332: getting the next task for host managed-node2 44071 1727204632.01339: done getting next task for host managed-node2 44071 1727204632.01343: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204632.01358: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204632.01373: getting variables 44071 1727204632.01375: in VariableManager get_vars() 44071 1727204632.01412: Calling all_inventory to load vars for managed-node2 44071 1727204632.01414: Calling groups_inventory to load vars for managed-node2 44071 1727204632.01416: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204632.01427: Calling all_plugins_play to load vars for managed-node2 44071 1727204632.01429: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204632.01432: Calling groups_plugins_play to load vars for managed-node2 44071 1727204632.03593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204632.04864: done with get_vars() 44071 1727204632.04897: done getting variables 44071 1727204632.04953: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:03:52 -0400 (0:00:00.065) 0:00:44.366 ***** 44071 1727204632.04991: entering _queue_task() for managed-node2/debug 44071 1727204632.05289: worker is 1 (out of 1 available) 44071 1727204632.05305: exiting _queue_task() for managed-node2/debug 44071 1727204632.05319: done queuing things up, now waiting for results queue to drain 44071 1727204632.05321: waiting for pending results... 44071 1727204632.05532: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204632.05655: in run() - task 127b8e07-fff9-c964-7471-000000000b46 44071 1727204632.05672: variable 'ansible_search_path' from source: unknown 44071 1727204632.05678: variable 'ansible_search_path' from source: unknown 44071 1727204632.05709: calling self._execute() 44071 1727204632.05794: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204632.05800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204632.05808: variable 'omit' from source: magic vars 44071 1727204632.06216: variable 'ansible_distribution_major_version' from source: facts 44071 1727204632.06220: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204632.06223: variable 'omit' from source: magic vars 44071 1727204632.06276: variable 'omit' from source: magic vars 44071 1727204632.06578: variable 'omit' from source: magic vars 44071 1727204632.06582: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204632.06585: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204632.06589: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204632.06591: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204632.06593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204632.06596: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204632.06598: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204632.06600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204632.06603: Set connection var ansible_connection to ssh 44071 1727204632.06605: Set connection var ansible_timeout to 10 44071 1727204632.06607: Set connection var ansible_pipelining to False 44071 1727204632.06610: Set connection var ansible_shell_type to sh 44071 1727204632.06612: Set connection var ansible_shell_executable to /bin/sh 44071 1727204632.06614: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204632.06686: variable 'ansible_shell_executable' from source: unknown 44071 1727204632.06689: variable 'ansible_connection' from source: unknown 44071 1727204632.06692: variable 'ansible_module_compression' from source: unknown 44071 1727204632.06695: variable 'ansible_shell_type' from source: unknown 44071 1727204632.06697: variable 'ansible_shell_executable' from source: unknown 44071 1727204632.06700: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204632.06702: variable 'ansible_pipelining' from source: unknown 44071 1727204632.06704: variable 'ansible_timeout' from source: unknown 44071 1727204632.06706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204632.06817: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204632.06935: variable 'omit' from source: magic vars 44071 1727204632.06939: starting attempt loop 44071 1727204632.06944: running the handler 44071 1727204632.06946: variable '__network_connections_result' from source: set_fact 44071 1727204632.06972: variable '__network_connections_result' from source: set_fact 44071 1727204632.07164: handler run complete 44071 1727204632.07169: attempt loop complete, returning result 44071 1727204632.07172: _execute() done 44071 1727204632.07175: dumping result to json 44071 1727204632.07177: done dumping result, returning 44071 1727204632.07180: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-c964-7471-000000000b46] 44071 1727204632.07183: sending task result for task 127b8e07-fff9-c964-7471-000000000b46 44071 1727204632.07383: done sending task result for task 127b8e07-fff9-c964-7471-000000000b46 44071 1727204632.07387: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 8a139112-7ef3-44ae-a404-065d84fc2b3c\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 8a139112-7ef3-44ae-a404-065d84fc2b3c" ] } } 44071 1727204632.07579: no more pending results, returning what we have 44071 
1727204632.07582: results queue empty 44071 1727204632.07583: checking for any_errors_fatal 44071 1727204632.07588: done checking for any_errors_fatal 44071 1727204632.07589: checking for max_fail_percentage 44071 1727204632.07590: done checking for max_fail_percentage 44071 1727204632.07591: checking to see if all hosts have failed and the running result is not ok 44071 1727204632.07591: done checking to see if all hosts have failed 44071 1727204632.07596: getting the remaining hosts for this loop 44071 1727204632.07597: done getting the remaining hosts for this loop 44071 1727204632.07601: getting the next task for host managed-node2 44071 1727204632.07608: done getting next task for host managed-node2 44071 1727204632.07611: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204632.07616: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204632.07629: getting variables 44071 1727204632.07631: in VariableManager get_vars() 44071 1727204632.07682: Calling all_inventory to load vars for managed-node2 44071 1727204632.07685: Calling groups_inventory to load vars for managed-node2 44071 1727204632.07688: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204632.07698: Calling all_plugins_play to load vars for managed-node2 44071 1727204632.07705: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204632.07709: Calling groups_plugins_play to load vars for managed-node2 44071 1727204632.09732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204632.12635: done with get_vars() 44071 1727204632.12675: done getting variables 44071 1727204632.12792: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:03:52 -0400 (0:00:00.078) 0:00:44.444 ***** 44071 1727204632.12838: entering _queue_task() for managed-node2/debug 44071 1727204632.13328: worker is 1 (out of 1 available) 44071 1727204632.13347: exiting _queue_task() for managed-node2/debug 44071 1727204632.13363: done queuing things up, now waiting for results queue to drain 44071 1727204632.13364: waiting for pending results... 44071 1727204632.13864: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204632.13872: in run() - task 127b8e07-fff9-c964-7471-000000000b47 44071 1727204632.14083: variable 'ansible_search_path' from source: unknown 44071 1727204632.14087: variable 'ansible_search_path' from source: unknown 44071 1727204632.14173: calling self._execute() 44071 1727204632.14299: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204632.14319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204632.14335: variable 'omit' from source: magic vars 44071 1727204632.14814: variable 'ansible_distribution_major_version' from source: facts 44071 1727204632.14844: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204632.14986: variable 'network_state' from source: role '' defaults 44071 1727204632.15003: Evaluated conditional (network_state != {}): False 44071 1727204632.15011: when evaluation is False, skipping this task 44071 1727204632.15018: _execute() done 44071 1727204632.15026: dumping result to json 44071 1727204632.15056: done dumping result, returning 44071 1727204632.15060: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-c964-7471-000000000b47] 44071 1727204632.15076: sending task result for task 127b8e07-fff9-c964-7471-000000000b47 44071 1727204632.15277: done sending task result for task 127b8e07-fff9-c964-7471-000000000b47 44071 1727204632.15281: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 44071 1727204632.15339: no more pending results, returning what we 
have 44071 1727204632.15346: results queue empty 44071 1727204632.15348: checking for any_errors_fatal 44071 1727204632.15360: done checking for any_errors_fatal 44071 1727204632.15361: checking for max_fail_percentage 44071 1727204632.15363: done checking for max_fail_percentage 44071 1727204632.15364: checking to see if all hosts have failed and the running result is not ok 44071 1727204632.15367: done checking to see if all hosts have failed 44071 1727204632.15368: getting the remaining hosts for this loop 44071 1727204632.15370: done getting the remaining hosts for this loop 44071 1727204632.15376: getting the next task for host managed-node2 44071 1727204632.15386: done getting next task for host managed-node2 44071 1727204632.15391: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204632.15397: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204632.15424: getting variables 44071 1727204632.15426: in VariableManager get_vars() 44071 1727204632.15711: Calling all_inventory to load vars for managed-node2 44071 1727204632.15715: Calling groups_inventory to load vars for managed-node2 44071 1727204632.15718: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204632.15732: Calling all_plugins_play to load vars for managed-node2 44071 1727204632.15736: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204632.15740: Calling groups_plugins_play to load vars for managed-node2 44071 1727204632.17901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204632.20508: done with get_vars() 44071 1727204632.20555: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:03:52 -0400 (0:00:00.078) 0:00:44.523 ***** 44071 1727204632.20682: entering _queue_task() for managed-node2/ping 44071 1727204632.21125: worker is 1 (out of 1 available) 44071 1727204632.21151: exiting _queue_task() for managed-node2/ping 44071 1727204632.21169: done queuing things up, now waiting for results queue to drain 44071 1727204632.21171: waiting for pending results... 
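
The "Re-test connectivity" task queued here (main.yml:192) resolves to Ansible's ping module; the entries below show the cached ansible.modules.ping AnsiballZ payload being transferred and executed. A minimal sketch of the task itself, assuming no options beyond the bare module call:

    # Sketch of the connectivity re-test; the module is identified in the log
    # by the cached ansible.modules.ping AnsiballZ artifact.
    - name: Re-test connectivity
      ansible.builtin.ping:

Because the ControlMaster socket at /root/.ansible/cp/7ef5e35320 is still warm, the whole round trip that follows (temp dir creation, sftp put, chmod, execute, cleanup) reuses one multiplexed SSH connection instead of renegotiating a new session.
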
44071 1727204632.21703: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204632.21710: in run() - task 127b8e07-fff9-c964-7471-000000000b48 44071 1727204632.21713: variable 'ansible_search_path' from source: unknown 44071 1727204632.21716: variable 'ansible_search_path' from source: unknown 44071 1727204632.21719: calling self._execute() 44071 1727204632.21906: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204632.21910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204632.21914: variable 'omit' from source: magic vars 44071 1727204632.22245: variable 'ansible_distribution_major_version' from source: facts 44071 1727204632.22255: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204632.22262: variable 'omit' from source: magic vars 44071 1727204632.22343: variable 'omit' from source: magic vars 44071 1727204632.22383: variable 'omit' from source: magic vars 44071 1727204632.22427: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204632.22469: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204632.22491: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204632.22511: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204632.22524: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204632.22557: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204632.22560: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204632.22564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204632.22678: Set connection var ansible_connection to ssh 44071 1727204632.22684: Set connection var ansible_timeout to 10 44071 1727204632.22690: Set connection var ansible_pipelining to False 44071 1727204632.22696: Set connection var ansible_shell_type to sh 44071 1727204632.22707: Set connection var ansible_shell_executable to /bin/sh 44071 1727204632.22710: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204632.22735: variable 'ansible_shell_executable' from source: unknown 44071 1727204632.22738: variable 'ansible_connection' from source: unknown 44071 1727204632.22744: variable 'ansible_module_compression' from source: unknown 44071 1727204632.22747: variable 'ansible_shell_type' from source: unknown 44071 1727204632.22749: variable 'ansible_shell_executable' from source: unknown 44071 1727204632.22752: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204632.22754: variable 'ansible_pipelining' from source: unknown 44071 1727204632.22756: variable 'ansible_timeout' from source: unknown 44071 1727204632.22759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204632.22993: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204632.23003: variable 'omit' from source: magic vars 44071 
1727204632.23005: starting attempt loop 44071 1727204632.23008: running the handler 44071 1727204632.23010: _low_level_execute_command(): starting 44071 1727204632.23012: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204632.23898: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204632.24006: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204632.24010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204632.24100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204632.25880: stdout chunk (state=3): >>>/root <<< 44071 1727204632.26099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204632.26103: stdout chunk (state=3): >>><<< 44071 1727204632.26106: stderr chunk (state=3): >>><<< 44071 1727204632.26253: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204632.26258: _low_level_execute_command(): starting 44071 1727204632.26261: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204632.2614117-46638-11044220219154 `" && echo ansible-tmp-1727204632.2614117-46638-11044220219154="` echo /root/.ansible/tmp/ansible-tmp-1727204632.2614117-46638-11044220219154 `" ) && sleep 0' 44071 
1727204632.27278: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204632.27319: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204632.27436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204632.29430: stdout chunk (state=3): >>>ansible-tmp-1727204632.2614117-46638-11044220219154=/root/.ansible/tmp/ansible-tmp-1727204632.2614117-46638-11044220219154 <<< 44071 1727204632.29672: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204632.29676: stdout chunk (state=3): >>><<< 44071 1727204632.29679: stderr chunk (state=3): >>><<< 44071 1727204632.29701: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204632.2614117-46638-11044220219154=/root/.ansible/tmp/ansible-tmp-1727204632.2614117-46638-11044220219154 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204632.29784: variable 'ansible_module_compression' from source: unknown 44071 1727204632.30055: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 44071 1727204632.30060: variable 'ansible_facts' from source: unknown 44071 1727204632.30062: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204632.2614117-46638-11044220219154/AnsiballZ_ping.py 44071 1727204632.30100: Sending initial data 44071 
1727204632.30111: Sent initial data (152 bytes) 44071 1727204632.30776: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204632.30873: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204632.30936: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204632.31036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204632.32682: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204632.32789: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204632.32863: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpd8dk365t /root/.ansible/tmp/ansible-tmp-1727204632.2614117-46638-11044220219154/AnsiballZ_ping.py <<< 44071 1727204632.32869: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204632.2614117-46638-11044220219154/AnsiballZ_ping.py" <<< 44071 1727204632.32926: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpd8dk365t" to remote "/root/.ansible/tmp/ansible-tmp-1727204632.2614117-46638-11044220219154/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204632.2614117-46638-11044220219154/AnsiballZ_ping.py" <<< 44071 1727204632.33904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204632.33972: stderr chunk (state=3): >>><<< 44071 1727204632.33977: stdout chunk (state=3): >>><<< 44071 1727204632.33995: done transferring module to remote 44071 1727204632.34073: _low_level_execute_command(): starting 44071 1727204632.34076: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204632.2614117-46638-11044220219154/ /root/.ansible/tmp/ansible-tmp-1727204632.2614117-46638-11044220219154/AnsiballZ_ping.py && sleep 0' 44071 1727204632.34813: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204632.34886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204632.34892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204632.34962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204632.35000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204632.35030: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204632.35057: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204632.35280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204632.36992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204632.37124: stderr chunk (state=3): >>><<< 44071 1727204632.37128: stdout chunk (state=3): >>><<< 44071 1727204632.37245: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204632.37255: _low_level_execute_command(): starting 44071 1727204632.37257: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204632.2614117-46638-11044220219154/AnsiballZ_ping.py && sleep 0' 44071 1727204632.37833: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204632.37885: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204632.37907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204632.37953: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204632.38041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204632.59074: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 44071 1727204632.60190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204632.60254: stderr chunk (state=3): >>><<< 44071 1727204632.60258: stdout chunk (state=3): >>><<< 44071 1727204632.60292: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204632.60333: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204632.2614117-46638-11044220219154/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204632.60342: _low_level_execute_command(): starting 44071 1727204632.60351: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204632.2614117-46638-11044220219154/ > /dev/null 2>&1 && sleep 0' 44071 1727204632.61104: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204632.61111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204632.61123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204632.61158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204632.61173: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204632.61257: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204632.61297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204632.61368: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204632.61372: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204632.61435: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204632.63482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204632.63509: stderr chunk (state=3): >>><<< 44071 1727204632.63518: stdout chunk (state=3): >>><<< 44071 1727204632.63546: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204632.63671: handler run complete 44071 1727204632.63674: attempt loop complete, returning result 44071 1727204632.63677: _execute() done 44071 1727204632.63678: dumping result to json 44071 1727204632.63680: done dumping result, returning 44071 1727204632.63682: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-c964-7471-000000000b48] 44071 1727204632.63684: sending task result for task 127b8e07-fff9-c964-7471-000000000b48 44071 1727204632.63758: done sending task result for task 127b8e07-fff9-c964-7471-000000000b48 44071 1727204632.63777: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 44071 1727204632.63855: no more pending results, returning what we have 44071 1727204632.63858: results queue empty 44071 1727204632.63859: checking for any_errors_fatal 44071 1727204632.63868: done checking for any_errors_fatal 44071 1727204632.63869: checking for max_fail_percentage 44071 1727204632.63871: done checking for max_fail_percentage 44071 1727204632.63872: checking to see if all hosts have failed and the running result is not ok 44071 1727204632.63873: done checking to see if all hosts have failed 44071 1727204632.63873: getting the remaining hosts for this loop 44071 1727204632.63875: done getting the remaining hosts for this loop 44071 1727204632.64074: getting the next task for host managed-node2 44071 1727204632.64086: done getting next task for host managed-node2 44071 1727204632.64089: ^ 
task is: TASK: meta (role_complete) 44071 1727204632.64094: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204632.64108: getting variables 44071 1727204632.64110: in VariableManager get_vars() 44071 1727204632.64159: Calling all_inventory to load vars for managed-node2 44071 1727204632.64163: Calling groups_inventory to load vars for managed-node2 44071 1727204632.64176: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204632.64191: Calling all_plugins_play to load vars for managed-node2 44071 1727204632.64195: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204632.64199: Calling groups_plugins_play to load vars for managed-node2 44071 1727204632.66630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204632.69172: done with get_vars() 44071 1727204632.69226: done getting variables 44071 1727204632.69340: done queuing things up, now waiting for results queue to drain 44071 1727204632.69345: results queue empty 44071 1727204632.69346: checking for any_errors_fatal 44071 1727204632.69350: done checking for any_errors_fatal 44071 1727204632.69350: checking for max_fail_percentage 44071 1727204632.69352: done checking for max_fail_percentage 44071 1727204632.69353: checking to see if all hosts have failed and the running result is not ok 44071 1727204632.69354: done checking to see if all hosts have failed 44071 1727204632.69354: getting the remaining hosts for this loop 44071 1727204632.69355: done getting the remaining hosts for this loop 44071 1727204632.69358: getting the next task for host managed-node2 44071 1727204632.69364: done getting next task for host managed-node2 44071 1727204632.69368: ^ task is: TASK: Show result 44071 1727204632.69371: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204632.69373: getting variables 44071 1727204632.69375: in VariableManager get_vars() 44071 1727204632.69387: Calling all_inventory to load vars for managed-node2 44071 1727204632.69389: Calling groups_inventory to load vars for managed-node2 44071 1727204632.69392: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204632.69402: Calling all_plugins_play to load vars for managed-node2 44071 1727204632.69411: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204632.69415: Calling groups_plugins_play to load vars for managed-node2 44071 1727204632.71149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204632.73427: done with get_vars() 44071 1727204632.73459: done getting variables 44071 1727204632.73515: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Tuesday 24 September 2024 15:03:52 -0400 (0:00:00.528) 0:00:45.051 ***** 44071 1727204632.73549: entering _queue_task() for managed-node2/debug 44071 1727204632.73942: worker is 1 (out of 1 available) 44071 1727204632.73957: exiting _queue_task() for managed-node2/debug 44071 1727204632.73974: done queuing things up, now waiting for results queue to drain 44071 1727204632.73976: waiting for pending results... 
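The "Show result" task queued above is a plain debug of the connections result recorded by the network role; the output it prints appears a few records further on. A minimal sketch of such a task, assuming the layout of the create_bridge_profile.yml file named in the task path (only the debug action and the __network_connections_result variable are taken from this log; the rest is illustrative):

```yaml
# Sketch only: the debug action and the __network_connections_result variable
# are visible in this log; the surrounding task wording is an assumption.
- name: Show result
  debug:
    var: __network_connections_result
```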
44071 1727204632.74300: running TaskExecutor() for managed-node2/TASK: Show result 44071 1727204632.74479: in run() - task 127b8e07-fff9-c964-7471-000000000ad2 44071 1727204632.74485: variable 'ansible_search_path' from source: unknown 44071 1727204632.74489: variable 'ansible_search_path' from source: unknown 44071 1727204632.74497: calling self._execute() 44071 1727204632.74595: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204632.74601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204632.74617: variable 'omit' from source: magic vars 44071 1727204632.75053: variable 'ansible_distribution_major_version' from source: facts 44071 1727204632.75069: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204632.75129: variable 'omit' from source: magic vars 44071 1727204632.75133: variable 'omit' from source: magic vars 44071 1727204632.75178: variable 'omit' from source: magic vars 44071 1727204632.75224: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204632.75269: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204632.75292: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204632.75312: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204632.75349: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204632.75355: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204632.75358: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204632.75363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204632.75671: Set connection var ansible_connection to ssh 44071 1727204632.75675: Set connection var ansible_timeout to 10 44071 1727204632.75677: Set connection var ansible_pipelining to False 44071 1727204632.75680: Set connection var ansible_shell_type to sh 44071 1727204632.75682: Set connection var ansible_shell_executable to /bin/sh 44071 1727204632.75684: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204632.75686: variable 'ansible_shell_executable' from source: unknown 44071 1727204632.75689: variable 'ansible_connection' from source: unknown 44071 1727204632.75691: variable 'ansible_module_compression' from source: unknown 44071 1727204632.75694: variable 'ansible_shell_type' from source: unknown 44071 1727204632.75697: variable 'ansible_shell_executable' from source: unknown 44071 1727204632.75699: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204632.75702: variable 'ansible_pipelining' from source: unknown 44071 1727204632.75705: variable 'ansible_timeout' from source: unknown 44071 1727204632.75708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204632.75726: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204632.75738: variable 'omit' from source: magic vars 44071 1727204632.75745: 
starting attempt loop 44071 1727204632.75749: running the handler 44071 1727204632.75798: variable '__network_connections_result' from source: set_fact 44071 1727204632.75893: variable '__network_connections_result' from source: set_fact 44071 1727204632.76027: handler run complete 44071 1727204632.76062: attempt loop complete, returning result 44071 1727204632.76151: _execute() done 44071 1727204632.76159: dumping result to json 44071 1727204632.76161: done dumping result, returning 44071 1727204632.76164: done running TaskExecutor() for managed-node2/TASK: Show result [127b8e07-fff9-c964-7471-000000000ad2] 44071 1727204632.76167: sending task result for task 127b8e07-fff9-c964-7471-000000000ad2 44071 1727204632.76239: done sending task result for task 127b8e07-fff9-c964-7471-000000000ad2 44071 1727204632.76244: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 8a139112-7ef3-44ae-a404-065d84fc2b3c\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 8a139112-7ef3-44ae-a404-065d84fc2b3c" ] } } 44071 1727204632.76349: no more pending results, returning what we have 44071 1727204632.76353: results queue empty 44071 1727204632.76354: checking for any_errors_fatal 44071 1727204632.76357: done checking for any_errors_fatal 44071 1727204632.76357: checking for max_fail_percentage 44071 1727204632.76359: done checking for max_fail_percentage 44071 1727204632.76360: checking to see if all hosts have failed and the running result is not ok 44071 1727204632.76361: done checking to see if all hosts have failed 44071 1727204632.76362: getting the remaining hosts for this loop 44071 1727204632.76364: done getting the remaining hosts for this loop 44071 1727204632.76371: getting the next task for host managed-node2 44071 1727204632.76382: done getting next task for host managed-node2 44071 1727204632.76386: ^ task is: TASK: Test 44071 1727204632.76389: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204632.76395: getting variables 44071 1727204632.76396: in VariableManager get_vars() 44071 1727204632.76434: Calling all_inventory to load vars for managed-node2 44071 1727204632.76436: Calling groups_inventory to load vars for managed-node2 44071 1727204632.76441: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204632.76454: Calling all_plugins_play to load vars for managed-node2 44071 1727204632.76458: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204632.76461: Calling groups_plugins_play to load vars for managed-node2 44071 1727204632.78364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204632.80535: done with get_vars() 44071 1727204632.80580: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Tuesday 24 September 2024 15:03:52 -0400 (0:00:00.071) 0:00:45.123 ***** 44071 1727204632.80688: entering _queue_task() for managed-node2/include_tasks 44071 1727204632.81300: worker is 1 (out of 1 available) 44071 1727204632.81314: exiting _queue_task() for managed-node2/include_tasks 44071 1727204632.81324: done queuing things up, now waiting for results queue to drain 44071 1727204632.81326: waiting for pending results... 44071 1727204632.81486: running TaskExecutor() for managed-node2/TASK: Test 44071 1727204632.81572: in run() - task 127b8e07-fff9-c964-7471-000000000a4d 44071 1727204632.81576: variable 'ansible_search_path' from source: unknown 44071 1727204632.81579: variable 'ansible_search_path' from source: unknown 44071 1727204632.81637: variable 'lsr_test' from source: include params 44071 1727204632.81887: variable 'lsr_test' from source: include params 44071 1727204632.82227: variable 'omit' from source: magic vars 44071 1727204632.82232: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204632.82235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204632.82237: variable 'omit' from source: magic vars 44071 1727204632.82849: variable 'ansible_distribution_major_version' from source: facts 44071 1727204632.82860: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204632.82873: variable 'item' from source: unknown 44071 1727204632.83074: variable 'item' from source: unknown 44071 1727204632.83228: variable 'item' from source: unknown 44071 1727204632.83292: variable 'item' from source: unknown 44071 1727204632.83802: dumping result to json 44071 1727204632.83806: done dumping result, returning 44071 1727204632.83809: done running TaskExecutor() for managed-node2/TASK: Test [127b8e07-fff9-c964-7471-000000000a4d] 44071 1727204632.83811: sending task result for task 127b8e07-fff9-c964-7471-000000000a4d 44071 1727204632.83909: done sending task result for task 127b8e07-fff9-c964-7471-000000000a4d 44071 1727204632.83913: WORKER PROCESS EXITING 44071 1727204632.83994: no more pending results, returning what we have 44071 1727204632.84000: in VariableManager get_vars() 44071 1727204632.84040: Calling all_inventory to load vars for managed-node2 44071 1727204632.84043: Calling groups_inventory to load vars for managed-node2 44071 1727204632.84047: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204632.84061: Calling all_plugins_play to load vars for managed-node2 
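The "Test" step queued above is an include driven by the lsr_test parameter; for this run the current item is tasks/activate_profile.yml, which is loaded in the records that follow. A rough sketch of what run_test.yml may do at this point, assuming a simple loop (the include_tasks action, the lsr_test variable, the item value, and the distribution-version guard all appear in this log; exact quoting and structure are assumptions):

```yaml
# Sketch only, not the verbatim test playbook.
- name: Test
  include_tasks: "{{ item }}"
  loop: "{{ lsr_test }}"
  when: ansible_distribution_major_version != '6'  # conditional shown as True in the log
```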
44071 1727204632.84067: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204632.84072: Calling groups_plugins_play to load vars for managed-node2 44071 1727204632.88191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204632.92143: done with get_vars() 44071 1727204632.92188: variable 'ansible_search_path' from source: unknown 44071 1727204632.92190: variable 'ansible_search_path' from source: unknown 44071 1727204632.92236: we have included files to process 44071 1727204632.92237: generating all_blocks data 44071 1727204632.92240: done generating all_blocks data 44071 1727204632.92246: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 44071 1727204632.92248: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 44071 1727204632.92251: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 44071 1727204632.92451: done processing included file 44071 1727204632.92453: iterating over new_blocks loaded from include file 44071 1727204632.92455: in VariableManager get_vars() 44071 1727204632.92477: done with get_vars() 44071 1727204632.92479: filtering new block on tags 44071 1727204632.92509: done filtering new block on tags 44071 1727204632.92512: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed-node2 => (item=tasks/activate_profile.yml) 44071 1727204632.92518: extending task lists for all hosts with included blocks 44071 1727204632.93377: done extending task lists 44071 1727204632.93379: done processing included files 44071 1727204632.93380: results queue empty 44071 1727204632.93381: checking for any_errors_fatal 44071 1727204632.93387: done checking for any_errors_fatal 44071 1727204632.93388: checking for max_fail_percentage 44071 1727204632.93389: done checking for max_fail_percentage 44071 1727204632.93390: checking to see if all hosts have failed and the running result is not ok 44071 1727204632.93391: done checking to see if all hosts have failed 44071 1727204632.93392: getting the remaining hosts for this loop 44071 1727204632.93393: done getting the remaining hosts for this loop 44071 1727204632.93396: getting the next task for host managed-node2 44071 1727204632.93402: done getting next task for host managed-node2 44071 1727204632.93404: ^ task is: TASK: Include network role 44071 1727204632.93408: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 44071 1727204632.93411: getting variables 44071 1727204632.93412: in VariableManager get_vars() 44071 1727204632.93428: Calling all_inventory to load vars for managed-node2 44071 1727204632.93430: Calling groups_inventory to load vars for managed-node2 44071 1727204632.93433: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204632.93440: Calling all_plugins_play to load vars for managed-node2 44071 1727204632.93442: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204632.93445: Calling groups_plugins_play to load vars for managed-node2 44071 1727204632.95110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204632.97194: done with get_vars() 44071 1727204632.97236: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Tuesday 24 September 2024 15:03:52 -0400 (0:00:00.166) 0:00:45.289 ***** 44071 1727204632.97343: entering _queue_task() for managed-node2/include_role 44071 1727204632.97746: worker is 1 (out of 1 available) 44071 1727204632.97762: exiting _queue_task() for managed-node2/include_role 44071 1727204632.97777: done queuing things up, now waiting for results queue to drain 44071 1727204632.97779: waiting for pending results... 44071 1727204632.98090: running TaskExecutor() for managed-node2/TASK: Include network role 44071 1727204632.98198: in run() - task 127b8e07-fff9-c964-7471-000000000caa 44071 1727204632.98235: variable 'ansible_search_path' from source: unknown 44071 1727204632.98238: variable 'ansible_search_path' from source: unknown 44071 1727204632.98264: calling self._execute() 44071 1727204632.98371: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204632.98380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204632.98391: variable 'omit' from source: magic vars 44071 1727204632.98828: variable 'ansible_distribution_major_version' from source: facts 44071 1727204632.98844: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204632.98850: _execute() done 44071 1727204632.98863: dumping result to json 44071 1727204632.98867: done dumping result, returning 44071 1727204632.98975: done running TaskExecutor() for managed-node2/TASK: Include network role [127b8e07-fff9-c964-7471-000000000caa] 44071 1727204632.98980: sending task result for task 127b8e07-fff9-c964-7471-000000000caa 44071 1727204632.99073: done sending task result for task 127b8e07-fff9-c964-7471-000000000caa 44071 1727204632.99192: WORKER PROCESS EXITING 44071 1727204632.99221: no more pending results, returning what we have 44071 1727204632.99227: in VariableManager get_vars() 44071 1727204632.99263: Calling all_inventory to load vars for managed-node2 44071 1727204632.99268: Calling groups_inventory to load vars for managed-node2 44071 1727204632.99272: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204632.99284: Calling all_plugins_play to load vars for managed-node2 44071 1727204632.99287: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204632.99290: Calling groups_plugins_play to load vars for managed-node2 44071 1727204633.01051: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204633.03186: done with get_vars() 44071 1727204633.03228: variable 'ansible_search_path' from source: unknown 44071 1727204633.03230: variable 'ansible_search_path' from source: unknown 44071 1727204633.03390: variable 'omit' from source: magic vars 44071 1727204633.03437: variable 'omit' from source: magic vars 44071 1727204633.03455: variable 'omit' from source: magic vars 44071 1727204633.03459: we have included files to process 44071 1727204633.03460: generating all_blocks data 44071 1727204633.03462: done generating all_blocks data 44071 1727204633.03464: processing included file: fedora.linux_system_roles.network 44071 1727204633.03489: in VariableManager get_vars() 44071 1727204633.03506: done with get_vars() 44071 1727204633.03538: in VariableManager get_vars() 44071 1727204633.03557: done with get_vars() 44071 1727204633.03603: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 44071 1727204633.03742: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 44071 1727204633.03834: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 44071 1727204633.04444: in VariableManager get_vars() 44071 1727204633.04473: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204633.06546: iterating over new_blocks loaded from include file 44071 1727204633.06549: in VariableManager get_vars() 44071 1727204633.06573: done with get_vars() 44071 1727204633.06576: filtering new block on tags 44071 1727204633.06909: done filtering new block on tags 44071 1727204633.06914: in VariableManager get_vars() 44071 1727204633.06933: done with get_vars() 44071 1727204633.06935: filtering new block on tags 44071 1727204633.06954: done filtering new block on tags 44071 1727204633.06956: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 44071 1727204633.06962: extending task lists for all hosts with included blocks 44071 1727204633.07089: done extending task lists 44071 1727204633.07091: done processing included files 44071 1727204633.07092: results queue empty 44071 1727204633.07093: checking for any_errors_fatal 44071 1727204633.07097: done checking for any_errors_fatal 44071 1727204633.07098: checking for max_fail_percentage 44071 1727204633.07099: done checking for max_fail_percentage 44071 1727204633.07100: checking to see if all hosts have failed and the running result is not ok 44071 1727204633.07101: done checking to see if all hosts have failed 44071 1727204633.07101: getting the remaining hosts for this loop 44071 1727204633.07103: done getting the remaining hosts for this loop 44071 1727204633.07106: getting the next task for host managed-node2 44071 1727204633.07110: done getting next task for host managed-node2 44071 1727204633.07113: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204633.07117: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204633.07130: getting variables 44071 1727204633.07132: in VariableManager get_vars() 44071 1727204633.07147: Calling all_inventory to load vars for managed-node2 44071 1727204633.07150: Calling groups_inventory to load vars for managed-node2 44071 1727204633.07152: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204633.07159: Calling all_plugins_play to load vars for managed-node2 44071 1727204633.07161: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204633.07164: Calling groups_plugins_play to load vars for managed-node2 44071 1727204633.08710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204633.10887: done with get_vars() 44071 1727204633.10928: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:03:53 -0400 (0:00:00.136) 0:00:45.426 ***** 44071 1727204633.11023: entering _queue_task() for managed-node2/include_tasks 44071 1727204633.11426: worker is 1 (out of 1 available) 44071 1727204633.11443: exiting _queue_task() for managed-node2/include_tasks 44071 1727204633.11459: done queuing things up, now waiting for results queue to drain 44071 1727204633.11461: waiting for pending results... 
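The "Ensure ansible_facts used by role" task queued above is the first task of the role's main.yml; the following records show it resolving to an include of set_facts.yml. A minimal sketch, assuming main.yml simply includes that file (the task name, the include_tasks action, and the included file name are taken from this log; the body wording is an assumption):

```yaml
# Sketch of roles/network/tasks/main.yml:4 as suggested by this log.
- name: Ensure ansible_facts used by role
  include_tasks: set_facts.yml
```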
44071 1727204633.11726: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204633.11912: in run() - task 127b8e07-fff9-c964-7471-000000000d16 44071 1727204633.11916: variable 'ansible_search_path' from source: unknown 44071 1727204633.11919: variable 'ansible_search_path' from source: unknown 44071 1727204633.11931: calling self._execute() 44071 1727204633.12128: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204633.12133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204633.12136: variable 'omit' from source: magic vars 44071 1727204633.12499: variable 'ansible_distribution_major_version' from source: facts 44071 1727204633.12511: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204633.12519: _execute() done 44071 1727204633.12522: dumping result to json 44071 1727204633.12525: done dumping result, returning 44071 1727204633.12536: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-c964-7471-000000000d16] 44071 1727204633.12540: sending task result for task 127b8e07-fff9-c964-7471-000000000d16 44071 1727204633.12654: done sending task result for task 127b8e07-fff9-c964-7471-000000000d16 44071 1727204633.12712: no more pending results, returning what we have 44071 1727204633.12719: in VariableManager get_vars() 44071 1727204633.12777: Calling all_inventory to load vars for managed-node2 44071 1727204633.12781: Calling groups_inventory to load vars for managed-node2 44071 1727204633.12783: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204633.12791: WORKER PROCESS EXITING 44071 1727204633.12977: Calling all_plugins_play to load vars for managed-node2 44071 1727204633.12980: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204633.12984: Calling groups_plugins_play to load vars for managed-node2 44071 1727204633.20927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204633.23104: done with get_vars() 44071 1727204633.23139: variable 'ansible_search_path' from source: unknown 44071 1727204633.23141: variable 'ansible_search_path' from source: unknown 44071 1727204633.23189: we have included files to process 44071 1727204633.23191: generating all_blocks data 44071 1727204633.23192: done generating all_blocks data 44071 1727204633.23195: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204633.23196: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204633.23198: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204633.23803: done processing included file 44071 1727204633.23805: iterating over new_blocks loaded from include file 44071 1727204633.23807: in VariableManager get_vars() 44071 1727204633.23834: done with get_vars() 44071 1727204633.23836: filtering new block on tags 44071 1727204633.23870: done filtering new block on tags 44071 1727204633.23873: in VariableManager get_vars() 44071 1727204633.23896: done with get_vars() 44071 1727204633.23898: filtering new block on tags 44071 1727204633.23945: done filtering new block on tags 44071 1727204633.23948: in 
VariableManager get_vars() 44071 1727204633.23972: done with get_vars() 44071 1727204633.23974: filtering new block on tags 44071 1727204633.24018: done filtering new block on tags 44071 1727204633.24020: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 44071 1727204633.24026: extending task lists for all hosts with included blocks 44071 1727204633.25920: done extending task lists 44071 1727204633.25923: done processing included files 44071 1727204633.25924: results queue empty 44071 1727204633.25924: checking for any_errors_fatal 44071 1727204633.25928: done checking for any_errors_fatal 44071 1727204633.25929: checking for max_fail_percentage 44071 1727204633.25930: done checking for max_fail_percentage 44071 1727204633.25931: checking to see if all hosts have failed and the running result is not ok 44071 1727204633.25932: done checking to see if all hosts have failed 44071 1727204633.25933: getting the remaining hosts for this loop 44071 1727204633.25934: done getting the remaining hosts for this loop 44071 1727204633.25937: getting the next task for host managed-node2 44071 1727204633.25943: done getting next task for host managed-node2 44071 1727204633.25946: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204633.25950: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204633.25963: getting variables 44071 1727204633.25965: in VariableManager get_vars() 44071 1727204633.25986: Calling all_inventory to load vars for managed-node2 44071 1727204633.25989: Calling groups_inventory to load vars for managed-node2 44071 1727204633.25991: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204633.25997: Calling all_plugins_play to load vars for managed-node2 44071 1727204633.26000: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204633.26004: Calling groups_plugins_play to load vars for managed-node2 44071 1727204633.27634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204633.29803: done with get_vars() 44071 1727204633.29840: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:03:53 -0400 (0:00:00.189) 0:00:45.615 ***** 44071 1727204633.29932: entering _queue_task() for managed-node2/setup 44071 1727204633.30331: worker is 1 (out of 1 available) 44071 1727204633.30345: exiting _queue_task() for managed-node2/setup 44071 1727204633.30360: done queuing things up, now waiting for results queue to drain 44071 1727204633.30362: waiting for pending results... 44071 1727204633.30698: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204633.30901: in run() - task 127b8e07-fff9-c964-7471-000000000d6d 44071 1727204633.30906: variable 'ansible_search_path' from source: unknown 44071 1727204633.30909: variable 'ansible_search_path' from source: unknown 44071 1727204633.30937: calling self._execute() 44071 1727204633.31073: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204633.31078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204633.31080: variable 'omit' from source: magic vars 44071 1727204633.31497: variable 'ansible_distribution_major_version' from source: facts 44071 1727204633.31510: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204633.31758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204633.34329: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204633.34394: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204633.34443: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204633.34479: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204633.34530: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204633.34605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204633.34640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 44071 1727204633.34663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204633.34713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204633.34729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204633.34790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204633.34819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204633.34858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204633.34967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204633.34973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204633.35093: variable '__network_required_facts' from source: role '' defaults 44071 1727204633.35104: variable 'ansible_facts' from source: unknown 44071 1727204633.36099: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 44071 1727204633.36104: when evaluation is False, skipping this task 44071 1727204633.36107: _execute() done 44071 1727204633.36169: dumping result to json 44071 1727204633.36175: done dumping result, returning 44071 1727204633.36183: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-c964-7471-000000000d6d] 44071 1727204633.36185: sending task result for task 127b8e07-fff9-c964-7471-000000000d6d 44071 1727204633.36373: done sending task result for task 127b8e07-fff9-c964-7471-000000000d6d 44071 1727204633.36378: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204633.36430: no more pending results, returning what we have 44071 1727204633.36434: results queue empty 44071 1727204633.36435: checking for any_errors_fatal 44071 1727204633.36438: done checking for any_errors_fatal 44071 1727204633.36438: checking for max_fail_percentage 44071 1727204633.36440: done checking for max_fail_percentage 44071 1727204633.36441: checking to see if all hosts have failed and the running result is not ok 44071 1727204633.36442: done checking to see if all hosts have failed 44071 1727204633.36442: getting the remaining hosts for this loop 44071 1727204633.36444: done getting the remaining hosts for 
this loop 44071 1727204633.36449: getting the next task for host managed-node2 44071 1727204633.36463: done getting next task for host managed-node2 44071 1727204633.36470: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204633.36477: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204633.36498: getting variables 44071 1727204633.36499: in VariableManager get_vars() 44071 1727204633.36542: Calling all_inventory to load vars for managed-node2 44071 1727204633.36545: Calling groups_inventory to load vars for managed-node2 44071 1727204633.36547: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204633.36559: Calling all_plugins_play to load vars for managed-node2 44071 1727204633.36563: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204633.36746: Calling groups_plugins_play to load vars for managed-node2 44071 1727204633.38462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204633.40646: done with get_vars() 44071 1727204633.40687: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:03:53 -0400 (0:00:00.108) 0:00:45.724 ***** 44071 1727204633.40803: entering _queue_task() for managed-node2/stat 44071 1727204633.41200: worker is 1 (out of 1 available) 44071 1727204633.41216: exiting _queue_task() for managed-node2/stat 44071 1727204633.41231: done queuing things up, now waiting for results queue to drain 44071 1727204633.41233: waiting for pending results... 
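
The guard evaluated just above, (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0), is the usual linux_system_roles pattern of re-gathering facts only when a required key is missing from ansible_facts; since every required fact was already cached, the task was skipped and the run moved on to the queued ostree check at set_facts.yml:12. A minimal sketch of that guard pattern follows, assuming a plain ansible.builtin.setup call and an illustrative list of required fact names; the real role ships its own __network_required_facts default and setup arguments, which are not reproduced here.

# Hedged sketch of the fact guard seen in the log above.
# __network_required_facts and gather_subset are illustrative assumptions,
# not values copied from the role's defaults.
- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
    gather_subset: min
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  vars:
    __network_required_facts:
      - distribution
      - distribution_major_version
      - os_family

With the facts already gathered earlier in the play, difference() yields an empty list, its length is 0, and the conditional evaluates to False, which is exactly the "when evaluation is False, skipping this task" path recorded above.
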
44071 1727204633.41570: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204633.41753: in run() - task 127b8e07-fff9-c964-7471-000000000d6f 44071 1727204633.41769: variable 'ansible_search_path' from source: unknown 44071 1727204633.41773: variable 'ansible_search_path' from source: unknown 44071 1727204633.41818: calling self._execute() 44071 1727204633.41927: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204633.41933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204633.41946: variable 'omit' from source: magic vars 44071 1727204633.42370: variable 'ansible_distribution_major_version' from source: facts 44071 1727204633.42381: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204633.42528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204633.42755: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204633.42793: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204633.42820: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204633.42853: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204633.42974: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204633.42995: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204633.43015: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204633.43034: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204633.43113: variable '__network_is_ostree' from source: set_fact 44071 1727204633.43120: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204633.43123: when evaluation is False, skipping this task 44071 1727204633.43126: _execute() done 44071 1727204633.43128: dumping result to json 44071 1727204633.43133: done dumping result, returning 44071 1727204633.43143: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-c964-7471-000000000d6f] 44071 1727204633.43146: sending task result for task 127b8e07-fff9-c964-7471-000000000d6f 44071 1727204633.43253: done sending task result for task 127b8e07-fff9-c964-7471-000000000d6f 44071 1727204633.43256: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204633.43312: no more pending results, returning what we have 44071 1727204633.43316: results queue empty 44071 1727204633.43317: checking for any_errors_fatal 44071 1727204633.43325: done checking for any_errors_fatal 44071 1727204633.43326: checking for 
max_fail_percentage 44071 1727204633.43327: done checking for max_fail_percentage 44071 1727204633.43328: checking to see if all hosts have failed and the running result is not ok 44071 1727204633.43329: done checking to see if all hosts have failed 44071 1727204633.43330: getting the remaining hosts for this loop 44071 1727204633.43331: done getting the remaining hosts for this loop 44071 1727204633.43337: getting the next task for host managed-node2 44071 1727204633.43349: done getting next task for host managed-node2 44071 1727204633.43354: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204633.43360: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204633.43382: getting variables 44071 1727204633.43383: in VariableManager get_vars() 44071 1727204633.43422: Calling all_inventory to load vars for managed-node2 44071 1727204633.43425: Calling groups_inventory to load vars for managed-node2 44071 1727204633.43427: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204633.43437: Calling all_plugins_play to load vars for managed-node2 44071 1727204633.43440: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204633.43446: Calling groups_plugins_play to load vars for managed-node2 44071 1727204633.45035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204633.46448: done with get_vars() 44071 1727204633.46479: done getting variables 44071 1727204633.46528: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:03:53 -0400 (0:00:00.057) 0:00:45.782 ***** 44071 1727204633.46563: entering _queue_task() for managed-node2/set_fact 44071 1727204633.46859: worker is 1 (out of 1 available) 44071 1727204633.46877: exiting _queue_task() for managed-node2/set_fact 44071 1727204633.46892: done queuing things up, now waiting for results queue to drain 44071 1727204633.46894: waiting for pending results... 44071 1727204633.47108: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204633.47246: in run() - task 127b8e07-fff9-c964-7471-000000000d70 44071 1727204633.47259: variable 'ansible_search_path' from source: unknown 44071 1727204633.47262: variable 'ansible_search_path' from source: unknown 44071 1727204633.47299: calling self._execute() 44071 1727204633.47387: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204633.47394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204633.47403: variable 'omit' from source: magic vars 44071 1727204633.47979: variable 'ansible_distribution_major_version' from source: facts 44071 1727204633.47984: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204633.48045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204633.48406: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204633.48483: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204633.48570: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204633.48589: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204633.48775: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204633.48808: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204633.48840: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204633.48964: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204633.49018: variable '__network_is_ostree' from source: set_fact 44071 1727204633.49032: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204633.49040: when evaluation is False, skipping this task 44071 1727204633.49049: _execute() done 44071 1727204633.49057: dumping result to json 44071 1727204633.49072: done dumping result, returning 44071 1727204633.49092: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-c964-7471-000000000d70] 44071 1727204633.49102: sending task result for task 127b8e07-fff9-c964-7471-000000000d70 skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204633.49378: no more pending results, returning what we have 44071 1727204633.49382: results queue empty 44071 1727204633.49384: checking for any_errors_fatal 44071 1727204633.49393: done checking for any_errors_fatal 44071 1727204633.49394: checking for max_fail_percentage 44071 1727204633.49396: done checking for max_fail_percentage 44071 1727204633.49397: checking to see if all hosts have failed and the running result is not ok 44071 1727204633.49398: done checking to see if all hosts have failed 44071 1727204633.49398: getting the remaining hosts for this loop 44071 1727204633.49400: done getting the remaining hosts for this loop 44071 1727204633.49671: getting the next task for host managed-node2 44071 1727204633.49686: done getting next task for host managed-node2 44071 1727204633.49692: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204633.49699: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204633.49721: getting variables 44071 1727204633.49722: in VariableManager get_vars() 44071 1727204633.49778: Calling all_inventory to load vars for managed-node2 44071 1727204633.49782: Calling groups_inventory to load vars for managed-node2 44071 1727204633.49785: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204633.49796: Calling all_plugins_play to load vars for managed-node2 44071 1727204633.49799: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204633.49803: Calling groups_plugins_play to load vars for managed-node2 44071 1727204633.50415: done sending task result for task 127b8e07-fff9-c964-7471-000000000d70 44071 1727204633.50420: WORKER PROCESS EXITING 44071 1727204633.52139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204633.54721: done with get_vars() 44071 1727204633.54763: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:03:53 -0400 (0:00:00.084) 0:00:45.866 ***** 44071 1727204633.55009: entering _queue_task() for managed-node2/service_facts 44071 1727204633.55974: worker is 1 (out of 1 available) 44071 1727204633.55989: exiting _queue_task() for managed-node2/service_facts 44071 1727204633.56003: done queuing things up, now waiting for results queue to drain 44071 1727204633.56005: waiting for pending results... 
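
The task queued here, "Check which services are running" at set_facts.yml:21, runs the ansible.builtin.service_facts module; its output appears further down as the ansible_facts.services dictionary, one entry per unit with name, state, status and source keys. A minimal sketch of that pattern is shown below; only the first task mirrors what the log exercises, while the debug task is an illustrative consumer that is not part of the role.

# Hedged sketch of the service_facts step seen in the log.
# The NetworkManager lookup is illustrative only.
- name: Check which services are running
  ansible.builtin.service_facts:

- name: Report NetworkManager state (illustrative only)
  ansible.builtin.debug:
    msg: "NetworkManager.service is {{ ansible_facts.services['NetworkManager.service'].state }}"
  when: "'NetworkManager.service' in ansible_facts.services"

A services map like this is commonly used to decide which provider or backend is actually available on the host; in the output captured below, NetworkManager.service is reported as running while systemd-networkd.service is stopped.
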
44071 1727204633.56409: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204633.56616: in run() - task 127b8e07-fff9-c964-7471-000000000d72 44071 1727204633.56640: variable 'ansible_search_path' from source: unknown 44071 1727204633.56653: variable 'ansible_search_path' from source: unknown 44071 1727204633.56729: calling self._execute() 44071 1727204633.56867: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204633.56909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204633.56913: variable 'omit' from source: magic vars 44071 1727204633.57422: variable 'ansible_distribution_major_version' from source: facts 44071 1727204633.57445: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204633.57475: variable 'omit' from source: magic vars 44071 1727204633.57589: variable 'omit' from source: magic vars 44071 1727204633.57643: variable 'omit' from source: magic vars 44071 1727204633.57708: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204633.57756: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204633.57788: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204633.57828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204633.57851: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204633.57891: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204633.57903: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204633.58026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204633.58064: Set connection var ansible_connection to ssh 44071 1727204633.58080: Set connection var ansible_timeout to 10 44071 1727204633.58092: Set connection var ansible_pipelining to False 44071 1727204633.58103: Set connection var ansible_shell_type to sh 44071 1727204633.58114: Set connection var ansible_shell_executable to /bin/sh 44071 1727204633.58146: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204633.58249: variable 'ansible_shell_executable' from source: unknown 44071 1727204633.58253: variable 'ansible_connection' from source: unknown 44071 1727204633.58256: variable 'ansible_module_compression' from source: unknown 44071 1727204633.58259: variable 'ansible_shell_type' from source: unknown 44071 1727204633.58262: variable 'ansible_shell_executable' from source: unknown 44071 1727204633.58267: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204633.58269: variable 'ansible_pipelining' from source: unknown 44071 1727204633.58272: variable 'ansible_timeout' from source: unknown 44071 1727204633.58275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204633.58577: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204633.58582: variable 'omit' from source: magic vars 44071 
1727204633.58584: starting attempt loop 44071 1727204633.58586: running the handler 44071 1727204633.58588: _low_level_execute_command(): starting 44071 1727204633.58591: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204633.59869: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204633.60127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204633.60131: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204633.60195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204633.60298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204633.62077: stdout chunk (state=3): >>>/root <<< 44071 1727204633.62389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204633.62596: stderr chunk (state=3): >>><<< 44071 1727204633.62600: stdout chunk (state=3): >>><<< 44071 1727204633.62672: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204633.62681: _low_level_execute_command(): starting 44071 1727204633.62691: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204633.626259-46683-252627537449998 `" && echo ansible-tmp-1727204633.626259-46683-252627537449998="` echo 
/root/.ansible/tmp/ansible-tmp-1727204633.626259-46683-252627537449998 `" ) && sleep 0' 44071 1727204633.64210: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204633.64224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204633.64295: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204633.64316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204633.64426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204633.66443: stdout chunk (state=3): >>>ansible-tmp-1727204633.626259-46683-252627537449998=/root/.ansible/tmp/ansible-tmp-1727204633.626259-46683-252627537449998 <<< 44071 1727204633.66607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204633.66704: stderr chunk (state=3): >>><<< 44071 1727204633.66784: stdout chunk (state=3): >>><<< 44071 1727204633.66805: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204633.626259-46683-252627537449998=/root/.ansible/tmp/ansible-tmp-1727204633.626259-46683-252627537449998 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204633.66863: variable 'ansible_module_compression' from source: unknown 44071 1727204633.66913: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 44071 1727204633.66954: variable 'ansible_facts' from source: 
unknown 44071 1727204633.67249: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204633.626259-46683-252627537449998/AnsiballZ_service_facts.py 44071 1727204633.67673: Sending initial data 44071 1727204633.67677: Sent initial data (161 bytes) 44071 1727204633.68734: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204633.69086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204633.69094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204633.69117: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204633.69269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204633.70912: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204633.71009: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204633.71285: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp441bpyf4 /root/.ansible/tmp/ansible-tmp-1727204633.626259-46683-252627537449998/AnsiballZ_service_facts.py <<< 44071 1727204633.71304: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204633.626259-46683-252627537449998/AnsiballZ_service_facts.py" <<< 44071 1727204633.71307: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp441bpyf4" to remote "/root/.ansible/tmp/ansible-tmp-1727204633.626259-46683-252627537449998/AnsiballZ_service_facts.py" <<< 44071 1727204633.71313: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204633.626259-46683-252627537449998/AnsiballZ_service_facts.py" <<< 44071 1727204633.72670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204633.72987: stderr chunk (state=3): >>><<< 44071 1727204633.72991: stdout chunk (state=3): >>><<< 44071 1727204633.73012: done transferring module to remote 44071 1727204633.73025: _low_level_execute_command(): starting 44071 1727204633.73030: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204633.626259-46683-252627537449998/ /root/.ansible/tmp/ansible-tmp-1727204633.626259-46683-252627537449998/AnsiballZ_service_facts.py && sleep 0' 44071 1727204633.74263: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204633.74571: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204633.74602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204633.74606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204633.74721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204633.76558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204633.76659: stderr chunk (state=3): >>><<< 44071 1727204633.76776: stdout chunk (state=3): >>><<< 44071 1727204633.76780: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204633.76783: _low_level_execute_command(): starting 44071 1727204633.76786: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204633.626259-46683-252627537449998/AnsiballZ_service_facts.py && sleep 0' 44071 1727204633.78048: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204633.78064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204633.78089: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204633.78239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204633.78246: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204633.78285: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204633.78293: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204633.78493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204636.01618: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": 
{"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": 
"restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped<<< 44071 1727204636.01669: stdout chunk (state=3): >>>", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": 
{"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": 
"alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.fre<<< 44071 1727204636.01719: stdout chunk (state=3): >>>edesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": 
{"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 44071 1727204636.03367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204636.03385: stdout chunk (state=3): >>><<< 44071 1727204636.03416: stderr chunk (state=3): >>><<< 44071 1727204636.03576: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", 
"state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": 
"systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, 
"systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
44071 1727204636.04532: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204633.626259-46683-252627537449998/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204636.04557: _low_level_execute_command(): starting 44071 1727204636.04589: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204633.626259-46683-252627537449998/ > /dev/null 2>&1 && sleep 0' 44071 1727204636.05376: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204636.05427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204636.05447: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204636.05508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204636.05600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204636.07495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204636.07554: stderr chunk (state=3): >>><<< 44071 1727204636.07558: stdout chunk (state=3): >>><<< 44071 1727204636.07575: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204636.07582: handler run complete 44071 1727204636.07730: variable 'ansible_facts' from source: unknown 44071 1727204636.07870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204636.08218: variable 'ansible_facts' from source: unknown 44071 1727204636.08324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204636.08486: attempt loop complete, returning result 44071 1727204636.08492: _execute() done 44071 1727204636.08496: dumping result to json 44071 1727204636.08537: done dumping result, returning 44071 1727204636.08549: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-c964-7471-000000000d72] 44071 1727204636.08553: sending task result for task 127b8e07-fff9-c964-7471-000000000d72 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204636.09345: done sending task result for task 127b8e07-fff9-c964-7471-000000000d72 44071 1727204636.09349: WORKER PROCESS EXITING 44071 1727204636.09359: no more pending results, returning what we have 44071 1727204636.09361: results queue empty 44071 1727204636.09362: checking for any_errors_fatal 44071 1727204636.09368: done checking for any_errors_fatal 44071 1727204636.09368: checking for max_fail_percentage 44071 1727204636.09369: done checking for max_fail_percentage 44071 1727204636.09370: checking to see if all hosts have failed and the running result is not ok 44071 1727204636.09371: done checking to see if all hosts have failed 44071 1727204636.09371: getting the remaining hosts for this loop 44071 1727204636.09372: done getting the remaining hosts for this loop 44071 1727204636.09375: getting the next task for host managed-node2 44071 1727204636.09380: done getting next task for host managed-node2 44071 1727204636.09383: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204636.09388: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204636.09396: getting variables 44071 1727204636.09397: in VariableManager get_vars() 44071 1727204636.09421: Calling all_inventory to load vars for managed-node2 44071 1727204636.09423: Calling groups_inventory to load vars for managed-node2 44071 1727204636.09425: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204636.09432: Calling all_plugins_play to load vars for managed-node2 44071 1727204636.09434: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204636.09437: Calling groups_plugins_play to load vars for managed-node2 44071 1727204636.10473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204636.11701: done with get_vars() 44071 1727204636.11733: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:03:56 -0400 (0:00:02.568) 0:00:48.434 ***** 44071 1727204636.11816: entering _queue_task() for managed-node2/package_facts 44071 1727204636.12108: worker is 1 (out of 1 available) 44071 1727204636.12125: exiting _queue_task() for managed-node2/package_facts 44071 1727204636.12140: done queuing things up, now waiting for results queue to drain 44071 1727204636.12141: waiting for pending results... 44071 1727204636.12350: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204636.12486: in run() - task 127b8e07-fff9-c964-7471-000000000d73 44071 1727204636.12498: variable 'ansible_search_path' from source: unknown 44071 1727204636.12502: variable 'ansible_search_path' from source: unknown 44071 1727204636.12533: calling self._execute() 44071 1727204636.12622: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204636.12628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204636.12636: variable 'omit' from source: magic vars 44071 1727204636.12956: variable 'ansible_distribution_major_version' from source: facts 44071 1727204636.12968: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204636.12975: variable 'omit' from source: magic vars 44071 1727204636.13039: variable 'omit' from source: magic vars 44071 1727204636.13070: variable 'omit' from source: magic vars 44071 1727204636.13107: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204636.13140: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204636.13161: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204636.13178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204636.13189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204636.13213: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204636.13217: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204636.13220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204636.13303: Set connection var ansible_connection to ssh 44071 1727204636.13309: Set connection var ansible_timeout to 10 44071 1727204636.13314: Set connection var ansible_pipelining to False 44071 1727204636.13320: Set connection var ansible_shell_type to sh 44071 1727204636.13325: Set connection var ansible_shell_executable to /bin/sh 44071 1727204636.13332: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204636.13354: variable 'ansible_shell_executable' from source: unknown 44071 1727204636.13359: variable 'ansible_connection' from source: unknown 44071 1727204636.13362: variable 'ansible_module_compression' from source: unknown 44071 1727204636.13365: variable 'ansible_shell_type' from source: unknown 44071 1727204636.13368: variable 'ansible_shell_executable' from source: unknown 44071 1727204636.13371: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204636.13374: variable 'ansible_pipelining' from source: unknown 44071 1727204636.13376: variable 'ansible_timeout' from source: unknown 44071 1727204636.13384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204636.13547: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204636.13557: variable 'omit' from source: magic vars 44071 1727204636.13561: starting attempt loop 44071 1727204636.13565: running the handler 44071 1727204636.13582: _low_level_execute_command(): starting 44071 1727204636.13589: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204636.14168: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204636.14174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204636.14178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204636.14219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204636.14223: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204636.14225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204636.14301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 
1727204636.16005: stdout chunk (state=3): >>>/root <<< 44071 1727204636.16102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204636.16169: stderr chunk (state=3): >>><<< 44071 1727204636.16173: stdout chunk (state=3): >>><<< 44071 1727204636.16198: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204636.16211: _low_level_execute_command(): starting 44071 1727204636.16222: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204636.1619585-46982-205787469602602 `" && echo ansible-tmp-1727204636.1619585-46982-205787469602602="` echo /root/.ansible/tmp/ansible-tmp-1727204636.1619585-46982-205787469602602 `" ) && sleep 0' 44071 1727204636.16723: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204636.16727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204636.16729: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204636.16739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204636.16742: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204636.16795: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204636.16799: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204636.16803: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204636.16878: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 44071 1727204636.18859: stdout chunk (state=3): >>>ansible-tmp-1727204636.1619585-46982-205787469602602=/root/.ansible/tmp/ansible-tmp-1727204636.1619585-46982-205787469602602 <<< 44071 1727204636.18987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204636.19033: stderr chunk (state=3): >>><<< 44071 1727204636.19037: stdout chunk (state=3): >>><<< 44071 1727204636.19054: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204636.1619585-46982-205787469602602=/root/.ansible/tmp/ansible-tmp-1727204636.1619585-46982-205787469602602 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204636.19099: variable 'ansible_module_compression' from source: unknown 44071 1727204636.19148: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 44071 1727204636.19205: variable 'ansible_facts' from source: unknown 44071 1727204636.19331: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204636.1619585-46982-205787469602602/AnsiballZ_package_facts.py 44071 1727204636.19464: Sending initial data 44071 1727204636.19470: Sent initial data (162 bytes) 44071 1727204636.19948: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204636.19952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204636.19984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204636.19988: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204636.19990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 
1727204636.20057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204636.20060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204636.20070: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204636.20134: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204636.21750: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204636.21812: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204636.21885: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpbv9qv64h /root/.ansible/tmp/ansible-tmp-1727204636.1619585-46982-205787469602602/AnsiballZ_package_facts.py <<< 44071 1727204636.21888: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204636.1619585-46982-205787469602602/AnsiballZ_package_facts.py" <<< 44071 1727204636.21970: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpbv9qv64h" to remote "/root/.ansible/tmp/ansible-tmp-1727204636.1619585-46982-205787469602602/AnsiballZ_package_facts.py" <<< 44071 1727204636.21974: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204636.1619585-46982-205787469602602/AnsiballZ_package_facts.py" <<< 44071 1727204636.23221: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204636.23300: stderr chunk (state=3): >>><<< 44071 1727204636.23304: stdout chunk (state=3): >>><<< 44071 1727204636.23324: done transferring module to remote 44071 1727204636.23335: _low_level_execute_command(): starting 44071 1727204636.23340: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204636.1619585-46982-205787469602602/ /root/.ansible/tmp/ansible-tmp-1727204636.1619585-46982-205787469602602/AnsiballZ_package_facts.py && sleep 0' 44071 1727204636.23831: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204636.23835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44071 1727204636.23839: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204636.23897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204636.23901: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204636.23907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204636.23979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204636.25907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204636.25911: stdout chunk (state=3): >>><<< 44071 1727204636.25914: stderr chunk (state=3): >>><<< 44071 1727204636.25929: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204636.26021: _low_level_execute_command(): starting 44071 1727204636.26026: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204636.1619585-46982-205787469602602/AnsiballZ_package_facts.py && sleep 0' 44071 1727204636.26593: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204636.26608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 
1727204636.26660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204636.26681: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204636.26759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204636.89435: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": 
"amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": 
"libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 44071 1727204636.89572: stdout chunk (state=3): >>>systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", 
"version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": 
"7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": 
"pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", 
"release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": 
"grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": 
"libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": 
[{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_<<< 44071 1727204636.89635: stdout chunk (state=3): >>>64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", 
"version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", 
"version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoc<<< 44071 1727204636.89660: stdout chunk (state=3): >>>h": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", 
"release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", 
"release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 44071 1727204636.91599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204636.91617: stderr chunk (state=3): >>><<< 44071 1727204636.91629: stdout chunk (state=3): >>><<< 44071 1727204636.91685: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": 
"5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", 
"version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", 
"version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": 
"6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": 
"2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", 
"version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": 
"perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": 
[{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": 
"libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": 
"dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
44071 1727204636.95068: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204636.1619585-46982-205787469602602/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204636.95112: _low_level_execute_command(): starting 44071 1727204636.95124: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204636.1619585-46982-205787469602602/ > /dev/null 2>&1 && sleep 0' 44071 1727204636.95882: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204636.95973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204636.96021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204636.96043: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204636.96105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204636.96193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204636.98373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204636.98377: stderr chunk (state=3): >>><<< 44071 1727204636.98380: stdout chunk (state=3): >>><<< 44071 1727204636.98383: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: 
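The stderr chunks above are OpenSSH's own debug output: the cleanup command reuses an existing ControlMaster socket under /root/.ansible/cp/ ("auto-mux: Trying existing master"), so removing the remote temp directory only needs a multiplexed session rather than a fresh SSH handshake. That matches the ssh connection plugin's default behaviour; if it had to be set explicitly it would typically go through a host or group variable such as ansible_ssh_extra_args. The options below are illustrative and are not taken from this run's inventory:

ansible_ssh_extra_args: "-o ControlMaster=auto -o ControlPersist=60s"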
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204636.98385: handler run complete 44071 1727204636.99564: variable 'ansible_facts' from source: unknown 44071 1727204637.00244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204637.03213: variable 'ansible_facts' from source: unknown 44071 1727204637.03859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204637.04907: attempt loop complete, returning result 44071 1727204637.04929: _execute() done 44071 1727204637.04933: dumping result to json 44071 1727204637.05247: done dumping result, returning 44071 1727204637.05259: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-c964-7471-000000000d73] 44071 1727204637.05276: sending task result for task 127b8e07-fff9-c964-7471-000000000d73 44071 1727204637.10692: done sending task result for task 127b8e07-fff9-c964-7471-000000000d73 44071 1727204637.10696: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204637.10994: no more pending results, returning what we have 44071 1727204637.10997: results queue empty 44071 1727204637.10998: checking for any_errors_fatal 44071 1727204637.11005: done checking for any_errors_fatal 44071 1727204637.11006: checking for max_fail_percentage 44071 1727204637.11007: done checking for max_fail_percentage 44071 1727204637.11008: checking to see if all hosts have failed and the running result is not ok 44071 1727204637.11009: done checking to see if all hosts have failed 44071 1727204637.11010: getting the remaining hosts for this loop 44071 1727204637.11011: done getting the remaining hosts for this loop 44071 1727204637.11016: getting the next task for host managed-node2 44071 1727204637.11024: done getting next task for host managed-node2 44071 1727204637.11028: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204637.11033: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 44071 1727204637.11049: getting variables 44071 1727204637.11051: in VariableManager get_vars() 44071 1727204637.11089: Calling all_inventory to load vars for managed-node2 44071 1727204637.11092: Calling groups_inventory to load vars for managed-node2 44071 1727204637.11095: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204637.11106: Calling all_plugins_play to load vars for managed-node2 44071 1727204637.11109: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204637.11112: Calling groups_plugins_play to load vars for managed-node2 44071 1727204637.15208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204637.17781: done with get_vars() 44071 1727204637.17826: done getting variables 44071 1727204637.17905: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:03:57 -0400 (0:00:01.061) 0:00:49.496 ***** 44071 1727204637.17958: entering _queue_task() for managed-node2/debug 44071 1727204637.18375: worker is 1 (out of 1 available) 44071 1727204637.18503: exiting _queue_task() for managed-node2/debug 44071 1727204637.18516: done queuing things up, now waiting for results queue to drain 44071 1727204637.18517: waiting for pending results... 
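For reference, the censored package_facts result above is the normal effect of no_log on a task: the module ran, returned facts, and the result payload was replaced with the "censored" notice. A minimal sketch of a task shape consistent with this trace; only the task name, the package_facts module, and no_log are confirmed by the log, and the module arguments are omitted because they are not visible here.

- name: Check which packages are installed
  ansible.builtin.package_facts: {}   # arguments not visible in the trace above
  no_log: true                        # matches "_ansible_no_log": True and the "censored" result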
44071 1727204637.18783: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204637.18965: in run() - task 127b8e07-fff9-c964-7471-000000000d17 44071 1727204637.19037: variable 'ansible_search_path' from source: unknown 44071 1727204637.19041: variable 'ansible_search_path' from source: unknown 44071 1727204637.19047: calling self._execute() 44071 1727204637.19159: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204637.19174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204637.19190: variable 'omit' from source: magic vars 44071 1727204637.19644: variable 'ansible_distribution_major_version' from source: facts 44071 1727204637.19664: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204637.19693: variable 'omit' from source: magic vars 44071 1727204637.19769: variable 'omit' from source: magic vars 44071 1727204637.19911: variable 'network_provider' from source: set_fact 44071 1727204637.19926: variable 'omit' from source: magic vars 44071 1727204637.20020: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204637.20032: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204637.20064: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204637.20091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204637.20109: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204637.20155: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204637.20165: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204637.20239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204637.20299: Set connection var ansible_connection to ssh 44071 1727204637.20311: Set connection var ansible_timeout to 10 44071 1727204637.20321: Set connection var ansible_pipelining to False 44071 1727204637.20331: Set connection var ansible_shell_type to sh 44071 1727204637.20351: Set connection var ansible_shell_executable to /bin/sh 44071 1727204637.20367: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204637.20398: variable 'ansible_shell_executable' from source: unknown 44071 1727204637.20406: variable 'ansible_connection' from source: unknown 44071 1727204637.20413: variable 'ansible_module_compression' from source: unknown 44071 1727204637.20419: variable 'ansible_shell_type' from source: unknown 44071 1727204637.20427: variable 'ansible_shell_executable' from source: unknown 44071 1727204637.20433: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204637.20440: variable 'ansible_pipelining' from source: unknown 44071 1727204637.20567: variable 'ansible_timeout' from source: unknown 44071 1727204637.20572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204637.20627: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 44071 1727204637.20648: variable 'omit' from source: magic vars 44071 1727204637.20658: starting attempt loop 44071 1727204637.20664: running the handler 44071 1727204637.20724: handler run complete 44071 1727204637.20748: attempt loop complete, returning result 44071 1727204637.20755: _execute() done 44071 1727204637.20763: dumping result to json 44071 1727204637.20772: done dumping result, returning 44071 1727204637.20791: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-c964-7471-000000000d17] 44071 1727204637.20893: sending task result for task 127b8e07-fff9-c964-7471-000000000d17 44071 1727204637.20976: done sending task result for task 127b8e07-fff9-c964-7471-000000000d17 44071 1727204637.20980: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 44071 1727204637.21074: no more pending results, returning what we have 44071 1727204637.21078: results queue empty 44071 1727204637.21079: checking for any_errors_fatal 44071 1727204637.21089: done checking for any_errors_fatal 44071 1727204637.21090: checking for max_fail_percentage 44071 1727204637.21091: done checking for max_fail_percentage 44071 1727204637.21092: checking to see if all hosts have failed and the running result is not ok 44071 1727204637.21093: done checking to see if all hosts have failed 44071 1727204637.21094: getting the remaining hosts for this loop 44071 1727204637.21095: done getting the remaining hosts for this loop 44071 1727204637.21101: getting the next task for host managed-node2 44071 1727204637.21111: done getting next task for host managed-node2 44071 1727204637.21115: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204637.21121: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204637.21136: getting variables 44071 1727204637.21138: in VariableManager get_vars() 44071 1727204637.21292: Calling all_inventory to load vars for managed-node2 44071 1727204637.21295: Calling groups_inventory to load vars for managed-node2 44071 1727204637.21297: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204637.21312: Calling all_plugins_play to load vars for managed-node2 44071 1727204637.21315: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204637.21318: Calling groups_plugins_play to load vars for managed-node2 44071 1727204637.23539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204637.25874: done with get_vars() 44071 1727204637.25918: done getting variables 44071 1727204637.25996: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.080) 0:00:49.576 ***** 44071 1727204637.26045: entering _queue_task() for managed-node2/fail 44071 1727204637.26461: worker is 1 (out of 1 available) 44071 1727204637.26623: exiting _queue_task() for managed-node2/fail 44071 1727204637.26636: done queuing things up, now waiting for results queue to drain 44071 1727204637.26637: waiting for pending results... 
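The ok result above with MSG "Using network provider: nm" is produced by a plain debug task; network_provider comes from an earlier set_fact ("from source: set_fact" in the trace). A hedged sketch of that task follows; the exact message template is assumed, and the recurring ansible_distribution_major_version != '6' guard evaluated before every task in this run is left out of the sketches in this section.

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"   # assumed template; this run printed "Using network provider: nm"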
44071 1727204637.26882: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204637.27036: in run() - task 127b8e07-fff9-c964-7471-000000000d18 44071 1727204637.27083: variable 'ansible_search_path' from source: unknown 44071 1727204637.27087: variable 'ansible_search_path' from source: unknown 44071 1727204637.27123: calling self._execute() 44071 1727204637.27268: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204637.27274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204637.27278: variable 'omit' from source: magic vars 44071 1727204637.27812: variable 'ansible_distribution_major_version' from source: facts 44071 1727204637.27817: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204637.27933: variable 'network_state' from source: role '' defaults 44071 1727204637.27960: Evaluated conditional (network_state != {}): False 44071 1727204637.27971: when evaluation is False, skipping this task 44071 1727204637.27979: _execute() done 44071 1727204637.27987: dumping result to json 44071 1727204637.27994: done dumping result, returning 44071 1727204637.28007: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-c964-7471-000000000d18] 44071 1727204637.28017: sending task result for task 127b8e07-fff9-c964-7471-000000000d18 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204637.28220: no more pending results, returning what we have 44071 1727204637.28224: results queue empty 44071 1727204637.28225: checking for any_errors_fatal 44071 1727204637.28233: done checking for any_errors_fatal 44071 1727204637.28234: checking for max_fail_percentage 44071 1727204637.28235: done checking for max_fail_percentage 44071 1727204637.28237: checking to see if all hosts have failed and the running result is not ok 44071 1727204637.28237: done checking to see if all hosts have failed 44071 1727204637.28238: getting the remaining hosts for this loop 44071 1727204637.28240: done getting the remaining hosts for this loop 44071 1727204637.28248: getting the next task for host managed-node2 44071 1727204637.28259: done getting next task for host managed-node2 44071 1727204637.28266: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204637.28274: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204637.28300: getting variables 44071 1727204637.28303: in VariableManager get_vars() 44071 1727204637.28350: Calling all_inventory to load vars for managed-node2 44071 1727204637.28354: Calling groups_inventory to load vars for managed-node2 44071 1727204637.28357: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204637.28582: Calling all_plugins_play to load vars for managed-node2 44071 1727204637.28587: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204637.28687: Calling groups_plugins_play to load vars for managed-node2 44071 1727204637.29304: done sending task result for task 127b8e07-fff9-c964-7471-000000000d18 44071 1727204637.29309: WORKER PROCESS EXITING 44071 1727204637.30607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204637.32997: done with get_vars() 44071 1727204637.33052: done getting variables 44071 1727204637.33129: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.071) 0:00:49.648 ***** 44071 1727204637.33176: entering _queue_task() for managed-node2/fail 44071 1727204637.33807: worker is 1 (out of 1 available) 44071 1727204637.33819: exiting _queue_task() for managed-node2/fail 44071 1727204637.33832: done queuing things up, now waiting for results queue to drain 44071 1727204637.33833: waiting for pending results... 
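The skip above shows the guard pattern used by these abort tasks: a fail task whose when condition evaluated False, reported with false_condition and skip_reason. A sketch of the initscripts abort consistent with the trace; the failure message is hypothetical, and any further conditions would not appear in the log because evaluation stopped at the first false one.

- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying the network state configuration is not supported with the initscripts provider   # hypothetical wording
  when: network_state != {}   # evaluated False above, hence the skip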
44071 1727204637.34009: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204637.34198: in run() - task 127b8e07-fff9-c964-7471-000000000d19 44071 1727204637.34221: variable 'ansible_search_path' from source: unknown 44071 1727204637.34229: variable 'ansible_search_path' from source: unknown 44071 1727204637.34286: calling self._execute() 44071 1727204637.34399: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204637.34417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204637.34434: variable 'omit' from source: magic vars 44071 1727204637.34892: variable 'ansible_distribution_major_version' from source: facts 44071 1727204637.34914: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204637.35078: variable 'network_state' from source: role '' defaults 44071 1727204637.35098: Evaluated conditional (network_state != {}): False 44071 1727204637.35107: when evaluation is False, skipping this task 44071 1727204637.35114: _execute() done 44071 1727204637.35122: dumping result to json 44071 1727204637.35129: done dumping result, returning 44071 1727204637.35148: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-c964-7471-000000000d19] 44071 1727204637.35256: sending task result for task 127b8e07-fff9-c964-7471-000000000d19 44071 1727204637.35351: done sending task result for task 127b8e07-fff9-c964-7471-000000000d19 44071 1727204637.35354: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204637.35423: no more pending results, returning what we have 44071 1727204637.35427: results queue empty 44071 1727204637.35428: checking for any_errors_fatal 44071 1727204637.35439: done checking for any_errors_fatal 44071 1727204637.35440: checking for max_fail_percentage 44071 1727204637.35444: done checking for max_fail_percentage 44071 1727204637.35446: checking to see if all hosts have failed and the running result is not ok 44071 1727204637.35446: done checking to see if all hosts have failed 44071 1727204637.35447: getting the remaining hosts for this loop 44071 1727204637.35449: done getting the remaining hosts for this loop 44071 1727204637.35455: getting the next task for host managed-node2 44071 1727204637.35468: done getting next task for host managed-node2 44071 1727204637.35472: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204637.35480: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204637.35508: getting variables 44071 1727204637.35510: in VariableManager get_vars() 44071 1727204637.35559: Calling all_inventory to load vars for managed-node2 44071 1727204637.35562: Calling groups_inventory to load vars for managed-node2 44071 1727204637.35564: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204637.35698: Calling all_plugins_play to load vars for managed-node2 44071 1727204637.35702: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204637.35705: Calling groups_plugins_play to load vars for managed-node2 44071 1727204637.37900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204637.40269: done with get_vars() 44071 1727204637.40311: done getting variables 44071 1727204637.40389: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.072) 0:00:49.720 ***** 44071 1727204637.40428: entering _queue_task() for managed-node2/fail 44071 1727204637.40854: worker is 1 (out of 1 available) 44071 1727204637.41077: exiting _queue_task() for managed-node2/fail 44071 1727204637.41091: done queuing things up, now waiting for results queue to drain 44071 1727204637.41092: waiting for pending results... 
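The "below 8" abort above skipped on the same first condition. When `when` is a list, the conditions are ANDed and evaluated in order, and Ansible stops at the first one that is false, which is why only network_state != {} shows up as the false_condition. A sketch of that pattern; the second condition and the message are inferred from the task name, not from the trace.

- name: Abort applying the network state configuration if the system version of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying the network state configuration requires EL 8 or later   # hypothetical wording
  when:
    - network_state != {}                            # evaluated False above, so the task was skipped here
    - ansible_distribution_major_version | int < 8   # assumed from the task name; never reached in this run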
44071 1727204637.41290: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204637.41436: in run() - task 127b8e07-fff9-c964-7471-000000000d1a 44071 1727204637.41459: variable 'ansible_search_path' from source: unknown 44071 1727204637.41510: variable 'ansible_search_path' from source: unknown 44071 1727204637.41521: calling self._execute() 44071 1727204637.41650: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204637.41664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204637.41681: variable 'omit' from source: magic vars 44071 1727204637.42126: variable 'ansible_distribution_major_version' from source: facts 44071 1727204637.42166: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204637.42471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204637.45215: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204637.45317: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204637.45373: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204637.45425: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204637.45462: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204637.45580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204637.45602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204637.45628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204637.45679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204637.45691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204637.45783: variable 'ansible_distribution_major_version' from source: facts 44071 1727204637.45799: Evaluated conditional (ansible_distribution_major_version | int > 9): True 44071 1727204637.45907: variable 'ansible_distribution' from source: facts 44071 1727204637.45910: variable '__network_rh_distros' from source: role '' defaults 44071 1727204637.45920: Evaluated conditional (ansible_distribution in __network_rh_distros): False 44071 1727204637.45923: when evaluation is False, skipping this task 44071 1727204637.45926: _execute() done 44071 1727204637.45928: dumping result to json 44071 1727204637.45932: done dumping result, returning 44071 1727204637.45940: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-c964-7471-000000000d1a] 44071 1727204637.45946: sending task result for task 127b8e07-fff9-c964-7471-000000000d1a 44071 1727204637.46052: done sending task result for task 127b8e07-fff9-c964-7471-000000000d1a 44071 1727204637.46057: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 44071 1727204637.46108: no more pending results, returning what we have 44071 1727204637.46112: results queue empty 44071 1727204637.46113: checking for any_errors_fatal 44071 1727204637.46122: done checking for any_errors_fatal 44071 1727204637.46123: checking for max_fail_percentage 44071 1727204637.46124: done checking for max_fail_percentage 44071 1727204637.46125: checking to see if all hosts have failed and the running result is not ok 44071 1727204637.46126: done checking to see if all hosts have failed 44071 1727204637.46127: getting the remaining hosts for this loop 44071 1727204637.46129: done getting the remaining hosts for this loop 44071 1727204637.46134: getting the next task for host managed-node2 44071 1727204637.46145: done getting next task for host managed-node2 44071 1727204637.46150: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204637.46155: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204637.46179: getting variables 44071 1727204637.46181: in VariableManager get_vars() 44071 1727204637.46221: Calling all_inventory to load vars for managed-node2 44071 1727204637.46223: Calling groups_inventory to load vars for managed-node2 44071 1727204637.46225: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204637.46236: Calling all_plugins_play to load vars for managed-node2 44071 1727204637.46239: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204637.46244: Calling groups_plugins_play to load vars for managed-node2 44071 1727204637.47452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204637.49325: done with get_vars() 44071 1727204637.49360: done getting variables 44071 1727204637.49419: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.090) 0:00:49.810 ***** 44071 1727204637.49449: entering _queue_task() for managed-node2/dnf 44071 1727204637.49761: worker is 1 (out of 1 available) 44071 1727204637.49779: exiting _queue_task() for managed-node2/dnf 44071 1727204637.49794: done queuing things up, now waiting for results queue to drain 44071 1727204637.49796: waiting for pending results... 
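For the teaming abort above, the trace shows two conditions being evaluated: the version check passed and the distribution check failed, so the task was skipped. A sketch reflecting exactly those two conditions; the message is hypothetical, and any additional conditions would not be visible because evaluation stopped at the first false one.

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on this platform   # hypothetical wording
  when:
    - ansible_distribution_major_version | int > 9   # evaluated True above
    - ansible_distribution in __network_rh_distros   # evaluated False above, so the task was skipped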
44071 1727204637.50005: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204637.50113: in run() - task 127b8e07-fff9-c964-7471-000000000d1b 44071 1727204637.50126: variable 'ansible_search_path' from source: unknown 44071 1727204637.50130: variable 'ansible_search_path' from source: unknown 44071 1727204637.50169: calling self._execute() 44071 1727204637.50247: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204637.50262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204637.50276: variable 'omit' from source: magic vars 44071 1727204637.50598: variable 'ansible_distribution_major_version' from source: facts 44071 1727204637.50609: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204637.50770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204637.53248: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204637.53292: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204637.53333: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204637.53455: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204637.53459: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204637.53491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204637.53524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204637.53576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204637.53596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204637.53617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204637.53758: variable 'ansible_distribution' from source: facts 44071 1727204637.53871: variable 'ansible_distribution_major_version' from source: facts 44071 1727204637.53874: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 44071 1727204637.53910: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204637.54057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204637.54105: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204637.54135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204637.54184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204637.54204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204637.54255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204637.54288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204637.54317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204637.54363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204637.54385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204637.54435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204637.54464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204637.54498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204637.54543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204637.54563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204637.54871: variable 'network_connections' from source: include params 44071 1727204637.54874: variable 'interface' from source: play vars 44071 1727204637.54876: variable 'interface' from source: play vars 44071 1727204637.54934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204637.55147: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204637.55206: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204637.55246: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204637.55282: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204637.55338: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204637.55370: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204637.55411: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204637.55442: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204637.55502: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204637.55784: variable 'network_connections' from source: include params 44071 1727204637.55797: variable 'interface' from source: play vars 44071 1727204637.55873: variable 'interface' from source: play vars 44071 1727204637.55907: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204637.55915: when evaluation is False, skipping this task 44071 1727204637.55923: _execute() done 44071 1727204637.55930: dumping result to json 44071 1727204637.55938: done dumping result, returning 44071 1727204637.55950: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000000d1b] 44071 1727204637.55959: sending task result for task 127b8e07-fff9-c964-7471-000000000d1b skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204637.56155: no more pending results, returning what we have 44071 1727204637.56158: results queue empty 44071 1727204637.56159: checking for any_errors_fatal 44071 1727204637.56171: done checking for any_errors_fatal 44071 1727204637.56171: checking for max_fail_percentage 44071 1727204637.56174: done checking for max_fail_percentage 44071 1727204637.56174: checking to see if all hosts have failed and the running result is not ok 44071 1727204637.56175: done checking to see if all hosts have failed 44071 1727204637.56176: getting the remaining hosts for this loop 44071 1727204637.56177: done getting the remaining hosts for this loop 44071 1727204637.56182: getting the next task for host managed-node2 44071 1727204637.56191: done getting next task for host managed-node2 44071 1727204637.56196: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204637.56201: ^ state is: 
HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204637.56228: getting variables 44071 1727204637.56230: in VariableManager get_vars() 44071 1727204637.56481: Calling all_inventory to load vars for managed-node2 44071 1727204637.56485: Calling groups_inventory to load vars for managed-node2 44071 1727204637.56487: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204637.56494: done sending task result for task 127b8e07-fff9-c964-7471-000000000d1b 44071 1727204637.56497: WORKER PROCESS EXITING 44071 1727204637.56507: Calling all_plugins_play to load vars for managed-node2 44071 1727204637.56511: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204637.56514: Calling groups_plugins_play to load vars for managed-node2 44071 1727204637.58418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204637.60897: done with get_vars() 44071 1727204637.60954: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204637.61048: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.116) 0:00:49.927 ***** 44071 1727204637.61093: entering _queue_task() for managed-node2/yum 44071 1727204637.61632: worker is 1 (out of 1 available) 44071 1727204637.61651: exiting _queue_task() for managed-node2/yum 44071 1727204637.61664: done queuing things up, now waiting for results queue to drain 44071 1727204637.61668: waiting for pending results... 
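The DNF check above is a dnf task gated on the distribution family and on whether any wireless or team connections are defined; both guards appear verbatim in the trace as evaluated conditionals. A sketch of that shape; the module arguments are assumptions (a network_packages role default is referenced later in the trace, but its use in this task is not shown).

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"   # assumed argument
    state: latest                    # assumed argument
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7   # evaluated True above
    - __network_wireless_connections_defined or __network_team_connections_defined       # evaluated False above, so skipped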
44071 1727204637.61988: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204637.62221: in run() - task 127b8e07-fff9-c964-7471-000000000d1c 44071 1727204637.62226: variable 'ansible_search_path' from source: unknown 44071 1727204637.62229: variable 'ansible_search_path' from source: unknown 44071 1727204637.62257: calling self._execute() 44071 1727204637.62403: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204637.62417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204637.62445: variable 'omit' from source: magic vars 44071 1727204637.62988: variable 'ansible_distribution_major_version' from source: facts 44071 1727204637.63073: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204637.63269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204637.66393: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204637.66516: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204637.66578: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204637.66619: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204637.66656: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204637.66770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204637.66971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204637.66975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204637.66978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204637.66980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204637.67060: variable 'ansible_distribution_major_version' from source: facts 44071 1727204637.67099: Evaluated conditional (ansible_distribution_major_version | int < 8): False 44071 1727204637.67113: when evaluation is False, skipping this task 44071 1727204637.67121: _execute() done 44071 1727204637.67130: dumping result to json 44071 1727204637.67138: done dumping result, returning 44071 1727204637.67156: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000000d1c] 44071 
1727204637.67171: sending task result for task 127b8e07-fff9-c964-7471-000000000d1c 44071 1727204637.67486: done sending task result for task 127b8e07-fff9-c964-7471-000000000d1c 44071 1727204637.67489: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 44071 1727204637.67552: no more pending results, returning what we have 44071 1727204637.67556: results queue empty 44071 1727204637.67558: checking for any_errors_fatal 44071 1727204637.67569: done checking for any_errors_fatal 44071 1727204637.67570: checking for max_fail_percentage 44071 1727204637.67572: done checking for max_fail_percentage 44071 1727204637.67573: checking to see if all hosts have failed and the running result is not ok 44071 1727204637.67574: done checking to see if all hosts have failed 44071 1727204637.67575: getting the remaining hosts for this loop 44071 1727204637.67584: done getting the remaining hosts for this loop 44071 1727204637.67590: getting the next task for host managed-node2 44071 1727204637.67599: done getting next task for host managed-node2 44071 1727204637.67604: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204637.67609: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204637.67635: getting variables 44071 1727204637.67637: in VariableManager get_vars() 44071 1727204637.67823: Calling all_inventory to load vars for managed-node2 44071 1727204637.67827: Calling groups_inventory to load vars for managed-node2 44071 1727204637.67830: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204637.67844: Calling all_plugins_play to load vars for managed-node2 44071 1727204637.67848: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204637.67851: Calling groups_plugins_play to load vars for managed-node2 44071 1727204637.70062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204637.72545: done with get_vars() 44071 1727204637.72589: done getting variables 44071 1727204637.72659: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.116) 0:00:50.043 ***** 44071 1727204637.72699: entering _queue_task() for managed-node2/fail 44071 1727204637.73083: worker is 1 (out of 1 available) 44071 1727204637.73099: exiting _queue_task() for managed-node2/fail 44071 1727204637.73113: done queuing things up, now waiting for results queue to drain 44071 1727204637.73115: waiting for pending results... 
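The companion YUM check above is the inverse version gate, and the trace notes that ansible.builtin.yum is redirected to the ansible.builtin.dnf action plugin on this controller. A sketch mirroring the DNF one; the module arguments are again assumptions.

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:                # resolved to the dnf action, per the "redirecting (type: action)" line above
    name: "{{ network_packages }}"    # assumed argument
    state: latest                     # assumed argument
  when: ansible_distribution_major_version | int < 8   # evaluated False above, so the task was skipped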
44071 1727204637.73347: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204637.73455: in run() - task 127b8e07-fff9-c964-7471-000000000d1d 44071 1727204637.73473: variable 'ansible_search_path' from source: unknown 44071 1727204637.73477: variable 'ansible_search_path' from source: unknown 44071 1727204637.73510: calling self._execute() 44071 1727204637.73603: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204637.73608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204637.73617: variable 'omit' from source: magic vars 44071 1727204637.73951: variable 'ansible_distribution_major_version' from source: facts 44071 1727204637.73962: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204637.74066: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204637.74220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204637.76496: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204637.76681: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204637.76685: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204637.76687: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204637.76704: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204637.76798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204637.76837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204637.76874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204637.76923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204637.76944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204637.77007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204637.77038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204637.77071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204637.77119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204637.77141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204637.77198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204637.77216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204637.77261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204637.77291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204637.77303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204637.77448: variable 'network_connections' from source: include params 44071 1727204637.77459: variable 'interface' from source: play vars 44071 1727204637.77528: variable 'interface' from source: play vars 44071 1727204637.77591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204637.77738: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204637.77772: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204637.77802: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204637.77825: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204637.77862: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204637.77882: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204637.77900: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204637.77922: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204637.77968: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204637.78156: variable 'network_connections' 
from source: include params 44071 1727204637.78162: variable 'interface' from source: play vars 44071 1727204637.78216: variable 'interface' from source: play vars 44071 1727204637.78237: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204637.78244: when evaluation is False, skipping this task 44071 1727204637.78247: _execute() done 44071 1727204637.78249: dumping result to json 44071 1727204637.78253: done dumping result, returning 44071 1727204637.78259: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000000d1d] 44071 1727204637.78263: sending task result for task 127b8e07-fff9-c964-7471-000000000d1d 44071 1727204637.78372: done sending task result for task 127b8e07-fff9-c964-7471-000000000d1d 44071 1727204637.78376: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204637.78432: no more pending results, returning what we have 44071 1727204637.78436: results queue empty 44071 1727204637.78437: checking for any_errors_fatal 44071 1727204637.78448: done checking for any_errors_fatal 44071 1727204637.78449: checking for max_fail_percentage 44071 1727204637.78450: done checking for max_fail_percentage 44071 1727204637.78451: checking to see if all hosts have failed and the running result is not ok 44071 1727204637.78452: done checking to see if all hosts have failed 44071 1727204637.78452: getting the remaining hosts for this loop 44071 1727204637.78454: done getting the remaining hosts for this loop 44071 1727204637.78459: getting the next task for host managed-node2 44071 1727204637.78470: done getting next task for host managed-node2 44071 1727204637.78474: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 44071 1727204637.78479: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204637.78503: getting variables 44071 1727204637.78504: in VariableManager get_vars() 44071 1727204637.78544: Calling all_inventory to load vars for managed-node2 44071 1727204637.78547: Calling groups_inventory to load vars for managed-node2 44071 1727204637.78549: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204637.78560: Calling all_plugins_play to load vars for managed-node2 44071 1727204637.78562: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204637.78568: Calling groups_plugins_play to load vars for managed-node2 44071 1727204637.85334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204637.86533: done with get_vars() 44071 1727204637.86569: done getting variables 44071 1727204637.86612: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.139) 0:00:50.182 ***** 44071 1727204637.86636: entering _queue_task() for managed-node2/package 44071 1727204637.86941: worker is 1 (out of 1 available) 44071 1727204637.86957: exiting _queue_task() for managed-node2/package 44071 1727204637.86972: done queuing things up, now waiting for results queue to drain 44071 1727204637.86974: waiting for pending results... 44071 1727204637.87191: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 44071 1727204637.87325: in run() - task 127b8e07-fff9-c964-7471-000000000d1e 44071 1727204637.87338: variable 'ansible_search_path' from source: unknown 44071 1727204637.87343: variable 'ansible_search_path' from source: unknown 44071 1727204637.87380: calling self._execute() 44071 1727204637.87471: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204637.87477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204637.87487: variable 'omit' from source: magic vars 44071 1727204637.87833: variable 'ansible_distribution_major_version' from source: facts 44071 1727204637.87844: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204637.88008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204637.88234: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204637.88277: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204637.88344: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204637.88387: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204637.88489: variable 'network_packages' from source: role '' defaults 44071 1727204637.88579: variable '__network_provider_setup' from source: role '' defaults 44071 1727204637.88590: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204637.88644: variable 
'__network_service_name_default_nm' from source: role '' defaults 44071 1727204637.88655: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204637.88701: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204637.88844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204637.90412: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204637.90466: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204637.90498: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204637.90523: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204637.90555: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204637.90625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204637.90649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204637.90670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204637.90702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204637.90714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204637.90752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204637.90771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204637.90789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204637.90822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204637.90834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204637.91005: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204637.91096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204637.91115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204637.91136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204637.91169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204637.91180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204637.91255: variable 'ansible_python' from source: facts 44071 1727204637.91272: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204637.91333: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204637.91400: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204637.91499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204637.91518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204637.91535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204637.91573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204637.91583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204637.91621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204637.91643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204637.91663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204637.91696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204637.91707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204637.91822: variable 'network_connections' from source: include params 44071 1727204637.91829: variable 'interface' from source: play vars 44071 1727204637.91917: variable 'interface' from source: play vars 44071 1727204637.91983: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204637.92008: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204637.92032: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204637.92058: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204637.92101: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204637.92315: variable 'network_connections' from source: include params 44071 1727204637.92318: variable 'interface' from source: play vars 44071 1727204637.92404: variable 'interface' from source: play vars 44071 1727204637.92433: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204637.92499: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204637.92720: variable 'network_connections' from source: include params 44071 1727204637.92723: variable 'interface' from source: play vars 44071 1727204637.92780: variable 'interface' from source: play vars 44071 1727204637.92797: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204637.92856: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204637.93079: variable 'network_connections' from source: include params 44071 1727204637.93083: variable 'interface' from source: play vars 44071 1727204637.93134: variable 'interface' from source: play vars 44071 1727204637.93180: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204637.93228: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204637.93234: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204637.93282: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204637.93437: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204637.93787: variable 'network_connections' from source: include params 44071 1727204637.93791: variable 'interface' from source: play vars 44071 1727204637.93838: variable 'interface' from source: play vars 44071 1727204637.93848: variable 'ansible_distribution' from source: facts 44071 1727204637.93851: variable '__network_rh_distros' from source: role '' defaults 44071 1727204637.93860: variable 'ansible_distribution_major_version' from source: facts 44071 1727204637.93874: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204637.93997: variable 'ansible_distribution' from source: 
facts 44071 1727204637.94000: variable '__network_rh_distros' from source: role '' defaults 44071 1727204637.94006: variable 'ansible_distribution_major_version' from source: facts 44071 1727204637.94012: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204637.94132: variable 'ansible_distribution' from source: facts 44071 1727204637.94136: variable '__network_rh_distros' from source: role '' defaults 44071 1727204637.94141: variable 'ansible_distribution_major_version' from source: facts 44071 1727204637.94171: variable 'network_provider' from source: set_fact 44071 1727204637.94185: variable 'ansible_facts' from source: unknown 44071 1727204637.94788: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 44071 1727204637.94793: when evaluation is False, skipping this task 44071 1727204637.94795: _execute() done 44071 1727204637.94798: dumping result to json 44071 1727204637.94799: done dumping result, returning 44071 1727204637.94807: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-c964-7471-000000000d1e] 44071 1727204637.94811: sending task result for task 127b8e07-fff9-c964-7471-000000000d1e 44071 1727204637.94929: done sending task result for task 127b8e07-fff9-c964-7471-000000000d1e 44071 1727204637.94932: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 44071 1727204637.95000: no more pending results, returning what we have 44071 1727204637.95004: results queue empty 44071 1727204637.95005: checking for any_errors_fatal 44071 1727204637.95012: done checking for any_errors_fatal 44071 1727204637.95012: checking for max_fail_percentage 44071 1727204637.95014: done checking for max_fail_percentage 44071 1727204637.95015: checking to see if all hosts have failed and the running result is not ok 44071 1727204637.95015: done checking to see if all hosts have failed 44071 1727204637.95016: getting the remaining hosts for this loop 44071 1727204637.95018: done getting the remaining hosts for this loop 44071 1727204637.95022: getting the next task for host managed-node2 44071 1727204637.95030: done getting next task for host managed-node2 44071 1727204637.95034: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204637.95039: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204637.95063: getting variables 44071 1727204637.95064: in VariableManager get_vars() 44071 1727204637.95110: Calling all_inventory to load vars for managed-node2 44071 1727204637.95113: Calling groups_inventory to load vars for managed-node2 44071 1727204637.95115: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204637.95126: Calling all_plugins_play to load vars for managed-node2 44071 1727204637.95128: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204637.95131: Calling groups_plugins_play to load vars for managed-node2 44071 1727204637.96172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204637.97411: done with get_vars() 44071 1727204637.97443: done getting variables 44071 1727204637.97498: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.108) 0:00:50.291 ***** 44071 1727204637.97526: entering _queue_task() for managed-node2/package 44071 1727204637.97830: worker is 1 (out of 1 available) 44071 1727204637.97847: exiting _queue_task() for managed-node2/package 44071 1727204637.97863: done queuing things up, now waiting for results queue to drain 44071 1727204637.97865: waiting for pending results... 
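The "Install packages" task above is skipped because the conditional (not network_packages is subset(ansible_facts.packages.keys())) evaluated False, i.e. every package the role wants is already listed in the gathered package facts. A minimal Python stand-in for that decision is sketched below; it is not the role's code, the package names are hypothetical placeholders, and plain set logic approximates the ansible.builtin.subset test that the log shows being loaded from mathstuff.py.

```python
# Simplified stand-in for the skip decision: is anything in network_packages
# missing from ansible_facts.packages? (Names below are hypothetical.)
requested = ["NetworkManager"]                        # plays the role of network_packages
installed = {"NetworkManager": [{"version": "1.0"}]}  # plays the role of ansible_facts.packages

needs_install = not set(requested).issubset(installed.keys())
print(needs_install)  # False -> the task is skipped, matching the log above
```

When the expression is False, the package action module is never sent to managed-node2, which is why the log records only a skipping: result for this task.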
44071 1727204637.98079: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204637.98218: in run() - task 127b8e07-fff9-c964-7471-000000000d1f 44071 1727204637.98237: variable 'ansible_search_path' from source: unknown 44071 1727204637.98246: variable 'ansible_search_path' from source: unknown 44071 1727204637.98270: calling self._execute() 44071 1727204637.98364: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204637.98372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204637.98381: variable 'omit' from source: magic vars 44071 1727204637.98722: variable 'ansible_distribution_major_version' from source: facts 44071 1727204637.98733: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204637.98833: variable 'network_state' from source: role '' defaults 44071 1727204637.98844: Evaluated conditional (network_state != {}): False 44071 1727204637.98848: when evaluation is False, skipping this task 44071 1727204637.98850: _execute() done 44071 1727204637.98853: dumping result to json 44071 1727204637.98856: done dumping result, returning 44071 1727204637.98863: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-c964-7471-000000000d1f] 44071 1727204637.98868: sending task result for task 127b8e07-fff9-c964-7471-000000000d1f 44071 1727204637.98986: done sending task result for task 127b8e07-fff9-c964-7471-000000000d1f 44071 1727204637.98990: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204637.99046: no more pending results, returning what we have 44071 1727204637.99050: results queue empty 44071 1727204637.99051: checking for any_errors_fatal 44071 1727204637.99058: done checking for any_errors_fatal 44071 1727204637.99059: checking for max_fail_percentage 44071 1727204637.99060: done checking for max_fail_percentage 44071 1727204637.99061: checking to see if all hosts have failed and the running result is not ok 44071 1727204637.99062: done checking to see if all hosts have failed 44071 1727204637.99063: getting the remaining hosts for this loop 44071 1727204637.99067: done getting the remaining hosts for this loop 44071 1727204637.99072: getting the next task for host managed-node2 44071 1727204637.99081: done getting next task for host managed-node2 44071 1727204637.99086: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204637.99092: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204637.99114: getting variables 44071 1727204637.99115: in VariableManager get_vars() 44071 1727204637.99156: Calling all_inventory to load vars for managed-node2 44071 1727204637.99159: Calling groups_inventory to load vars for managed-node2 44071 1727204637.99161: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204637.99178: Calling all_plugins_play to load vars for managed-node2 44071 1727204637.99181: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204637.99184: Calling groups_plugins_play to load vars for managed-node2 44071 1727204638.00476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204638.02179: done with get_vars() 44071 1727204638.02211: done getting variables 44071 1727204638.02264: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.047) 0:00:50.339 ***** 44071 1727204638.02296: entering _queue_task() for managed-node2/package 44071 1727204638.02597: worker is 1 (out of 1 available) 44071 1727204638.02615: exiting _queue_task() for managed-node2/package 44071 1727204638.02629: done queuing things up, now waiting for results queue to drain 44071 1727204638.02631: waiting for pending results... 
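The "Install NetworkManager and nmstate when using network_state variable" task above is skipped because the conditional (network_state != {}) evaluated False: network_state comes from the role defaults and is still an empty mapping in this run. The sketch below is an assumption-level illustration rather than Ansible's actual evaluation path; it just shows the same Jinja2 expression producing that result.

```python
# Evaluate the logged when: expression the way Jinja2 would, with the role
# default value of network_state (an empty dict in this run) plugged in.
from jinja2 import Environment

when_expr = Environment().compile_expression("network_state != {}")
print(when_expr(network_state={}))  # False -> task skipped, matching the log
```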
44071 1727204638.02842: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204638.02971: in run() - task 127b8e07-fff9-c964-7471-000000000d20 44071 1727204638.02987: variable 'ansible_search_path' from source: unknown 44071 1727204638.02991: variable 'ansible_search_path' from source: unknown 44071 1727204638.03024: calling self._execute() 44071 1727204638.03115: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204638.03122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204638.03132: variable 'omit' from source: magic vars 44071 1727204638.03543: variable 'ansible_distribution_major_version' from source: facts 44071 1727204638.03548: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204638.03775: variable 'network_state' from source: role '' defaults 44071 1727204638.03986: Evaluated conditional (network_state != {}): False 44071 1727204638.03990: when evaluation is False, skipping this task 44071 1727204638.03994: _execute() done 44071 1727204638.03996: dumping result to json 44071 1727204638.04000: done dumping result, returning 44071 1727204638.04003: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-c964-7471-000000000d20] 44071 1727204638.04006: sending task result for task 127b8e07-fff9-c964-7471-000000000d20 44071 1727204638.04161: done sending task result for task 127b8e07-fff9-c964-7471-000000000d20 44071 1727204638.04163: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204638.04314: no more pending results, returning what we have 44071 1727204638.04318: results queue empty 44071 1727204638.04319: checking for any_errors_fatal 44071 1727204638.04324: done checking for any_errors_fatal 44071 1727204638.04325: checking for max_fail_percentage 44071 1727204638.04327: done checking for max_fail_percentage 44071 1727204638.04328: checking to see if all hosts have failed and the running result is not ok 44071 1727204638.04329: done checking to see if all hosts have failed 44071 1727204638.04329: getting the remaining hosts for this loop 44071 1727204638.04331: done getting the remaining hosts for this loop 44071 1727204638.04335: getting the next task for host managed-node2 44071 1727204638.04344: done getting next task for host managed-node2 44071 1727204638.04348: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204638.04354: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204638.04600: getting variables 44071 1727204638.04602: in VariableManager get_vars() 44071 1727204638.04641: Calling all_inventory to load vars for managed-node2 44071 1727204638.04644: Calling groups_inventory to load vars for managed-node2 44071 1727204638.04647: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204638.04657: Calling all_plugins_play to load vars for managed-node2 44071 1727204638.04660: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204638.04663: Calling groups_plugins_play to load vars for managed-node2 44071 1727204638.06671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204638.08977: done with get_vars() 44071 1727204638.09028: done getting variables 44071 1727204638.09103: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.068) 0:00:50.407 ***** 44071 1727204638.09154: entering _queue_task() for managed-node2/service 44071 1727204638.09787: worker is 1 (out of 1 available) 44071 1727204638.09802: exiting _queue_task() for managed-node2/service 44071 1727204638.09814: done queuing things up, now waiting for results queue to drain 44071 1727204638.09816: waiting for pending results... 
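The task just queued, "Restart NetworkManager due to wireless or team interfaces" (roles/network/tasks/main.yml:109), is gated on (__network_wireless_connections_defined or __network_team_connections_defined), the same pair of flags already evaluated for the consent prompt earlier in this log. A rough approximation of how such flags can be derived from network_connections is sketched below; the connection entry is a hypothetical placeholder, and the role's real definitions live in its defaults, which are not shown in this log.

```python
# Approximate derivation of the wireless/team flags from network_connections.
# The single entry here is a hypothetical placeholder, not the play's data.
network_connections = [
    {"name": "example-conn", "type": "ethernet"},
]

wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
team_defined = any(c.get("type") == "team" for c in network_connections)

print(wireless_defined or team_defined)  # False -> the restart task is skipped
```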
44071 1727204638.10061: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204638.10151: in run() - task 127b8e07-fff9-c964-7471-000000000d21 44071 1727204638.10181: variable 'ansible_search_path' from source: unknown 44071 1727204638.10190: variable 'ansible_search_path' from source: unknown 44071 1727204638.10238: calling self._execute() 44071 1727204638.10361: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204638.10386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204638.10471: variable 'omit' from source: magic vars 44071 1727204638.10880: variable 'ansible_distribution_major_version' from source: facts 44071 1727204638.10901: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204638.11068: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204638.11296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204638.14103: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204638.14201: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204638.14244: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204638.14291: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204638.14327: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204638.14471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204638.14476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204638.14513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204638.14567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204638.14588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204638.14671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204638.14682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204638.14710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 44071 1727204638.14761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204638.14837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204638.14840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204638.14862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204638.14946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204638.14950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204638.14954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204638.15164: variable 'network_connections' from source: include params 44071 1727204638.15167: variable 'interface' from source: play vars 44071 1727204638.15243: variable 'interface' from source: play vars 44071 1727204638.15330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204638.15572: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204638.15597: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204638.15640: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204638.15682: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204638.15746: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204638.15786: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204638.15901: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204638.15905: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204638.15963: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204638.16297: variable 'network_connections' from source: include params 44071 1727204638.16309: variable 'interface' 
from source: play vars 44071 1727204638.16398: variable 'interface' from source: play vars 44071 1727204638.16432: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204638.16473: when evaluation is False, skipping this task 44071 1727204638.16479: _execute() done 44071 1727204638.16485: dumping result to json 44071 1727204638.16488: done dumping result, returning 44071 1727204638.16490: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000000d21] 44071 1727204638.16492: sending task result for task 127b8e07-fff9-c964-7471-000000000d21 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204638.16783: no more pending results, returning what we have 44071 1727204638.16788: results queue empty 44071 1727204638.16790: checking for any_errors_fatal 44071 1727204638.16799: done checking for any_errors_fatal 44071 1727204638.16800: checking for max_fail_percentage 44071 1727204638.16801: done checking for max_fail_percentage 44071 1727204638.16803: checking to see if all hosts have failed and the running result is not ok 44071 1727204638.16803: done checking to see if all hosts have failed 44071 1727204638.16804: getting the remaining hosts for this loop 44071 1727204638.16806: done getting the remaining hosts for this loop 44071 1727204638.16811: getting the next task for host managed-node2 44071 1727204638.16821: done getting next task for host managed-node2 44071 1727204638.16827: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204638.16833: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204638.16858: getting variables 44071 1727204638.16861: in VariableManager get_vars() 44071 1727204638.17027: Calling all_inventory to load vars for managed-node2 44071 1727204638.17030: Calling groups_inventory to load vars for managed-node2 44071 1727204638.17033: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204638.17104: Calling all_plugins_play to load vars for managed-node2 44071 1727204638.17109: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204638.17113: Calling groups_plugins_play to load vars for managed-node2 44071 1727204638.17687: done sending task result for task 127b8e07-fff9-c964-7471-000000000d21 44071 1727204638.17693: WORKER PROCESS EXITING 44071 1727204638.19949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204638.24750: done with get_vars() 44071 1727204638.24799: done getting variables 44071 1727204638.24886: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.157) 0:00:50.565 ***** 44071 1727204638.24925: entering _queue_task() for managed-node2/service 44071 1727204638.25355: worker is 1 (out of 1 available) 44071 1727204638.25509: exiting _queue_task() for managed-node2/service 44071 1727204638.25521: done queuing things up, now waiting for results queue to drain 44071 1727204638.25523: waiting for pending results... 
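Unlike the tasks above, "Enable and start NetworkManager" (roles/network/tasks/main.yml:122) goes on to run: the entries that follow show its conditional (network_provider == "nm" or network_state != {}) evaluating True, the SSH connection variables being set, and the worker's first probe, /bin/sh -c 'echo ~ && sleep 0', which discovers the remote user's home directory before any module is copied over. The sketch below runs that probe command locally as a stand-in; the real run executes it over SSH against managed-node2.

```python
# Local stand-in for Ansible's first remote probe: 'echo ~' prints the shell's
# home directory and 'sleep 0' is a no-op that leaves the exit status at 0.
import subprocess

probe = subprocess.run(
    ["/bin/sh", "-c", "echo ~ && sleep 0"],
    capture_output=True,
    text=True,
    check=True,
)
print(probe.stdout.strip())  # e.g. /root for a root login, as in this run
```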
44071 1727204638.25726: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204638.25911: in run() - task 127b8e07-fff9-c964-7471-000000000d22 44071 1727204638.25941: variable 'ansible_search_path' from source: unknown 44071 1727204638.25953: variable 'ansible_search_path' from source: unknown 44071 1727204638.26008: calling self._execute() 44071 1727204638.26133: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204638.26145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204638.26165: variable 'omit' from source: magic vars 44071 1727204638.26635: variable 'ansible_distribution_major_version' from source: facts 44071 1727204638.26655: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204638.26928: variable 'network_provider' from source: set_fact 44071 1727204638.26934: variable 'network_state' from source: role '' defaults 44071 1727204638.26936: Evaluated conditional (network_provider == "nm" or network_state != {}): True 44071 1727204638.26939: variable 'omit' from source: magic vars 44071 1727204638.26983: variable 'omit' from source: magic vars 44071 1727204638.27016: variable 'network_service_name' from source: role '' defaults 44071 1727204638.27103: variable 'network_service_name' from source: role '' defaults 44071 1727204638.27230: variable '__network_provider_setup' from source: role '' defaults 44071 1727204638.27242: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204638.27322: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204638.27336: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204638.27415: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204638.27692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204638.30642: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204638.30731: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204638.30858: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204638.30862: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204638.30868: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204638.30963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204638.31007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204638.31042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204638.31173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 44071 1727204638.31178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204638.31181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204638.31215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204638.31247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204638.31303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204638.31371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204638.31601: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204638.31756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204638.31791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204638.31822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204638.31881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204638.31901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204638.32012: variable 'ansible_python' from source: facts 44071 1727204638.32061: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204638.32141: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204638.32238: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204638.32403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204638.32472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204638.32475: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204638.32525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204638.32546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204638.32618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204638.32672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204638.32692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204638.32773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204638.32777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204638.32952: variable 'network_connections' from source: include params 44071 1727204638.32968: variable 'interface' from source: play vars 44071 1727204638.33064: variable 'interface' from source: play vars 44071 1727204638.33206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204638.33462: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204638.33531: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204638.33596: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204638.33646: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204638.33730: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204638.33770: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204638.33818: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204638.33857: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204638.33931: variable '__network_wireless_connections_defined' from source: 
role '' defaults 44071 1727204638.34343: variable 'network_connections' from source: include params 44071 1727204638.34349: variable 'interface' from source: play vars 44071 1727204638.34384: variable 'interface' from source: play vars 44071 1727204638.34427: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204638.34530: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204638.34889: variable 'network_connections' from source: include params 44071 1727204638.34907: variable 'interface' from source: play vars 44071 1727204638.34990: variable 'interface' from source: play vars 44071 1727204638.35171: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204638.35174: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204638.35456: variable 'network_connections' from source: include params 44071 1727204638.35470: variable 'interface' from source: play vars 44071 1727204638.35556: variable 'interface' from source: play vars 44071 1727204638.35630: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204638.35700: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204638.35712: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204638.35785: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204638.36051: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204638.36635: variable 'network_connections' from source: include params 44071 1727204638.36648: variable 'interface' from source: play vars 44071 1727204638.36728: variable 'interface' from source: play vars 44071 1727204638.36741: variable 'ansible_distribution' from source: facts 44071 1727204638.36749: variable '__network_rh_distros' from source: role '' defaults 44071 1727204638.36759: variable 'ansible_distribution_major_version' from source: facts 44071 1727204638.36781: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204638.37035: variable 'ansible_distribution' from source: facts 44071 1727204638.37040: variable '__network_rh_distros' from source: role '' defaults 44071 1727204638.37043: variable 'ansible_distribution_major_version' from source: facts 44071 1727204638.37045: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204638.37252: variable 'ansible_distribution' from source: facts 44071 1727204638.37256: variable '__network_rh_distros' from source: role '' defaults 44071 1727204638.37258: variable 'ansible_distribution_major_version' from source: facts 44071 1727204638.37289: variable 'network_provider' from source: set_fact 44071 1727204638.37317: variable 'omit' from source: magic vars 44071 1727204638.37368: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204638.37401: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204638.37471: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204638.37492: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204638.37494: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204638.37570: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204638.37573: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204638.37575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204638.37656: Set connection var ansible_connection to ssh 44071 1727204638.37692: Set connection var ansible_timeout to 10 44071 1727204638.37695: Set connection var ansible_pipelining to False 44071 1727204638.37697: Set connection var ansible_shell_type to sh 44071 1727204638.37701: Set connection var ansible_shell_executable to /bin/sh 44071 1727204638.37771: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204638.37775: variable 'ansible_shell_executable' from source: unknown 44071 1727204638.37777: variable 'ansible_connection' from source: unknown 44071 1727204638.37779: variable 'ansible_module_compression' from source: unknown 44071 1727204638.37781: variable 'ansible_shell_type' from source: unknown 44071 1727204638.37783: variable 'ansible_shell_executable' from source: unknown 44071 1727204638.37785: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204638.37787: variable 'ansible_pipelining' from source: unknown 44071 1727204638.37789: variable 'ansible_timeout' from source: unknown 44071 1727204638.37800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204638.37931: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204638.37958: variable 'omit' from source: magic vars 44071 1727204638.37971: starting attempt loop 44071 1727204638.38019: running the handler 44071 1727204638.38094: variable 'ansible_facts' from source: unknown 44071 1727204638.39120: _low_level_execute_command(): starting 44071 1727204638.39140: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204638.40002: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204638.40057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204638.40092: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204638.40096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 44071 1727204638.40219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204638.42013: stdout chunk (state=3): >>>/root <<< 44071 1727204638.42217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204638.42222: stdout chunk (state=3): >>><<< 44071 1727204638.42224: stderr chunk (state=3): >>><<< 44071 1727204638.42359: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204638.42363: _low_level_execute_command(): starting 44071 1727204638.42368: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204638.4225416-47042-37634942333930 `" && echo ansible-tmp-1727204638.4225416-47042-37634942333930="` echo /root/.ansible/tmp/ansible-tmp-1727204638.4225416-47042-37634942333930 `" ) && sleep 0' 44071 1727204638.42985: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204638.42995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204638.43006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204638.43030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204638.43037: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204638.43040: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204638.43054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204638.43070: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204638.43204: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204638.43208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204638.43282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204638.45254: stdout chunk (state=3): >>>ansible-tmp-1727204638.4225416-47042-37634942333930=/root/.ansible/tmp/ansible-tmp-1727204638.4225416-47042-37634942333930 <<< 44071 1727204638.45384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204638.45439: stderr chunk (state=3): >>><<< 44071 1727204638.45445: stdout chunk (state=3): >>><<< 44071 1727204638.45463: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204638.4225416-47042-37634942333930=/root/.ansible/tmp/ansible-tmp-1727204638.4225416-47042-37634942333930 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204638.45501: variable 'ansible_module_compression' from source: unknown 44071 1727204638.45544: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 44071 1727204638.45605: variable 'ansible_facts' from source: unknown 44071 1727204638.45746: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204638.4225416-47042-37634942333930/AnsiballZ_systemd.py 44071 1727204638.45876: Sending initial data 44071 1727204638.45880: Sent initial data (155 bytes) 44071 1727204638.46430: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204638.46441: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204638.46516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204638.48111: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204638.48180: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204638.48242: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmphs1zxgxb /root/.ansible/tmp/ansible-tmp-1727204638.4225416-47042-37634942333930/AnsiballZ_systemd.py <<< 44071 1727204638.48250: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204638.4225416-47042-37634942333930/AnsiballZ_systemd.py" <<< 44071 1727204638.48310: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmphs1zxgxb" to remote "/root/.ansible/tmp/ansible-tmp-1727204638.4225416-47042-37634942333930/AnsiballZ_systemd.py" <<< 44071 1727204638.48317: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204638.4225416-47042-37634942333930/AnsiballZ_systemd.py" <<< 44071 1727204638.49606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204638.49806: stderr chunk (state=3): >>><<< 44071 1727204638.49810: stdout chunk (state=3): >>><<< 44071 1727204638.49813: done transferring module to remote 44071 1727204638.49815: _low_level_execute_command(): starting 44071 1727204638.49817: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204638.4225416-47042-37634942333930/ /root/.ansible/tmp/ansible-tmp-1727204638.4225416-47042-37634942333930/AnsiballZ_systemd.py && sleep 0' 44071 1727204638.50474: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204638.50478: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204638.50481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204638.50483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204638.50485: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204638.50540: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204638.50560: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204638.50584: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204638.50603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204638.50700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204638.52525: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204638.52587: stderr chunk (state=3): >>><<< 44071 1727204638.52593: stdout chunk (state=3): >>><<< 44071 1727204638.52604: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204638.52608: _low_level_execute_command(): starting 44071 1727204638.52613: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204638.4225416-47042-37634942333930/AnsiballZ_systemd.py && sleep 0' 44071 1727204638.53300: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204638.53304: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204638.53307: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204638.53357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204638.53431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204638.85098: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4587520", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3522220032", "CPUUsageNSec": "1514194000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", 
"DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitC<<< 44071 1727204638.85136: stdout chunk (state=3): >>>ORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, 
"daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 44071 1727204638.87064: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204638.87072: stdout chunk (state=3): >>><<< 44071 1727204638.87074: stderr chunk (state=3): >>><<< 44071 1727204638.87275: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4587520", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3522220032", "CPUUsageNSec": "1514194000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": 
"0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, 
"daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204638.87358: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204638.4225416-47042-37634942333930/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204638.87387: _low_level_execute_command(): starting 44071 1727204638.87404: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204638.4225416-47042-37634942333930/ > /dev/null 2>&1 && sleep 0' 44071 1727204638.88182: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204638.88236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204638.88260: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 44071 1727204638.88291: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204638.88402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204638.90398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204638.90410: stdout chunk (state=3): >>><<< 44071 1727204638.90424: stderr chunk (state=3): >>><<< 44071 1727204638.90448: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204638.90571: handler run complete 44071 1727204638.90575: attempt loop complete, returning result 44071 1727204638.90577: _execute() done 44071 1727204638.90580: dumping result to json 44071 1727204638.90582: done dumping result, returning 44071 1727204638.90593: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-c964-7471-000000000d22] 44071 1727204638.90603: sending task result for task 127b8e07-fff9-c964-7471-000000000d22 44071 1727204638.91191: done sending task result for task 127b8e07-fff9-c964-7471-000000000d22 44071 1727204638.91195: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204638.91263: no more pending results, returning what we have 44071 1727204638.91268: results queue empty 44071 1727204638.91269: checking for any_errors_fatal 44071 1727204638.91274: done checking for any_errors_fatal 44071 1727204638.91274: checking for max_fail_percentage 44071 1727204638.91276: done checking for max_fail_percentage 44071 1727204638.91277: checking to see if all hosts have failed and the running result is not ok 44071 1727204638.91277: done checking to see if all hosts have failed 44071 1727204638.91278: getting the remaining hosts for this loop 44071 1727204638.91280: done getting the remaining hosts for this loop 44071 1727204638.91283: getting the next task for host managed-node2 44071 1727204638.91290: done getting next task for host managed-node2 44071 1727204638.91294: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204638.91299: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204638.91310: getting variables 44071 1727204638.91312: in VariableManager get_vars() 44071 1727204638.91342: Calling all_inventory to load vars for managed-node2 44071 1727204638.91345: Calling groups_inventory to load vars for managed-node2 44071 1727204638.91347: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204638.91435: Calling all_plugins_play to load vars for managed-node2 44071 1727204638.91440: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204638.91447: Calling groups_plugins_play to load vars for managed-node2 44071 1727204638.93308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204638.94547: done with get_vars() 44071 1727204638.94582: done getting variables 44071 1727204638.94632: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.697) 0:00:51.263 ***** 44071 1727204638.94671: entering _queue_task() for managed-node2/service 44071 1727204638.94975: worker is 1 (out of 1 available) 44071 1727204638.94993: exiting _queue_task() for managed-node2/service 44071 1727204638.95008: done queuing things up, now waiting for results queue to drain 44071 1727204638.95010: waiting for pending results... 
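The entries above record the service action plugin building the ansible.legacy.systemd module (cached AnsiballZ payload), copying it over the multiplexed SSH connection to managed-node2, executing it with module_args name=NetworkManager, state=started, enabled=true, and then reporting the registered result as censored because no_log was set for the task. A minimal task sketch that would produce this same invocation and censored result follows; the exact module keyword and file placement are assumptions for illustration, not the role's actual tasks/main.yml source.

    # Illustrative sketch only -- not the role's actual source.
    # The parameters mirror the module_args recorded in the log above;
    # on this host the service action is dispatched to the systemd module,
    # and no_log makes the registered result show up as "censored".
    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: NetworkManager
        state: started
        enabled: true
      no_log: true
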
44071 1727204638.95254: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204638.95398: in run() - task 127b8e07-fff9-c964-7471-000000000d23 44071 1727204638.95415: variable 'ansible_search_path' from source: unknown 44071 1727204638.95419: variable 'ansible_search_path' from source: unknown 44071 1727204638.95465: calling self._execute() 44071 1727204638.95599: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204638.95603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204638.95608: variable 'omit' from source: magic vars 44071 1727204638.96025: variable 'ansible_distribution_major_version' from source: facts 44071 1727204638.96038: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204638.96164: variable 'network_provider' from source: set_fact 44071 1727204638.96169: Evaluated conditional (network_provider == "nm"): True 44071 1727204638.96262: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204638.96385: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204638.96539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204638.98238: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204638.98292: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204638.98321: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204638.98353: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204638.98376: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204638.98592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204638.98616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204638.98634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204638.98669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204638.98679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204638.98719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204638.98737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 44071 1727204638.98755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204638.98786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204638.98801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204638.98830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204638.98879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204638.98883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204638.98916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204638.98927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204638.99119: variable 'network_connections' from source: include params 44071 1727204638.99122: variable 'interface' from source: play vars 44071 1727204638.99283: variable 'interface' from source: play vars 44071 1727204638.99286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204638.99439: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204638.99481: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204638.99515: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204638.99547: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204638.99595: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204638.99618: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204638.99647: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204638.99673: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
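The burst of FilterModule and TestModule loads above happens while the conditions attached to the wpa_supplicant task are being templated: resolving network_connections (whose definition in this play references the interface play variable) pulls in the builtin Jinja2 filter and test plugins (core, encryption, mathstuff, urls, urlsplit, files, uri). A minimal sketch of a templated role default whose evaluation would trigger this kind of plugin loading follows; the variable name and expression are assumptions for illustration, not the role's actual defaults.

    # Illustrative sketch only: evaluating a default of this shape forces
    # the Jinja2 filter/test plugins logged above to load and resolves
    # network_connections (and, through it, interface) from the play.
    __example_wireless_connections_defined: "{{ network_connections
      | selectattr('type', 'defined')
      | selectattr('type', 'equalto', 'wireless')
      | list | length > 0 }}"
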
44071 1727204638.99753: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204639.00049: variable 'network_connections' from source: include params 44071 1727204639.00073: variable 'interface' from source: play vars 44071 1727204639.00150: variable 'interface' from source: play vars 44071 1727204639.00200: Evaluated conditional (__network_wpa_supplicant_required): False 44071 1727204639.00208: when evaluation is False, skipping this task 44071 1727204639.00215: _execute() done 44071 1727204639.00222: dumping result to json 44071 1727204639.00229: done dumping result, returning 44071 1727204639.00271: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-c964-7471-000000000d23] 44071 1727204639.00295: sending task result for task 127b8e07-fff9-c964-7471-000000000d23 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 44071 1727204639.00463: no more pending results, returning what we have 44071 1727204639.00475: results queue empty 44071 1727204639.00477: checking for any_errors_fatal 44071 1727204639.00511: done checking for any_errors_fatal 44071 1727204639.00512: checking for max_fail_percentage 44071 1727204639.00514: done checking for max_fail_percentage 44071 1727204639.00515: checking to see if all hosts have failed and the running result is not ok 44071 1727204639.00516: done checking to see if all hosts have failed 44071 1727204639.00516: getting the remaining hosts for this loop 44071 1727204639.00518: done getting the remaining hosts for this loop 44071 1727204639.00523: getting the next task for host managed-node2 44071 1727204639.00534: done getting next task for host managed-node2 44071 1727204639.00538: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204639.00543: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204639.00564: getting variables 44071 1727204639.00573: in VariableManager get_vars() 44071 1727204639.00618: Calling all_inventory to load vars for managed-node2 44071 1727204639.00620: Calling groups_inventory to load vars for managed-node2 44071 1727204639.00622: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204639.00635: Calling all_plugins_play to load vars for managed-node2 44071 1727204639.00638: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204639.00640: Calling groups_plugins_play to load vars for managed-node2 44071 1727204639.01184: done sending task result for task 127b8e07-fff9-c964-7471-000000000d23 44071 1727204639.01189: WORKER PROCESS EXITING 44071 1727204639.01960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204639.03558: done with get_vars() 44071 1727204639.03591: done getting variables 44071 1727204639.03653: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:03:59 -0400 (0:00:00.090) 0:00:51.353 ***** 44071 1727204639.03684: entering _queue_task() for managed-node2/service 44071 1727204639.04082: worker is 1 (out of 1 available) 44071 1727204639.04100: exiting _queue_task() for managed-node2/service 44071 1727204639.04114: done queuing things up, now waiting for results queue to drain 44071 1727204639.04116: waiting for pending results... 
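Editor's note: the deeply nested "HOST STATE: block=..., task=..., tasks child state? (HOST STATE: ...)" dump above is the play iterator's position for managed-node2: each level of nesting is one enclosing block (play -> role include -> block -> role tasks file), and the innermost task counter (18 here, then 19, 20, ... in the later dumps) advances as the role's tasks/main.yml progresses. The sketch below is only a rough, illustrative model of that structure, not Ansible's actual host-state class; field names were chosen to mirror the dump.

# Rough, illustrative model of the nested "HOST STATE" dumps above; the real
# implementation lives in Ansible's play iterator and differs in detail.
from dataclasses import dataclass
from typing import Optional

@dataclass
class HostState:
    block: int = 0
    task: int = 0
    rescue: int = 0
    always: int = 0
    run_state: int = 1
    fail_state: int = 0
    tasks_child: Optional["HostState"] = None   # position inside a nested block's task list
    rescue_child: Optional["HostState"] = None
    always_child: Optional["HostState"] = None

    def dump(self) -> str:
        child = self.tasks_child.dump() if self.tasks_child else "None"
        return (f"block={self.block}, task={self.task}, rescue={self.rescue}, "
                f"always={self.always}, tasks child state? ({child})")

# The four nested levels in the dump above (block=5/task=2 -> 0/8 -> 0/2 -> 0/18):
state = HostState(5, 2, tasks_child=HostState(0, 8,
        tasks_child=HostState(0, 2, tasks_child=HostState(0, 18))))
print(state.dump())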
44071 1727204639.04489: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204639.04631: in run() - task 127b8e07-fff9-c964-7471-000000000d24 44071 1727204639.04636: variable 'ansible_search_path' from source: unknown 44071 1727204639.04639: variable 'ansible_search_path' from source: unknown 44071 1727204639.04699: calling self._execute() 44071 1727204639.04803: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204639.04811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204639.04828: variable 'omit' from source: magic vars 44071 1727204639.05230: variable 'ansible_distribution_major_version' from source: facts 44071 1727204639.05241: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204639.05339: variable 'network_provider' from source: set_fact 44071 1727204639.05346: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204639.05349: when evaluation is False, skipping this task 44071 1727204639.05352: _execute() done 44071 1727204639.05355: dumping result to json 44071 1727204639.05358: done dumping result, returning 44071 1727204639.05366: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-c964-7471-000000000d24] 44071 1727204639.05371: sending task result for task 127b8e07-fff9-c964-7471-000000000d24 44071 1727204639.05480: done sending task result for task 127b8e07-fff9-c964-7471-000000000d24 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204639.05528: no more pending results, returning what we have 44071 1727204639.05533: results queue empty 44071 1727204639.05534: checking for any_errors_fatal 44071 1727204639.05547: done checking for any_errors_fatal 44071 1727204639.05547: checking for max_fail_percentage 44071 1727204639.05549: done checking for max_fail_percentage 44071 1727204639.05550: checking to see if all hosts have failed and the running result is not ok 44071 1727204639.05551: done checking to see if all hosts have failed 44071 1727204639.05551: getting the remaining hosts for this loop 44071 1727204639.05553: done getting the remaining hosts for this loop 44071 1727204639.05558: getting the next task for host managed-node2 44071 1727204639.05568: done getting next task for host managed-node2 44071 1727204639.05572: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204639.05578: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204639.05600: getting variables 44071 1727204639.05601: in VariableManager get_vars() 44071 1727204639.05640: Calling all_inventory to load vars for managed-node2 44071 1727204639.05645: Calling groups_inventory to load vars for managed-node2 44071 1727204639.05648: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204639.05658: Calling all_plugins_play to load vars for managed-node2 44071 1727204639.05661: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204639.05663: Calling groups_plugins_play to load vars for managed-node2 44071 1727204639.05680: WORKER PROCESS EXITING 44071 1727204639.06710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204639.08605: done with get_vars() 44071 1727204639.08630: done getting variables 44071 1727204639.08692: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:03:59 -0400 (0:00:00.050) 0:00:51.403 ***** 44071 1727204639.08722: entering _queue_task() for managed-node2/copy 44071 1727204639.09023: worker is 1 (out of 1 available) 44071 1727204639.09039: exiting _queue_task() for managed-node2/copy 44071 1727204639.09055: done queuing things up, now waiting for results queue to drain 44071 1727204639.09057: waiting for pending results... 
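Editor's note: the "Enable network service" task above shows the usual skip pattern: the when-conditions are evaluated in order, ansible_distribution_major_version != '6' comes back True, network_provider == "initscripts" comes back False, and the task is skipped; because the task carries no_log, the callback prints the "censored" placeholder instead of the skip details. The sketch below imitates only the evaluation pattern, not Ansible's internals; the variable values are assumed for illustration, and real Ansible templates the expressions with Jinja2 rather than eval().

# Illustrative only: mimics the "Evaluated conditional (...): True/False" flow above.
def evaluate_when(conditions, variables):
    """Return a skip-result dict if any condition is false, else None (task may run)."""
    for expr in conditions:
        result = bool(eval(expr, {}, dict(variables)))  # Ansible uses Jinja2 templating, not eval()
        print(f"Evaluated conditional ({expr}): {result}")
        if not result:
            return {
                "changed": False,
                "false_condition": expr,
                "skip_reason": "Conditional result was False",
            }
    return None

# Assumed values: the log does not show the distribution version itself.
variables = {"ansible_distribution_major_version": "40", "network_provider": "nm"}
skip = evaluate_when(
    ["ansible_distribution_major_version != '6'", 'network_provider == "initscripts"'],
    variables,
)
print(skip)

The first condition that evaluates to False becomes the false_condition field in the skip result, which is exactly what the non-censored skips in this log (e.g. the initscripts tasks) report.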
44071 1727204639.09264: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204639.09376: in run() - task 127b8e07-fff9-c964-7471-000000000d25 44071 1727204639.09391: variable 'ansible_search_path' from source: unknown 44071 1727204639.09394: variable 'ansible_search_path' from source: unknown 44071 1727204639.09432: calling self._execute() 44071 1727204639.09522: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204639.09528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204639.09538: variable 'omit' from source: magic vars 44071 1727204639.09859: variable 'ansible_distribution_major_version' from source: facts 44071 1727204639.09871: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204639.09962: variable 'network_provider' from source: set_fact 44071 1727204639.09968: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204639.09972: when evaluation is False, skipping this task 44071 1727204639.09975: _execute() done 44071 1727204639.09978: dumping result to json 44071 1727204639.09980: done dumping result, returning 44071 1727204639.09989: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-c964-7471-000000000d25] 44071 1727204639.09994: sending task result for task 127b8e07-fff9-c964-7471-000000000d25 skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 44071 1727204639.10164: no more pending results, returning what we have 44071 1727204639.10170: results queue empty 44071 1727204639.10171: checking for any_errors_fatal 44071 1727204639.10183: done checking for any_errors_fatal 44071 1727204639.10184: checking for max_fail_percentage 44071 1727204639.10185: done checking for max_fail_percentage 44071 1727204639.10186: checking to see if all hosts have failed and the running result is not ok 44071 1727204639.10187: done checking to see if all hosts have failed 44071 1727204639.10188: getting the remaining hosts for this loop 44071 1727204639.10189: done getting the remaining hosts for this loop 44071 1727204639.10194: getting the next task for host managed-node2 44071 1727204639.10203: done getting next task for host managed-node2 44071 1727204639.10207: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204639.10213: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204639.10236: getting variables 44071 1727204639.10237: in VariableManager get_vars() 44071 1727204639.10284: Calling all_inventory to load vars for managed-node2 44071 1727204639.10287: Calling groups_inventory to load vars for managed-node2 44071 1727204639.10289: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204639.10295: done sending task result for task 127b8e07-fff9-c964-7471-000000000d25 44071 1727204639.10297: WORKER PROCESS EXITING 44071 1727204639.10307: Calling all_plugins_play to load vars for managed-node2 44071 1727204639.10310: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204639.10312: Calling groups_plugins_play to load vars for managed-node2 44071 1727204639.11369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204639.12590: done with get_vars() 44071 1727204639.12620: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:03:59 -0400 (0:00:00.039) 0:00:51.443 ***** 44071 1727204639.12704: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204639.13002: worker is 1 (out of 1 available) 44071 1727204639.13018: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204639.13033: done queuing things up, now waiting for results queue to drain 44071 1727204639.13035: waiting for pending results... 
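Editor's note: every "in VariableManager get_vars()" block above walks the same sequence of sources for managed-node2 (all_inventory, groups_inventory, all_plugins_inventory, all_plugins_play, groups_plugins_inventory, groups_plugins_play) before task vars, play vars, include params, facts and set_fact values are layered on top; the "variable '...' from source: ..." lines then record which layer supplied the winning value. The sketch below shows that later-wins merge with the layer contents reconstructed (and simplified) from the entries in this log.

# Rough sketch of "later source wins" variable merging, mirroring the
# "variable 'network_provider' from source: set_fact" style entries above.
# Layer names follow the log; the contents are simplified reconstructions.
layers = [
    ("role '' defaults", {"network_state": {}, "network_provider": "nm"}),
    ("play vars", {"interface": "statebr"}),
    ("include params", {"network_connections": [{"name": "statebr", "state": "up"}]}),
    ("set_fact", {"network_provider": "nm"}),
]

merged, source_of = {}, {}
for source, values in layers:
    for name, value in values.items():
        merged[name] = value          # later layers override earlier ones
        source_of[name] = source      # remember which layer supplied the final value

for name in sorted(merged):
    print(f"variable '{name}' from source: {source_of[name]}")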
44071 1727204639.13250: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204639.13362: in run() - task 127b8e07-fff9-c964-7471-000000000d26 44071 1727204639.13381: variable 'ansible_search_path' from source: unknown 44071 1727204639.13384: variable 'ansible_search_path' from source: unknown 44071 1727204639.13419: calling self._execute() 44071 1727204639.13507: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204639.13514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204639.13523: variable 'omit' from source: magic vars 44071 1727204639.13854: variable 'ansible_distribution_major_version' from source: facts 44071 1727204639.13864: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204639.13872: variable 'omit' from source: magic vars 44071 1727204639.13929: variable 'omit' from source: magic vars 44071 1727204639.14070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204639.15784: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204639.15836: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204639.15870: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204639.15900: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204639.15922: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204639.15999: variable 'network_provider' from source: set_fact 44071 1727204639.16105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204639.16128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204639.16150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204639.16181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204639.16192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204639.16256: variable 'omit' from source: magic vars 44071 1727204639.16346: variable 'omit' from source: magic vars 44071 1727204639.16427: variable 'network_connections' from source: include params 44071 1727204639.16441: variable 'interface' from source: play vars 44071 1727204639.16488: variable 'interface' from source: play vars 44071 1727204639.16605: variable 'omit' from source: magic vars 44071 1727204639.16612: variable '__lsr_ansible_managed' from source: task vars 44071 1727204639.16662: variable '__lsr_ansible_managed' from source: 
task vars 44071 1727204639.17154: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 44071 1727204639.17315: Loaded config def from plugin (lookup/template) 44071 1727204639.17319: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 44071 1727204639.17342: File lookup term: get_ansible_managed.j2 44071 1727204639.17347: variable 'ansible_search_path' from source: unknown 44071 1727204639.17354: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 44071 1727204639.17367: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 44071 1727204639.17381: variable 'ansible_search_path' from source: unknown 44071 1727204639.21770: variable 'ansible_managed' from source: unknown 44071 1727204639.21901: variable 'omit' from source: magic vars 44071 1727204639.21926: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204639.21955: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204639.21972: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204639.21986: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204639.21995: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204639.22019: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204639.22023: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204639.22026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204639.22101: Set connection var ansible_connection to ssh 44071 1727204639.22107: Set connection var ansible_timeout to 10 44071 1727204639.22113: Set connection var ansible_pipelining to False 44071 1727204639.22118: Set connection var ansible_shell_type to sh 44071 1727204639.22123: Set connection var ansible_shell_executable to /bin/sh 44071 1727204639.22130: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204639.22151: variable 'ansible_shell_executable' from source: unknown 44071 1727204639.22154: variable 'ansible_connection' from source: unknown 44071 1727204639.22157: variable 'ansible_module_compression' 
from source: unknown 44071 1727204639.22159: variable 'ansible_shell_type' from source: unknown 44071 1727204639.22161: variable 'ansible_shell_executable' from source: unknown 44071 1727204639.22171: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204639.22175: variable 'ansible_pipelining' from source: unknown 44071 1727204639.22178: variable 'ansible_timeout' from source: unknown 44071 1727204639.22180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204639.22284: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204639.22304: variable 'omit' from source: magic vars 44071 1727204639.22307: starting attempt loop 44071 1727204639.22310: running the handler 44071 1727204639.22312: _low_level_execute_command(): starting 44071 1727204639.22320: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204639.22877: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204639.22881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204639.22884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204639.22920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204639.22936: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204639.23018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204639.24773: stdout chunk (state=3): >>>/root <<< 44071 1727204639.25009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204639.25014: stderr chunk (state=3): >>><<< 44071 1727204639.25016: stdout chunk (state=3): >>><<< 44071 1727204639.25037: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204639.25058: _low_level_execute_command(): starting 44071 1727204639.25086: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204639.2504423-47070-6250602089607 `" && echo ansible-tmp-1727204639.2504423-47070-6250602089607="` echo /root/.ansible/tmp/ansible-tmp-1727204639.2504423-47070-6250602089607 `" ) && sleep 0' 44071 1727204639.25776: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204639.25800: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204639.25819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204639.25848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204639.25987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204639.26027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204639.26063: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204639.26079: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204639.26193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204639.28355: stdout chunk (state=3): >>>ansible-tmp-1727204639.2504423-47070-6250602089607=/root/.ansible/tmp/ansible-tmp-1727204639.2504423-47070-6250602089607 <<< 44071 1727204639.28377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204639.28729: stderr chunk (state=3): >>><<< 44071 1727204639.28734: stdout chunk (state=3): >>><<< 44071 1727204639.28739: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204639.2504423-47070-6250602089607=/root/.ansible/tmp/ansible-tmp-1727204639.2504423-47070-6250602089607 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204639.28771: variable 'ansible_module_compression' from source: unknown 44071 1727204639.28943: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 44071 1727204639.28949: variable 'ansible_facts' from source: unknown 44071 1727204639.29080: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204639.2504423-47070-6250602089607/AnsiballZ_network_connections.py 44071 1727204639.29302: Sending initial data 44071 1727204639.29305: Sent initial data (166 bytes) 44071 1727204639.29818: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204639.29852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204639.29861: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204639.29864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204639.29900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204639.29912: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204639.29995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204639.31632: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" 
revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204639.31687: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204639.31762: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmphzcesims /root/.ansible/tmp/ansible-tmp-1727204639.2504423-47070-6250602089607/AnsiballZ_network_connections.py <<< 44071 1727204639.31767: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204639.2504423-47070-6250602089607/AnsiballZ_network_connections.py" <<< 44071 1727204639.31830: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmphzcesims" to remote "/root/.ansible/tmp/ansible-tmp-1727204639.2504423-47070-6250602089607/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204639.2504423-47070-6250602089607/AnsiballZ_network_connections.py" <<< 44071 1727204639.33092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204639.33316: stderr chunk (state=3): >>><<< 44071 1727204639.33320: stdout chunk (state=3): >>><<< 44071 1727204639.33323: done transferring module to remote 44071 1727204639.33325: _low_level_execute_command(): starting 44071 1727204639.33327: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204639.2504423-47070-6250602089607/ /root/.ansible/tmp/ansible-tmp-1727204639.2504423-47070-6250602089607/AnsiballZ_network_connections.py && sleep 0' 44071 1727204639.33925: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204639.33971: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204639.33987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204639.34079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204639.34096: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204639.34121: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204639.34231: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204639.36114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204639.36202: stderr chunk (state=3): >>><<< 44071 1727204639.36206: stdout chunk (state=3): >>><<< 44071 1727204639.36272: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204639.36276: _low_level_execute_command(): starting 44071 1727204639.36279: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204639.2504423-47070-6250602089607/AnsiballZ_network_connections.py && sleep 0' 44071 1727204639.36963: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204639.36988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204639.37004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204639.37204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204639.37283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204639.37317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204639.37335: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204639.37356: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204639.37475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204639.64831: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 8a139112-7ef3-44ae-a404-065d84fc2b3c skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": 
""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 44071 1727204639.66697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204639.66701: stdout chunk (state=3): >>><<< 44071 1727204639.66703: stderr chunk (state=3): >>><<< 44071 1727204639.66842: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 8a139112-7ef3-44ae-a404-065d84fc2b3c skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
44071 1727204639.66846: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204639.2504423-47070-6250602089607/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204639.66849: _low_level_execute_command(): starting 44071 1727204639.66851: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204639.2504423-47070-6250602089607/ > /dev/null 2>&1 && sleep 0' 44071 1727204639.67491: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204639.67518: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204639.67533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204639.67586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204639.67660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204639.67683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204639.67721: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204639.67818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204639.69973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204639.69977: stderr chunk (state=3): >>><<< 44071 1727204639.69980: stdout chunk (state=3): >>><<< 44071 1727204639.69982: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204639.69985: handler run complete 44071 1727204639.69987: attempt loop complete, returning result 44071 1727204639.69989: _execute() done 44071 1727204639.69991: dumping result to json 44071 1727204639.69993: done dumping result, returning 44071 1727204639.69996: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-c964-7471-000000000d26] 44071 1727204639.69998: sending task result for task 127b8e07-fff9-c964-7471-000000000d26 44071 1727204639.70082: done sending task result for task 127b8e07-fff9-c964-7471-000000000d26 44071 1727204639.70087: WORKER PROCESS EXITING ok: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 8a139112-7ef3-44ae-a404-065d84fc2b3c skipped because already active 44071 1727204639.70213: no more pending results, returning what we have 44071 1727204639.70217: results queue empty 44071 1727204639.70218: checking for any_errors_fatal 44071 1727204639.70229: done checking for any_errors_fatal 44071 1727204639.70230: checking for max_fail_percentage 44071 1727204639.70232: done checking for max_fail_percentage 44071 1727204639.70234: checking to see if all hosts have failed and the running result is not ok 44071 1727204639.70235: done checking to see if all hosts have failed 44071 1727204639.70235: getting the remaining hosts for this loop 44071 1727204639.70237: done getting the remaining hosts for this loop 44071 1727204639.70245: getting the next task for host managed-node2 44071 1727204639.70255: done getting next task for host managed-node2 44071 1727204639.70259: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204639.70264: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204639.70394: getting variables 44071 1727204639.70396: in VariableManager get_vars() 44071 1727204639.70438: Calling all_inventory to load vars for managed-node2 44071 1727204639.70443: Calling groups_inventory to load vars for managed-node2 44071 1727204639.70446: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204639.70457: Calling all_plugins_play to load vars for managed-node2 44071 1727204639.70460: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204639.70463: Calling groups_plugins_play to load vars for managed-node2 44071 1727204639.72816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204639.75176: done with get_vars() 44071 1727204639.75222: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:03:59 -0400 (0:00:00.626) 0:00:52.069 ***** 44071 1727204639.75333: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204639.75873: worker is 1 (out of 1 available) 44071 1727204639.75886: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204639.75899: done queuing things up, now waiting for results queue to drain 44071 1727204639.75900: waiting for pending results... 
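Editor's note: the task above ends "ok" with "changed": false because the nm provider found the statebr connection already active, so bringing it up was a no-op; the module still reports that decision on stderr ("skipped because already active"), which a later debug task surfaces. Below, the module arguments and result are restated as Python data copied from the log, followed by a purely illustrative idempotency check; the currently_active set is an assumption standing in for whatever NetworkManager actually reported, not the module's real logic.

# The module arguments and result reported above, as Python data (copied from the log).
module_args = {
    "provider": "nm",
    "connections": [{"name": "statebr", "state": "up"}],
    "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
    "ignore_errors": False,
    "force_state_change": False,
    "__debug_flags": "",
}
result = {"changed": False, "warnings": []}

# Illustrative idempotency check (not the module's implementation): nothing to do,
# so changed stays False and the task is reported as "ok" rather than "changed".
desired = {(c["name"], c["state"]) for c in module_args["connections"]}
currently_active = {("statebr", "up")}      # assumption: what the nm provider found
result["changed"] = bool(desired - currently_active)
print(result)                               # -> {'changed': False, 'warnings': []}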
44071 1727204639.76172: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204639.76348: in run() - task 127b8e07-fff9-c964-7471-000000000d27 44071 1727204639.76379: variable 'ansible_search_path' from source: unknown 44071 1727204639.76387: variable 'ansible_search_path' from source: unknown 44071 1727204639.76434: calling self._execute() 44071 1727204639.76557: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204639.76574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204639.76597: variable 'omit' from source: magic vars 44071 1727204639.77056: variable 'ansible_distribution_major_version' from source: facts 44071 1727204639.77129: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204639.77235: variable 'network_state' from source: role '' defaults 44071 1727204639.77258: Evaluated conditional (network_state != {}): False 44071 1727204639.77268: when evaluation is False, skipping this task 44071 1727204639.77356: _execute() done 44071 1727204639.77359: dumping result to json 44071 1727204639.77361: done dumping result, returning 44071 1727204639.77364: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-c964-7471-000000000d27] 44071 1727204639.77368: sending task result for task 127b8e07-fff9-c964-7471-000000000d27 44071 1727204639.77462: done sending task result for task 127b8e07-fff9-c964-7471-000000000d27 44071 1727204639.77465: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204639.77528: no more pending results, returning what we have 44071 1727204639.77533: results queue empty 44071 1727204639.77535: checking for any_errors_fatal 44071 1727204639.77553: done checking for any_errors_fatal 44071 1727204639.77554: checking for max_fail_percentage 44071 1727204639.77556: done checking for max_fail_percentage 44071 1727204639.77557: checking to see if all hosts have failed and the running result is not ok 44071 1727204639.77558: done checking to see if all hosts have failed 44071 1727204639.77559: getting the remaining hosts for this loop 44071 1727204639.77561: done getting the remaining hosts for this loop 44071 1727204639.77568: getting the next task for host managed-node2 44071 1727204639.77773: done getting next task for host managed-node2 44071 1727204639.77779: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204639.77785: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204639.77809: getting variables 44071 1727204639.77811: in VariableManager get_vars() 44071 1727204639.77857: Calling all_inventory to load vars for managed-node2 44071 1727204639.77860: Calling groups_inventory to load vars for managed-node2 44071 1727204639.77863: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204639.77882: Calling all_plugins_play to load vars for managed-node2 44071 1727204639.77886: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204639.77889: Calling groups_plugins_play to load vars for managed-node2 44071 1727204639.79888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204639.82406: done with get_vars() 44071 1727204639.82449: done getting variables 44071 1727204639.82522: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:03:59 -0400 (0:00:00.072) 0:00:52.142 ***** 44071 1727204639.82577: entering _queue_task() for managed-node2/debug 44071 1727204639.83061: worker is 1 (out of 1 available) 44071 1727204639.83084: exiting _queue_task() for managed-node2/debug 44071 1727204639.83101: done queuing things up, now waiting for results queue to drain 44071 1727204639.83103: waiting for pending results... 
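Editor's note: two things stand out in this stretch. "Configure networking state" is skipped because the role default network_state is an empty dict, so only the connection-profiles path of the role ran; and the following "Show stderr messages" task is a debug action, which executes on the controller: no ssh or _low_level_execute_command traffic appears for it, it simply resolves __network_connections_result and hands the value to the callback, which renders it as indented JSON. The sketch below approximates that render step, with the registered result reconstructed from the log; the *_lines derivation mirrors how Ansible exposes stderr_lines alongside stderr in registered results.

import json

# Approximation of the registered __network_connections_result (values taken from the log).
network_connections_result = {
    "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, "
              "8a139112-7ef3-44ae-a404-065d84fc2b3c skipped because already active\n",
}
# Ansible derives *_lines fields from stdout/stderr strings in registered results.
network_connections_result["stderr_lines"] = network_connections_result["stderr"].splitlines()

# The debug action just returns the requested value; the default callback prints it like this.
print("ok: [managed-node2] => " + json.dumps(
    {"__network_connections_result.stderr_lines": network_connections_result["stderr_lines"]},
    indent=4,
))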
44071 1727204639.83615: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204639.83622: in run() - task 127b8e07-fff9-c964-7471-000000000d28 44071 1727204639.83627: variable 'ansible_search_path' from source: unknown 44071 1727204639.83630: variable 'ansible_search_path' from source: unknown 44071 1727204639.83656: calling self._execute() 44071 1727204639.83784: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204639.83798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204639.83824: variable 'omit' from source: magic vars 44071 1727204639.84190: variable 'ansible_distribution_major_version' from source: facts 44071 1727204639.84201: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204639.84208: variable 'omit' from source: magic vars 44071 1727204639.84257: variable 'omit' from source: magic vars 44071 1727204639.84295: variable 'omit' from source: magic vars 44071 1727204639.84332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204639.84370: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204639.84388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204639.84403: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204639.84418: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204639.84443: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204639.84450: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204639.84452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204639.84532: Set connection var ansible_connection to ssh 44071 1727204639.84538: Set connection var ansible_timeout to 10 44071 1727204639.84545: Set connection var ansible_pipelining to False 44071 1727204639.84551: Set connection var ansible_shell_type to sh 44071 1727204639.84557: Set connection var ansible_shell_executable to /bin/sh 44071 1727204639.84564: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204639.84588: variable 'ansible_shell_executable' from source: unknown 44071 1727204639.84591: variable 'ansible_connection' from source: unknown 44071 1727204639.84596: variable 'ansible_module_compression' from source: unknown 44071 1727204639.84598: variable 'ansible_shell_type' from source: unknown 44071 1727204639.84601: variable 'ansible_shell_executable' from source: unknown 44071 1727204639.84603: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204639.84605: variable 'ansible_pipelining' from source: unknown 44071 1727204639.84607: variable 'ansible_timeout' from source: unknown 44071 1727204639.84613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204639.84733: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 
1727204639.84744: variable 'omit' from source: magic vars 44071 1727204639.84751: starting attempt loop 44071 1727204639.84754: running the handler 44071 1727204639.84870: variable '__network_connections_result' from source: set_fact 44071 1727204639.84914: handler run complete 44071 1727204639.84929: attempt loop complete, returning result 44071 1727204639.84932: _execute() done 44071 1727204639.84937: dumping result to json 44071 1727204639.84939: done dumping result, returning 44071 1727204639.84952: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-c964-7471-000000000d28] 44071 1727204639.84955: sending task result for task 127b8e07-fff9-c964-7471-000000000d28 44071 1727204639.85057: done sending task result for task 127b8e07-fff9-c964-7471-000000000d28 44071 1727204639.85061: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 8a139112-7ef3-44ae-a404-065d84fc2b3c skipped because already active" ] } 44071 1727204639.85136: no more pending results, returning what we have 44071 1727204639.85140: results queue empty 44071 1727204639.85141: checking for any_errors_fatal 44071 1727204639.85150: done checking for any_errors_fatal 44071 1727204639.85151: checking for max_fail_percentage 44071 1727204639.85153: done checking for max_fail_percentage 44071 1727204639.85154: checking to see if all hosts have failed and the running result is not ok 44071 1727204639.85154: done checking to see if all hosts have failed 44071 1727204639.85155: getting the remaining hosts for this loop 44071 1727204639.85157: done getting the remaining hosts for this loop 44071 1727204639.85161: getting the next task for host managed-node2 44071 1727204639.85177: done getting next task for host managed-node2 44071 1727204639.85183: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204639.85187: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204639.85200: getting variables 44071 1727204639.85202: in VariableManager get_vars() 44071 1727204639.85239: Calling all_inventory to load vars for managed-node2 44071 1727204639.85242: Calling groups_inventory to load vars for managed-node2 44071 1727204639.85244: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204639.85255: Calling all_plugins_play to load vars for managed-node2 44071 1727204639.85257: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204639.85260: Calling groups_plugins_play to load vars for managed-node2 44071 1727204639.86839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204639.88177: done with get_vars() 44071 1727204639.88208: done getting variables 44071 1727204639.88262: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:03:59 -0400 (0:00:00.057) 0:00:52.199 ***** 44071 1727204639.88299: entering _queue_task() for managed-node2/debug 44071 1727204639.88595: worker is 1 (out of 1 available) 44071 1727204639.88612: exiting _queue_task() for managed-node2/debug 44071 1727204639.88626: done queuing things up, now waiting for results queue to drain 44071 1727204639.88628: waiting for pending results... 44071 1727204639.88830: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204639.88944: in run() - task 127b8e07-fff9-c964-7471-000000000d29 44071 1727204639.88957: variable 'ansible_search_path' from source: unknown 44071 1727204639.88963: variable 'ansible_search_path' from source: unknown 44071 1727204639.88999: calling self._execute() 44071 1727204639.89088: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204639.89094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204639.89104: variable 'omit' from source: magic vars 44071 1727204639.89572: variable 'ansible_distribution_major_version' from source: facts 44071 1727204639.89576: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204639.89579: variable 'omit' from source: magic vars 44071 1727204639.89582: variable 'omit' from source: magic vars 44071 1727204639.89625: variable 'omit' from source: magic vars 44071 1727204639.89681: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204639.89724: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204639.89756: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204639.89784: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204639.89804: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204639.89842: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204639.89853: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204639.89861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204639.89981: Set connection var ansible_connection to ssh 44071 1727204639.89995: Set connection var ansible_timeout to 10 44071 1727204639.90006: Set connection var ansible_pipelining to False 44071 1727204639.90015: Set connection var ansible_shell_type to sh 44071 1727204639.90027: Set connection var ansible_shell_executable to /bin/sh 44071 1727204639.90040: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204639.90171: variable 'ansible_shell_executable' from source: unknown 44071 1727204639.90174: variable 'ansible_connection' from source: unknown 44071 1727204639.90177: variable 'ansible_module_compression' from source: unknown 44071 1727204639.90179: variable 'ansible_shell_type' from source: unknown 44071 1727204639.90181: variable 'ansible_shell_executable' from source: unknown 44071 1727204639.90183: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204639.90185: variable 'ansible_pipelining' from source: unknown 44071 1727204639.90188: variable 'ansible_timeout' from source: unknown 44071 1727204639.90190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204639.90283: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204639.90302: variable 'omit' from source: magic vars 44071 1727204639.90312: starting attempt loop 44071 1727204639.90319: running the handler 44071 1727204639.90382: variable '__network_connections_result' from source: set_fact 44071 1727204639.90479: variable '__network_connections_result' from source: set_fact 44071 1727204639.90603: handler run complete 44071 1727204639.90641: attempt loop complete, returning result 44071 1727204639.90650: _execute() done 44071 1727204639.90658: dumping result to json 44071 1727204639.90670: done dumping result, returning 44071 1727204639.90685: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-c964-7471-000000000d29] 44071 1727204639.90696: sending task result for task 127b8e07-fff9-c964-7471-000000000d29 ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 8a139112-7ef3-44ae-a404-065d84fc2b3c skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 8a139112-7ef3-44ae-a404-065d84fc2b3c skipped because already active" ] } } 44071 1727204639.91168: no more pending results, returning what we have 44071 1727204639.91173: results queue empty 44071 1727204639.91173: checking for any_errors_fatal 44071 1727204639.91180: done checking for any_errors_fatal 44071 1727204639.91181: 
checking for max_fail_percentage 44071 1727204639.91182: done checking for max_fail_percentage 44071 1727204639.91183: checking to see if all hosts have failed and the running result is not ok 44071 1727204639.91184: done checking to see if all hosts have failed 44071 1727204639.91185: getting the remaining hosts for this loop 44071 1727204639.91186: done getting the remaining hosts for this loop 44071 1727204639.91190: getting the next task for host managed-node2 44071 1727204639.91198: done getting next task for host managed-node2 44071 1727204639.91202: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204639.91207: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204639.91226: getting variables 44071 1727204639.91228: in VariableManager get_vars() 44071 1727204639.91275: Calling all_inventory to load vars for managed-node2 44071 1727204639.91278: Calling groups_inventory to load vars for managed-node2 44071 1727204639.91281: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204639.91293: Calling all_plugins_play to load vars for managed-node2 44071 1727204639.91309: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204639.91313: Calling groups_plugins_play to load vars for managed-node2 44071 1727204639.91303: done sending task result for task 127b8e07-fff9-c964-7471-000000000d29 44071 1727204639.91986: WORKER PROCESS EXITING 44071 1727204639.93248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204639.95612: done with get_vars() 44071 1727204639.95667: done getting variables 44071 1727204639.95737: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:03:59 -0400 (0:00:00.074) 0:00:52.274 ***** 44071 1727204639.95780: entering _queue_task() for managed-node2/debug 44071 1727204639.96281: worker is 1 (out of 1 available) 44071 1727204639.96296: exiting _queue_task() for managed-node2/debug 44071 1727204639.96309: done queuing things up, now waiting for results queue to drain 44071 1727204639.96311: waiting for pending results... 
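The task queued here (roles/network/tasks/main.yml:186) never runs: as the next trace lines show, its when-condition evaluates to False because network_state is still the role's empty default. A minimal sketch of a debug task guarded that way, assuming only the condition string reported in the skip result (network_state != {}); the variable actually printed by the real role task is not visible in this log, so it is a placeholder here:

- name: Show debug messages for the network_state
  ansible.builtin.debug:
    var: network_state  # placeholder: the real task may debug a different variable
  when: network_state != {}

With network_state empty, Ansible reports skipping: [managed-node2] with false_condition "network_state != {}", which is what appears below.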
44071 1727204639.96549: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204639.96741: in run() - task 127b8e07-fff9-c964-7471-000000000d2a 44071 1727204639.96770: variable 'ansible_search_path' from source: unknown 44071 1727204639.96784: variable 'ansible_search_path' from source: unknown 44071 1727204639.96832: calling self._execute() 44071 1727204639.96946: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204639.96960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204639.96976: variable 'omit' from source: magic vars 44071 1727204639.97397: variable 'ansible_distribution_major_version' from source: facts 44071 1727204639.97417: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204639.97559: variable 'network_state' from source: role '' defaults 44071 1727204639.97580: Evaluated conditional (network_state != {}): False 44071 1727204639.97589: when evaluation is False, skipping this task 44071 1727204639.97595: _execute() done 44071 1727204639.97602: dumping result to json 44071 1727204639.97648: done dumping result, returning 44071 1727204639.97652: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-c964-7471-000000000d2a] 44071 1727204639.97654: sending task result for task 127b8e07-fff9-c964-7471-000000000d2a skipping: [managed-node2] => { "false_condition": "network_state != {}" } 44071 1727204639.98013: no more pending results, returning what we have 44071 1727204639.98017: results queue empty 44071 1727204639.98018: checking for any_errors_fatal 44071 1727204639.98028: done checking for any_errors_fatal 44071 1727204639.98029: checking for max_fail_percentage 44071 1727204639.98030: done checking for max_fail_percentage 44071 1727204639.98031: checking to see if all hosts have failed and the running result is not ok 44071 1727204639.98032: done checking to see if all hosts have failed 44071 1727204639.98033: getting the remaining hosts for this loop 44071 1727204639.98034: done getting the remaining hosts for this loop 44071 1727204639.98039: getting the next task for host managed-node2 44071 1727204639.98046: done getting next task for host managed-node2 44071 1727204639.98051: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204639.98056: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204639.98078: getting variables 44071 1727204639.98079: in VariableManager get_vars() 44071 1727204639.98119: Calling all_inventory to load vars for managed-node2 44071 1727204639.98122: Calling groups_inventory to load vars for managed-node2 44071 1727204639.98124: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204639.98136: Calling all_plugins_play to load vars for managed-node2 44071 1727204639.98139: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204639.98142: Calling groups_plugins_play to load vars for managed-node2 44071 1727204639.98705: done sending task result for task 127b8e07-fff9-c964-7471-000000000d2a 44071 1727204639.98709: WORKER PROCESS EXITING 44071 1727204640.00255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204640.02626: done with get_vars() 44071 1727204640.02662: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:04:00 -0400 (0:00:00.069) 0:00:52.344 ***** 44071 1727204640.02781: entering _queue_task() for managed-node2/ping 44071 1727204640.03284: worker is 1 (out of 1 available) 44071 1727204640.03300: exiting _queue_task() for managed-node2/ping 44071 1727204640.03313: done queuing things up, now waiting for results queue to drain 44071 1727204640.03315: waiting for pending results... 44071 1727204640.03557: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204640.03748: in run() - task 127b8e07-fff9-c964-7471-000000000d2b 44071 1727204640.03775: variable 'ansible_search_path' from source: unknown 44071 1727204640.03786: variable 'ansible_search_path' from source: unknown 44071 1727204640.03830: calling self._execute() 44071 1727204640.03946: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204640.03960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204640.03981: variable 'omit' from source: magic vars 44071 1727204640.04411: variable 'ansible_distribution_major_version' from source: facts 44071 1727204640.04438: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204640.04452: variable 'omit' from source: magic vars 44071 1727204640.04544: variable 'omit' from source: magic vars 44071 1727204640.04594: variable 'omit' from source: magic vars 44071 1727204640.04649: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204640.04699: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204640.04729: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204640.04765: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204640.04778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204640.04873: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204640.04876: variable 'ansible_host' from source: host vars 
for 'managed-node2' 44071 1727204640.04879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204640.04947: Set connection var ansible_connection to ssh 44071 1727204640.04960: Set connection var ansible_timeout to 10 44071 1727204640.04974: Set connection var ansible_pipelining to False 44071 1727204640.04993: Set connection var ansible_shell_type to sh 44071 1727204640.05005: Set connection var ansible_shell_executable to /bin/sh 44071 1727204640.05019: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204640.05050: variable 'ansible_shell_executable' from source: unknown 44071 1727204640.05059: variable 'ansible_connection' from source: unknown 44071 1727204640.05092: variable 'ansible_module_compression' from source: unknown 44071 1727204640.05095: variable 'ansible_shell_type' from source: unknown 44071 1727204640.05097: variable 'ansible_shell_executable' from source: unknown 44071 1727204640.05099: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204640.05101: variable 'ansible_pipelining' from source: unknown 44071 1727204640.05104: variable 'ansible_timeout' from source: unknown 44071 1727204640.05106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204640.05342: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204640.05378: variable 'omit' from source: magic vars 44071 1727204640.05382: starting attempt loop 44071 1727204640.05385: running the handler 44071 1727204640.05418: _low_level_execute_command(): starting 44071 1727204640.05422: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204640.06273: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204640.06297: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204640.06344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204640.06372: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204640.06489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204640.08268: stdout chunk (state=3): >>>/root <<< 44071 1727204640.08388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204640.08489: stderr chunk (state=3): >>><<< 44071 1727204640.08511: stdout chunk (state=3): 
>>><<< 44071 1727204640.08639: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204640.08643: _low_level_execute_command(): starting 44071 1727204640.08647: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204640.085337-47100-123461909859236 `" && echo ansible-tmp-1727204640.085337-47100-123461909859236="` echo /root/.ansible/tmp/ansible-tmp-1727204640.085337-47100-123461909859236 `" ) && sleep 0' 44071 1727204640.09263: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204640.09284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204640.09402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204640.09426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204640.09457: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204640.09572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204640.11652: stdout chunk (state=3): >>>ansible-tmp-1727204640.085337-47100-123461909859236=/root/.ansible/tmp/ansible-tmp-1727204640.085337-47100-123461909859236 <<< 44071 1727204640.11879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204640.11884: stdout chunk (state=3): >>><<< 44071 1727204640.11886: stderr chunk (state=3): >>><<< 44071 
1727204640.11889: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204640.085337-47100-123461909859236=/root/.ansible/tmp/ansible-tmp-1727204640.085337-47100-123461909859236 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204640.11922: variable 'ansible_module_compression' from source: unknown 44071 1727204640.11982: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 44071 1727204640.12032: variable 'ansible_facts' from source: unknown 44071 1727204640.12171: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204640.085337-47100-123461909859236/AnsiballZ_ping.py 44071 1727204640.12376: Sending initial data 44071 1727204640.12427: Sent initial data (152 bytes) 44071 1727204640.12905: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204640.12910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204640.12935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204640.12938: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204640.12944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204640.12999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204640.13003: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204640.13013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204640.13093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204640.14769: 
stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204640.14864: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204640.14946: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmph73geb0r /root/.ansible/tmp/ansible-tmp-1727204640.085337-47100-123461909859236/AnsiballZ_ping.py <<< 44071 1727204640.14950: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204640.085337-47100-123461909859236/AnsiballZ_ping.py" <<< 44071 1727204640.14997: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmph73geb0r" to remote "/root/.ansible/tmp/ansible-tmp-1727204640.085337-47100-123461909859236/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204640.085337-47100-123461909859236/AnsiballZ_ping.py" <<< 44071 1727204640.15905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204640.15939: stderr chunk (state=3): >>><<< 44071 1727204640.15952: stdout chunk (state=3): >>><<< 44071 1727204640.15986: done transferring module to remote 44071 1727204640.16013: _low_level_execute_command(): starting 44071 1727204640.16057: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204640.085337-47100-123461909859236/ /root/.ansible/tmp/ansible-tmp-1727204640.085337-47100-123461909859236/AnsiballZ_ping.py && sleep 0' 44071 1727204640.16614: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204640.16631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204640.16650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204640.16703: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204640.16710: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 44071 1727204640.16712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204640.16787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204640.18874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204640.18879: stdout chunk (state=3): >>><<< 44071 1727204640.18887: stderr chunk (state=3): >>><<< 44071 1727204640.18891: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204640.18893: _low_level_execute_command(): starting 44071 1727204640.18895: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204640.085337-47100-123461909859236/AnsiballZ_ping.py && sleep 0' 44071 1727204640.19455: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204640.19462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204640.19487: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204640.19491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204640.19556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204640.19560: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204640.19639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204640.35930: stdout chunk (state=3): >>> {"ping": "pong", "invocation": 
{"module_args": {"data": "pong"}}} <<< 44071 1727204640.37295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204640.37357: stderr chunk (state=3): >>><<< 44071 1727204640.37360: stdout chunk (state=3): >>><<< 44071 1727204640.37380: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204640.37407: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204640.085337-47100-123461909859236/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204640.37418: _low_level_execute_command(): starting 44071 1727204640.37423: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204640.085337-47100-123461909859236/ > /dev/null 2>&1 && sleep 0' 44071 1727204640.37933: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204640.37939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204640.37945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204640.38002: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204640.38006: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204640.38013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204640.38091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204640.40027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204640.40087: stderr chunk (state=3): >>><<< 44071 1727204640.40091: stdout chunk (state=3): >>><<< 44071 1727204640.40105: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204640.40113: handler run complete 44071 1727204640.40131: attempt loop complete, returning result 44071 1727204640.40135: _execute() done 44071 1727204640.40137: dumping result to json 44071 1727204640.40145: done dumping result, returning 44071 1727204640.40152: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-c964-7471-000000000d2b] 44071 1727204640.40157: sending task result for task 127b8e07-fff9-c964-7471-000000000d2b 44071 1727204640.40258: done sending task result for task 127b8e07-fff9-c964-7471-000000000d2b 44071 1727204640.40261: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 44071 1727204640.40338: no more pending results, returning what we have 44071 1727204640.40344: results queue empty 44071 1727204640.40345: checking for any_errors_fatal 44071 1727204640.40355: done checking for any_errors_fatal 44071 1727204640.40356: checking for max_fail_percentage 44071 1727204640.40358: done checking for max_fail_percentage 44071 1727204640.40359: checking to see if all hosts have failed and the running result is not ok 44071 1727204640.40359: done checking to see if all hosts have failed 44071 1727204640.40360: getting the remaining hosts for this loop 44071 1727204640.40362: done getting the remaining hosts for this loop 44071 1727204640.40372: getting the next task for host managed-node2 44071 1727204640.40386: done getting next task for host managed-node2 44071 
1727204640.40390: ^ task is: TASK: meta (role_complete) 44071 1727204640.40395: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204640.40407: getting variables 44071 1727204640.40408: in VariableManager get_vars() 44071 1727204640.40451: Calling all_inventory to load vars for managed-node2 44071 1727204640.40454: Calling groups_inventory to load vars for managed-node2 44071 1727204640.40456: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204640.40468: Calling all_plugins_play to load vars for managed-node2 44071 1727204640.40471: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204640.40474: Calling groups_plugins_play to load vars for managed-node2 44071 1727204640.41517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204640.42741: done with get_vars() 44071 1727204640.42781: done getting variables 44071 1727204640.42853: done queuing things up, now waiting for results queue to drain 44071 1727204640.42855: results queue empty 44071 1727204640.42856: checking for any_errors_fatal 44071 1727204640.42859: done checking for any_errors_fatal 44071 1727204640.42860: checking for max_fail_percentage 44071 1727204640.42860: done checking for max_fail_percentage 44071 1727204640.42861: checking to see if all hosts have failed and the running result is not ok 44071 1727204640.42861: done checking to see if all hosts have failed 44071 1727204640.42862: getting the remaining hosts for this loop 44071 1727204640.42863: done getting the remaining hosts for this loop 44071 1727204640.42867: getting the next task for host managed-node2 44071 1727204640.42872: done getting next task for host managed-node2 44071 1727204640.42874: ^ task is: TASK: Asserts 44071 1727204640.42876: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204640.42879: getting variables 44071 1727204640.42879: in VariableManager get_vars() 44071 1727204640.42889: Calling all_inventory to load vars for managed-node2 44071 1727204640.42891: Calling groups_inventory to load vars for managed-node2 44071 1727204640.42892: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204640.42896: Calling all_plugins_play to load vars for managed-node2 44071 1727204640.42898: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204640.42900: Calling groups_plugins_play to load vars for managed-node2 44071 1727204640.43886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204640.45097: done with get_vars() 44071 1727204640.45128: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Tuesday 24 September 2024 15:04:00 -0400 (0:00:00.424) 0:00:52.768 ***** 44071 1727204640.45199: entering _queue_task() for managed-node2/include_tasks 44071 1727204640.45577: worker is 1 (out of 1 available) 44071 1727204640.45593: exiting _queue_task() for managed-node2/include_tasks 44071 1727204640.45608: done queuing things up, now waiting for results queue to drain 44071 1727204640.45610: waiting for pending results... 44071 1727204640.45811: running TaskExecutor() for managed-node2/TASK: Asserts 44071 1727204640.45904: in run() - task 127b8e07-fff9-c964-7471-000000000a4e 44071 1727204640.45916: variable 'ansible_search_path' from source: unknown 44071 1727204640.45920: variable 'ansible_search_path' from source: unknown 44071 1727204640.45966: variable 'lsr_assert' from source: include params 44071 1727204640.46151: variable 'lsr_assert' from source: include params 44071 1727204640.46213: variable 'omit' from source: magic vars 44071 1727204640.46332: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204640.46340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204640.46350: variable 'omit' from source: magic vars 44071 1727204640.46546: variable 'ansible_distribution_major_version' from source: facts 44071 1727204640.46553: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204640.46560: variable 'item' from source: unknown 44071 1727204640.46613: variable 'item' from source: unknown 44071 1727204640.46639: variable 'item' from source: unknown 44071 1727204640.46687: variable 'item' from source: unknown 44071 1727204640.46840: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204640.46847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204640.46850: variable 'omit' from source: magic vars 44071 1727204640.46922: variable 'ansible_distribution_major_version' from source: facts 44071 1727204640.46926: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204640.46933: variable 'item' from source: unknown 44071 1727204640.46983: variable 'item' from source: unknown 44071 1727204640.47007: variable 'item' from source: unknown 44071 1727204640.47050: variable 'item' from source: unknown 44071 1727204640.47128: dumping result to json 44071 1727204640.47130: done dumping result, returning 44071 1727204640.47133: done running TaskExecutor() for managed-node2/TASK: Asserts 
[127b8e07-fff9-c964-7471-000000000a4e] 44071 1727204640.47135: sending task result for task 127b8e07-fff9-c964-7471-000000000a4e 44071 1727204640.47179: done sending task result for task 127b8e07-fff9-c964-7471-000000000a4e 44071 1727204640.47182: WORKER PROCESS EXITING 44071 1727204640.47210: no more pending results, returning what we have 44071 1727204640.47215: in VariableManager get_vars() 44071 1727204640.47258: Calling all_inventory to load vars for managed-node2 44071 1727204640.47261: Calling groups_inventory to load vars for managed-node2 44071 1727204640.47270: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204640.47285: Calling all_plugins_play to load vars for managed-node2 44071 1727204640.47289: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204640.47296: Calling groups_plugins_play to load vars for managed-node2 44071 1727204640.48387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204640.49710: done with get_vars() 44071 1727204640.49729: variable 'ansible_search_path' from source: unknown 44071 1727204640.49730: variable 'ansible_search_path' from source: unknown 44071 1727204640.49771: variable 'ansible_search_path' from source: unknown 44071 1727204640.49772: variable 'ansible_search_path' from source: unknown 44071 1727204640.49794: we have included files to process 44071 1727204640.49795: generating all_blocks data 44071 1727204640.49797: done generating all_blocks data 44071 1727204640.49802: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 44071 1727204640.49803: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 44071 1727204640.49804: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 44071 1727204640.49892: in VariableManager get_vars() 44071 1727204640.49908: done with get_vars() 44071 1727204640.50000: done processing included file 44071 1727204640.50002: iterating over new_blocks loaded from include file 44071 1727204640.50003: in VariableManager get_vars() 44071 1727204640.50013: done with get_vars() 44071 1727204640.50014: filtering new block on tags 44071 1727204640.50040: done filtering new block on tags 44071 1727204640.50043: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node2 => (item=tasks/assert_device_present.yml) 44071 1727204640.50048: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 44071 1727204640.50048: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 44071 1727204640.50051: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 44071 1727204640.50127: in VariableManager get_vars() 44071 1727204640.50140: done with get_vars() 44071 1727204640.50311: done processing included file 44071 1727204640.50312: iterating over new_blocks loaded from include file 44071 1727204640.50313: in VariableManager get_vars() 44071 
1727204640.50324: done with get_vars() 44071 1727204640.50326: filtering new block on tags 44071 1727204640.50360: done filtering new block on tags 44071 1727204640.50362: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node2 => (item=tasks/assert_profile_present.yml) 44071 1727204640.50367: extending task lists for all hosts with included blocks 44071 1727204640.56049: done extending task lists 44071 1727204640.56050: done processing included files 44071 1727204640.56051: results queue empty 44071 1727204640.56052: checking for any_errors_fatal 44071 1727204640.56053: done checking for any_errors_fatal 44071 1727204640.56054: checking for max_fail_percentage 44071 1727204640.56055: done checking for max_fail_percentage 44071 1727204640.56056: checking to see if all hosts have failed and the running result is not ok 44071 1727204640.56057: done checking to see if all hosts have failed 44071 1727204640.56058: getting the remaining hosts for this loop 44071 1727204640.56059: done getting the remaining hosts for this loop 44071 1727204640.56060: getting the next task for host managed-node2 44071 1727204640.56064: done getting next task for host managed-node2 44071 1727204640.56067: ^ task is: TASK: Include the task 'get_interface_stat.yml' 44071 1727204640.56070: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204640.56078: getting variables 44071 1727204640.56079: in VariableManager get_vars() 44071 1727204640.56089: Calling all_inventory to load vars for managed-node2 44071 1727204640.56091: Calling groups_inventory to load vars for managed-node2 44071 1727204640.56093: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204640.56099: Calling all_plugins_play to load vars for managed-node2 44071 1727204640.56100: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204640.56102: Calling groups_plugins_play to load vars for managed-node2 44071 1727204640.56972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204640.58183: done with get_vars() 44071 1727204640.58212: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 15:04:00 -0400 (0:00:00.130) 0:00:52.899 ***** 44071 1727204640.58283: entering _queue_task() for managed-node2/include_tasks 44071 1727204640.58597: worker is 1 (out of 1 available) 44071 1727204640.58612: exiting _queue_task() for managed-node2/include_tasks 44071 1727204640.58626: done queuing things up, now waiting for results queue to drain 44071 1727204640.58628: waiting for pending results... 44071 1727204640.58847: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 44071 1727204640.58937: in run() - task 127b8e07-fff9-c964-7471-000000000e86 44071 1727204640.58950: variable 'ansible_search_path' from source: unknown 44071 1727204640.58954: variable 'ansible_search_path' from source: unknown 44071 1727204640.58989: calling self._execute() 44071 1727204640.59079: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204640.59086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204640.59095: variable 'omit' from source: magic vars 44071 1727204640.59438: variable 'ansible_distribution_major_version' from source: facts 44071 1727204640.59453: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204640.59457: _execute() done 44071 1727204640.59460: dumping result to json 44071 1727204640.59463: done dumping result, returning 44071 1727204640.59472: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [127b8e07-fff9-c964-7471-000000000e86] 44071 1727204640.59478: sending task result for task 127b8e07-fff9-c964-7471-000000000e86 44071 1727204640.59585: done sending task result for task 127b8e07-fff9-c964-7471-000000000e86 44071 1727204640.59588: WORKER PROCESS EXITING 44071 1727204640.59617: no more pending results, returning what we have 44071 1727204640.59622: in VariableManager get_vars() 44071 1727204640.59669: Calling all_inventory to load vars for managed-node2 44071 1727204640.59672: Calling groups_inventory to load vars for managed-node2 44071 1727204640.59676: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204640.59691: Calling all_plugins_play to load vars for managed-node2 44071 1727204640.59694: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204640.59697: Calling groups_plugins_play to load vars for managed-node2 44071 1727204640.60855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 44071 1727204640.62094: done with get_vars() 44071 1727204640.62118: variable 'ansible_search_path' from source: unknown 44071 1727204640.62119: variable 'ansible_search_path' from source: unknown 44071 1727204640.62129: variable 'item' from source: include params 44071 1727204640.62229: variable 'item' from source: include params 44071 1727204640.62261: we have included files to process 44071 1727204640.62262: generating all_blocks data 44071 1727204640.62266: done generating all_blocks data 44071 1727204640.62268: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44071 1727204640.62269: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44071 1727204640.62271: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44071 1727204640.62421: done processing included file 44071 1727204640.62424: iterating over new_blocks loaded from include file 44071 1727204640.62425: in VariableManager get_vars() 44071 1727204640.62439: done with get_vars() 44071 1727204640.62440: filtering new block on tags 44071 1727204640.62461: done filtering new block on tags 44071 1727204640.62463: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 44071 1727204640.62468: extending task lists for all hosts with included blocks 44071 1727204640.62583: done extending task lists 44071 1727204640.62584: done processing included files 44071 1727204640.62584: results queue empty 44071 1727204640.62585: checking for any_errors_fatal 44071 1727204640.62588: done checking for any_errors_fatal 44071 1727204640.62589: checking for max_fail_percentage 44071 1727204640.62589: done checking for max_fail_percentage 44071 1727204640.62590: checking to see if all hosts have failed and the running result is not ok 44071 1727204640.62591: done checking to see if all hosts have failed 44071 1727204640.62591: getting the remaining hosts for this loop 44071 1727204640.62592: done getting the remaining hosts for this loop 44071 1727204640.62594: getting the next task for host managed-node2 44071 1727204640.62598: done getting next task for host managed-node2 44071 1727204640.62600: ^ task is: TASK: Get stat for interface {{ interface }} 44071 1727204640.62602: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204640.62605: getting variables 44071 1727204640.62606: in VariableManager get_vars() 44071 1727204640.62614: Calling all_inventory to load vars for managed-node2 44071 1727204640.62616: Calling groups_inventory to load vars for managed-node2 44071 1727204640.62618: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204640.62622: Calling all_plugins_play to load vars for managed-node2 44071 1727204640.62624: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204640.62626: Calling groups_plugins_play to load vars for managed-node2 44071 1727204640.63537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204640.64825: done with get_vars() 44071 1727204640.64848: done getting variables 44071 1727204640.64959: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:04:00 -0400 (0:00:00.067) 0:00:52.966 ***** 44071 1727204640.64989: entering _queue_task() for managed-node2/stat 44071 1727204640.65293: worker is 1 (out of 1 available) 44071 1727204640.65308: exiting _queue_task() for managed-node2/stat 44071 1727204640.65322: done queuing things up, now waiting for results queue to drain 44071 1727204640.65324: waiting for pending results... 44071 1727204640.65529: running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr 44071 1727204640.65622: in run() - task 127b8e07-fff9-c964-7471-000000000ef5 44071 1727204640.65638: variable 'ansible_search_path' from source: unknown 44071 1727204640.65643: variable 'ansible_search_path' from source: unknown 44071 1727204640.65679: calling self._execute() 44071 1727204640.65761: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204640.65768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204640.65782: variable 'omit' from source: magic vars 44071 1727204640.66097: variable 'ansible_distribution_major_version' from source: facts 44071 1727204640.66110: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204640.66114: variable 'omit' from source: magic vars 44071 1727204640.66164: variable 'omit' from source: magic vars 44071 1727204640.66243: variable 'interface' from source: play vars 44071 1727204640.66257: variable 'omit' from source: magic vars 44071 1727204640.66296: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204640.66327: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204640.66436: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204640.66441: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204640.66446: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204640.66450: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 
1727204640.66452: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204640.66455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204640.66485: Set connection var ansible_connection to ssh 44071 1727204640.66490: Set connection var ansible_timeout to 10 44071 1727204640.66496: Set connection var ansible_pipelining to False 44071 1727204640.66502: Set connection var ansible_shell_type to sh 44071 1727204640.66507: Set connection var ansible_shell_executable to /bin/sh 44071 1727204640.66514: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204640.66536: variable 'ansible_shell_executable' from source: unknown 44071 1727204640.66539: variable 'ansible_connection' from source: unknown 44071 1727204640.66545: variable 'ansible_module_compression' from source: unknown 44071 1727204640.66548: variable 'ansible_shell_type' from source: unknown 44071 1727204640.66552: variable 'ansible_shell_executable' from source: unknown 44071 1727204640.66554: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204640.66556: variable 'ansible_pipelining' from source: unknown 44071 1727204640.66559: variable 'ansible_timeout' from source: unknown 44071 1727204640.66562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204640.66725: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204640.66735: variable 'omit' from source: magic vars 44071 1727204640.66740: starting attempt loop 44071 1727204640.66746: running the handler 44071 1727204640.66756: _low_level_execute_command(): starting 44071 1727204640.66763: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204640.67327: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204640.67332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204640.67336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204640.67399: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204640.67405: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204640.67486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204640.69260: stdout chunk (state=3): >>>/root <<< 44071 1727204640.69387: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 44071 1727204640.69434: stderr chunk (state=3): >>><<< 44071 1727204640.69438: stdout chunk (state=3): >>><<< 44071 1727204640.69464: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204640.69480: _low_level_execute_command(): starting 44071 1727204640.69488: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204640.6946332-47121-33818863694272 `" && echo ansible-tmp-1727204640.6946332-47121-33818863694272="` echo /root/.ansible/tmp/ansible-tmp-1727204640.6946332-47121-33818863694272 `" ) && sleep 0' 44071 1727204640.69998: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204640.70002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204640.70013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204640.70067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204640.70078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204640.70081: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204640.70142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204640.72134: stdout chunk (state=3): >>>ansible-tmp-1727204640.6946332-47121-33818863694272=/root/.ansible/tmp/ansible-tmp-1727204640.6946332-47121-33818863694272 <<< 44071 1727204640.72247: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 44071 1727204640.72312: stderr chunk (state=3): >>><<< 44071 1727204640.72315: stdout chunk (state=3): >>><<< 44071 1727204640.72334: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204640.6946332-47121-33818863694272=/root/.ansible/tmp/ansible-tmp-1727204640.6946332-47121-33818863694272 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204640.72385: variable 'ansible_module_compression' from source: unknown 44071 1727204640.72438: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 44071 1727204640.72470: variable 'ansible_facts' from source: unknown 44071 1727204640.72535: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204640.6946332-47121-33818863694272/AnsiballZ_stat.py 44071 1727204640.72653: Sending initial data 44071 1727204640.72657: Sent initial data (152 bytes) 44071 1727204640.73163: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204640.73170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204640.73172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204640.73234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204640.73239: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204640.73241: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204640.73304: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 44071 1727204640.74931: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204640.74996: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204640.75075: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpbg1k6jcw /root/.ansible/tmp/ansible-tmp-1727204640.6946332-47121-33818863694272/AnsiballZ_stat.py <<< 44071 1727204640.75078: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204640.6946332-47121-33818863694272/AnsiballZ_stat.py" <<< 44071 1727204640.75140: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpbg1k6jcw" to remote "/root/.ansible/tmp/ansible-tmp-1727204640.6946332-47121-33818863694272/AnsiballZ_stat.py" <<< 44071 1727204640.75148: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204640.6946332-47121-33818863694272/AnsiballZ_stat.py" <<< 44071 1727204640.75794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204640.75874: stderr chunk (state=3): >>><<< 44071 1727204640.75878: stdout chunk (state=3): >>><<< 44071 1727204640.75897: done transferring module to remote 44071 1727204640.75908: _low_level_execute_command(): starting 44071 1727204640.75913: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204640.6946332-47121-33818863694272/ /root/.ansible/tmp/ansible-tmp-1727204640.6946332-47121-33818863694272/AnsiballZ_stat.py && sleep 0' 44071 1727204640.76403: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204640.76407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204640.76410: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204640.76412: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204640.76414: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204640.76468: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204640.76479: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204640.76558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204640.78381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204640.78445: stderr chunk (state=3): >>><<< 44071 1727204640.78449: stdout chunk (state=3): >>><<< 44071 1727204640.78462: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204640.78466: _low_level_execute_command(): starting 44071 1727204640.78472: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204640.6946332-47121-33818863694272/AnsiballZ_stat.py && sleep 0' 44071 1727204640.78972: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204640.78978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204640.78981: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204640.79000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204640.79043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204640.79047: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204640.79049: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 44071 1727204640.79132: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204640.96028: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 37468, "dev": 23, "nlink": 1, "atime": 1727204631.7995505, "mtime": 1727204631.7995505, "ctime": 1727204631.7995505, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 44071 1727204640.97489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204640.97556: stderr chunk (state=3): >>><<< 44071 1727204640.97559: stdout chunk (state=3): >>><<< 44071 1727204640.97577: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 37468, "dev": 23, "nlink": 1, "atime": 1727204631.7995505, "mtime": 1727204631.7995505, "ctime": 1727204631.7995505, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204640.97623: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204640.6946332-47121-33818863694272/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204640.97632: _low_level_execute_command(): starting 44071 1727204640.97639: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204640.6946332-47121-33818863694272/ > /dev/null 2>&1 && sleep 0' 44071 1727204640.98153: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204640.98158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204640.98161: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204640.98163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204640.98222: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204640.98231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204640.98233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204640.98303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204641.00262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204641.00324: stderr chunk (state=3): >>><<< 44071 1727204641.00328: stdout chunk (state=3): >>><<< 44071 1727204641.00347: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204641.00350: handler run complete 44071 1727204641.00391: attempt loop complete, returning result 44071 1727204641.00394: _execute() done 44071 1727204641.00396: dumping result to json 44071 1727204641.00402: done dumping result, returning 44071 1727204641.00410: done running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr [127b8e07-fff9-c964-7471-000000000ef5] 44071 1727204641.00414: sending task result for task 127b8e07-fff9-c964-7471-000000000ef5 44071 1727204641.00536: done sending task result for task 127b8e07-fff9-c964-7471-000000000ef5 44071 1727204641.00539: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204631.7995505, "block_size": 4096, "blocks": 0, "ctime": 1727204631.7995505, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 37468, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "mode": "0777", "mtime": 1727204631.7995505, "nlink": 1, "path": "/sys/class/net/statebr", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 44071 1727204641.00644: no more pending results, returning what we have 44071 1727204641.00648: results queue empty 44071 1727204641.00649: checking for any_errors_fatal 44071 1727204641.00651: done checking for any_errors_fatal 44071 1727204641.00652: checking for max_fail_percentage 44071 1727204641.00653: done checking for max_fail_percentage 44071 1727204641.00655: checking to see if all hosts have failed and the running result is not ok 44071 1727204641.00655: done checking to see if all hosts have failed 44071 1727204641.00656: getting the remaining hosts for this loop 44071 1727204641.00658: done getting the remaining hosts for this loop 44071 1727204641.00663: getting the next task for host managed-node2 44071 1727204641.00675: done getting next task for host managed-node2 44071 1727204641.00678: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 44071 1727204641.00682: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204641.00687: getting variables 44071 1727204641.00688: in VariableManager get_vars() 44071 1727204641.00720: Calling all_inventory to load vars for managed-node2 44071 1727204641.00723: Calling groups_inventory to load vars for managed-node2 44071 1727204641.00726: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204641.00739: Calling all_plugins_play to load vars for managed-node2 44071 1727204641.00744: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204641.00747: Calling groups_plugins_play to load vars for managed-node2 44071 1727204641.01794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204641.03058: done with get_vars() 44071 1727204641.03090: done getting variables 44071 1727204641.03140: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204641.03248: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'statebr'] ************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 15:04:01 -0400 (0:00:00.382) 0:00:53.349 ***** 44071 1727204641.03278: entering _queue_task() for managed-node2/assert 44071 1727204641.03588: worker is 1 (out of 1 available) 44071 1727204641.03604: exiting _queue_task() for managed-node2/assert 44071 1727204641.03618: done queuing things up, now waiting for results queue to drain 44071 1727204641.03620: waiting for pending results... 
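The stat invocation and the assertion traced around this point come from the two included task files named in the logged task paths (tasks/get_interface_stat.yml:3 and tasks/assert_device_present.yml:5). A minimal sketch of what those tasks plausibly contain, reconstructed from the module_args and the evaluated conditional recorded in this log; the register name and the fully-qualified module names are assumptions, and the verbatim files in the fedora.linux_system_roles collection may differ:

- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    # Arguments mirror the module_args logged for this task; 'interface' resolves to 'statebr' here.
    path: "/sys/class/net/{{ interface }}"
    follow: false
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat   # assumed name; the log later reads a variable called 'interface_stat'

- name: Assert that the interface is present - '{{ interface }}'
  ansible.builtin.assert:
    # The log records this exact conditional evaluating to True.
    that:
      - interface_stat.stat.exists

Each task in this excerpt is additionally gated by the conditional ansible_distribution_major_version != '6', which the log shows evaluating to True before every task runs.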
44071 1727204641.03827: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'statebr' 44071 1727204641.03918: in run() - task 127b8e07-fff9-c964-7471-000000000e87 44071 1727204641.03931: variable 'ansible_search_path' from source: unknown 44071 1727204641.03935: variable 'ansible_search_path' from source: unknown 44071 1727204641.03976: calling self._execute() 44071 1727204641.04062: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204641.04068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204641.04082: variable 'omit' from source: magic vars 44071 1727204641.04411: variable 'ansible_distribution_major_version' from source: facts 44071 1727204641.04415: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204641.04473: variable 'omit' from source: magic vars 44071 1727204641.04477: variable 'omit' from source: magic vars 44071 1727204641.04548: variable 'interface' from source: play vars 44071 1727204641.04563: variable 'omit' from source: magic vars 44071 1727204641.04602: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204641.04636: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204641.04655: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204641.04673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204641.04684: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204641.04710: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204641.04715: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204641.04717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204641.04799: Set connection var ansible_connection to ssh 44071 1727204641.04805: Set connection var ansible_timeout to 10 44071 1727204641.04811: Set connection var ansible_pipelining to False 44071 1727204641.04816: Set connection var ansible_shell_type to sh 44071 1727204641.04822: Set connection var ansible_shell_executable to /bin/sh 44071 1727204641.04830: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204641.04855: variable 'ansible_shell_executable' from source: unknown 44071 1727204641.04859: variable 'ansible_connection' from source: unknown 44071 1727204641.04861: variable 'ansible_module_compression' from source: unknown 44071 1727204641.04864: variable 'ansible_shell_type' from source: unknown 44071 1727204641.04873: variable 'ansible_shell_executable' from source: unknown 44071 1727204641.04875: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204641.04878: variable 'ansible_pipelining' from source: unknown 44071 1727204641.04880: variable 'ansible_timeout' from source: unknown 44071 1727204641.04882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204641.04998: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 44071 1727204641.05008: variable 'omit' from source: magic vars 44071 1727204641.05014: starting attempt loop 44071 1727204641.05017: running the handler 44071 1727204641.05129: variable 'interface_stat' from source: set_fact 44071 1727204641.05147: Evaluated conditional (interface_stat.stat.exists): True 44071 1727204641.05150: handler run complete 44071 1727204641.05166: attempt loop complete, returning result 44071 1727204641.05169: _execute() done 44071 1727204641.05172: dumping result to json 44071 1727204641.05176: done dumping result, returning 44071 1727204641.05184: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'statebr' [127b8e07-fff9-c964-7471-000000000e87] 44071 1727204641.05188: sending task result for task 127b8e07-fff9-c964-7471-000000000e87 44071 1727204641.05293: done sending task result for task 127b8e07-fff9-c964-7471-000000000e87 44071 1727204641.05296: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 44071 1727204641.05350: no more pending results, returning what we have 44071 1727204641.05354: results queue empty 44071 1727204641.05355: checking for any_errors_fatal 44071 1727204641.05367: done checking for any_errors_fatal 44071 1727204641.05368: checking for max_fail_percentage 44071 1727204641.05369: done checking for max_fail_percentage 44071 1727204641.05370: checking to see if all hosts have failed and the running result is not ok 44071 1727204641.05371: done checking to see if all hosts have failed 44071 1727204641.05372: getting the remaining hosts for this loop 44071 1727204641.05374: done getting the remaining hosts for this loop 44071 1727204641.05379: getting the next task for host managed-node2 44071 1727204641.05390: done getting next task for host managed-node2 44071 1727204641.05394: ^ task is: TASK: Include the task 'get_profile_stat.yml' 44071 1727204641.05398: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204641.05402: getting variables 44071 1727204641.05404: in VariableManager get_vars() 44071 1727204641.05440: Calling all_inventory to load vars for managed-node2 44071 1727204641.05445: Calling groups_inventory to load vars for managed-node2 44071 1727204641.05448: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204641.05460: Calling all_plugins_play to load vars for managed-node2 44071 1727204641.05463: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204641.05473: Calling groups_plugins_play to load vars for managed-node2 44071 1727204641.06634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204641.07845: done with get_vars() 44071 1727204641.07880: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 15:04:01 -0400 (0:00:00.046) 0:00:53.396 ***** 44071 1727204641.07960: entering _queue_task() for managed-node2/include_tasks 44071 1727204641.08262: worker is 1 (out of 1 available) 44071 1727204641.08279: exiting _queue_task() for managed-node2/include_tasks 44071 1727204641.08294: done queuing things up, now waiting for results queue to drain 44071 1727204641.08296: waiting for pending results... 44071 1727204641.08511: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 44071 1727204641.08604: in run() - task 127b8e07-fff9-c964-7471-000000000e8b 44071 1727204641.08620: variable 'ansible_search_path' from source: unknown 44071 1727204641.08625: variable 'ansible_search_path' from source: unknown 44071 1727204641.08660: calling self._execute() 44071 1727204641.08750: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204641.08757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204641.08768: variable 'omit' from source: magic vars 44071 1727204641.09106: variable 'ansible_distribution_major_version' from source: facts 44071 1727204641.09117: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204641.09123: _execute() done 44071 1727204641.09127: dumping result to json 44071 1727204641.09130: done dumping result, returning 44071 1727204641.09136: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [127b8e07-fff9-c964-7471-000000000e8b] 44071 1727204641.09143: sending task result for task 127b8e07-fff9-c964-7471-000000000e8b 44071 1727204641.09251: done sending task result for task 127b8e07-fff9-c964-7471-000000000e8b 44071 1727204641.09254: WORKER PROCESS EXITING 44071 1727204641.09290: no more pending results, returning what we have 44071 1727204641.09295: in VariableManager get_vars() 44071 1727204641.09337: Calling all_inventory to load vars for managed-node2 44071 1727204641.09340: Calling groups_inventory to load vars for managed-node2 44071 1727204641.09343: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204641.09359: Calling all_plugins_play to load vars for managed-node2 44071 1727204641.09362: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204641.09375: Calling groups_plugins_play to load vars for managed-node2 44071 1727204641.10549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 44071 1727204641.11761: done with get_vars() 44071 1727204641.11791: variable 'ansible_search_path' from source: unknown 44071 1727204641.11792: variable 'ansible_search_path' from source: unknown 44071 1727204641.11801: variable 'item' from source: include params 44071 1727204641.11892: variable 'item' from source: include params 44071 1727204641.11921: we have included files to process 44071 1727204641.11921: generating all_blocks data 44071 1727204641.11923: done generating all_blocks data 44071 1727204641.11926: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 44071 1727204641.11927: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 44071 1727204641.11928: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 44071 1727204641.12615: done processing included file 44071 1727204641.12617: iterating over new_blocks loaded from include file 44071 1727204641.12618: in VariableManager get_vars() 44071 1727204641.12634: done with get_vars() 44071 1727204641.12636: filtering new block on tags 44071 1727204641.12686: done filtering new block on tags 44071 1727204641.12688: in VariableManager get_vars() 44071 1727204641.12699: done with get_vars() 44071 1727204641.12700: filtering new block on tags 44071 1727204641.12738: done filtering new block on tags 44071 1727204641.12740: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 44071 1727204641.12745: extending task lists for all hosts with included blocks 44071 1727204641.13004: done extending task lists 44071 1727204641.13005: done processing included files 44071 1727204641.13006: results queue empty 44071 1727204641.13006: checking for any_errors_fatal 44071 1727204641.13009: done checking for any_errors_fatal 44071 1727204641.13009: checking for max_fail_percentage 44071 1727204641.13010: done checking for max_fail_percentage 44071 1727204641.13011: checking to see if all hosts have failed and the running result is not ok 44071 1727204641.13011: done checking to see if all hosts have failed 44071 1727204641.13012: getting the remaining hosts for this loop 44071 1727204641.13013: done getting the remaining hosts for this loop 44071 1727204641.13014: getting the next task for host managed-node2 44071 1727204641.13018: done getting next task for host managed-node2 44071 1727204641.13019: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 44071 1727204641.13022: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204641.13023: getting variables 44071 1727204641.13024: in VariableManager get_vars() 44071 1727204641.13032: Calling all_inventory to load vars for managed-node2 44071 1727204641.13033: Calling groups_inventory to load vars for managed-node2 44071 1727204641.13035: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204641.13040: Calling all_plugins_play to load vars for managed-node2 44071 1727204641.13042: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204641.13045: Calling groups_plugins_play to load vars for managed-node2 44071 1727204641.13915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204641.15212: done with get_vars() 44071 1727204641.15239: done getting variables 44071 1727204641.15280: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 15:04:01 -0400 (0:00:00.073) 0:00:53.469 ***** 44071 1727204641.15307: entering _queue_task() for managed-node2/set_fact 44071 1727204641.15617: worker is 1 (out of 1 available) 44071 1727204641.15632: exiting _queue_task() for managed-node2/set_fact 44071 1727204641.15647: done queuing things up, now waiting for results queue to drain 44071 1727204641.15649: waiting for pending results... 
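The next two tasks traced below ("Initialize NM profile exist and ansible_managed comment flag" at get_profile_stat.yml:3 and "Stat profile file" at get_profile_stat.yml:9) open the included profile check. A sketch of how that file plausibly begins, based on the task names and the fact values returned in the log; the stat path and register name below are assumptions and are not confirmed by this excerpt:

- name: Initialize NM profile exist and ansible_managed comment flag
  ansible.builtin.set_fact:
    # These three flags and their initial values appear verbatim in the task result below.
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false

- name: Stat profile file
  ansible.builtin.stat:
    # Assumed path, built from the 'profile' play variable that the log reads for this task;
    # the actual path argument is not visible in this part of the log.
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
  register: profile_stat   # assumed register name

The log reads both 'profile' and 'interface' from play vars while templating this task, which suggests the profile under test shares the interface name 'statebr'.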
44071 1727204641.15865: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 44071 1727204641.15958: in run() - task 127b8e07-fff9-c964-7471-000000000f13 44071 1727204641.15973: variable 'ansible_search_path' from source: unknown 44071 1727204641.15978: variable 'ansible_search_path' from source: unknown 44071 1727204641.16014: calling self._execute() 44071 1727204641.16100: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204641.16107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204641.16116: variable 'omit' from source: magic vars 44071 1727204641.16442: variable 'ansible_distribution_major_version' from source: facts 44071 1727204641.16459: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204641.16463: variable 'omit' from source: magic vars 44071 1727204641.16507: variable 'omit' from source: magic vars 44071 1727204641.16536: variable 'omit' from source: magic vars 44071 1727204641.16578: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204641.16609: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204641.16628: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204641.16648: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204641.16659: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204641.16688: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204641.16692: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204641.16695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204641.16774: Set connection var ansible_connection to ssh 44071 1727204641.16780: Set connection var ansible_timeout to 10 44071 1727204641.16789: Set connection var ansible_pipelining to False 44071 1727204641.16791: Set connection var ansible_shell_type to sh 44071 1727204641.16797: Set connection var ansible_shell_executable to /bin/sh 44071 1727204641.16804: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204641.16824: variable 'ansible_shell_executable' from source: unknown 44071 1727204641.16828: variable 'ansible_connection' from source: unknown 44071 1727204641.16832: variable 'ansible_module_compression' from source: unknown 44071 1727204641.16834: variable 'ansible_shell_type' from source: unknown 44071 1727204641.16837: variable 'ansible_shell_executable' from source: unknown 44071 1727204641.16839: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204641.16842: variable 'ansible_pipelining' from source: unknown 44071 1727204641.16847: variable 'ansible_timeout' from source: unknown 44071 1727204641.16851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204641.16973: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204641.16983: variable 
'omit' from source: magic vars 44071 1727204641.16988: starting attempt loop 44071 1727204641.16992: running the handler 44071 1727204641.17006: handler run complete 44071 1727204641.17015: attempt loop complete, returning result 44071 1727204641.17018: _execute() done 44071 1727204641.17021: dumping result to json 44071 1727204641.17025: done dumping result, returning 44071 1727204641.17033: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [127b8e07-fff9-c964-7471-000000000f13] 44071 1727204641.17037: sending task result for task 127b8e07-fff9-c964-7471-000000000f13 44071 1727204641.17135: done sending task result for task 127b8e07-fff9-c964-7471-000000000f13 44071 1727204641.17138: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 44071 1727204641.17199: no more pending results, returning what we have 44071 1727204641.17203: results queue empty 44071 1727204641.17205: checking for any_errors_fatal 44071 1727204641.17207: done checking for any_errors_fatal 44071 1727204641.17207: checking for max_fail_percentage 44071 1727204641.17209: done checking for max_fail_percentage 44071 1727204641.17210: checking to see if all hosts have failed and the running result is not ok 44071 1727204641.17211: done checking to see if all hosts have failed 44071 1727204641.17211: getting the remaining hosts for this loop 44071 1727204641.17213: done getting the remaining hosts for this loop 44071 1727204641.17218: getting the next task for host managed-node2 44071 1727204641.17228: done getting next task for host managed-node2 44071 1727204641.17231: ^ task is: TASK: Stat profile file 44071 1727204641.17238: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204641.17242: getting variables 44071 1727204641.17243: in VariableManager get_vars() 44071 1727204641.17284: Calling all_inventory to load vars for managed-node2 44071 1727204641.17286: Calling groups_inventory to load vars for managed-node2 44071 1727204641.17290: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204641.17303: Calling all_plugins_play to load vars for managed-node2 44071 1727204641.17306: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204641.17309: Calling groups_plugins_play to load vars for managed-node2 44071 1727204641.18733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204641.20192: done with get_vars() 44071 1727204641.20225: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 15:04:01 -0400 (0:00:00.050) 0:00:53.519 ***** 44071 1727204641.20309: entering _queue_task() for managed-node2/stat 44071 1727204641.20618: worker is 1 (out of 1 available) 44071 1727204641.20636: exiting _queue_task() for managed-node2/stat 44071 1727204641.20650: done queuing things up, now waiting for results queue to drain 44071 1727204641.20652: waiting for pending results... 44071 1727204641.20862: running TaskExecutor() for managed-node2/TASK: Stat profile file 44071 1727204641.20968: in run() - task 127b8e07-fff9-c964-7471-000000000f14 44071 1727204641.20982: variable 'ansible_search_path' from source: unknown 44071 1727204641.20987: variable 'ansible_search_path' from source: unknown 44071 1727204641.21022: calling self._execute() 44071 1727204641.21116: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204641.21120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204641.21129: variable 'omit' from source: magic vars 44071 1727204641.21457: variable 'ansible_distribution_major_version' from source: facts 44071 1727204641.21470: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204641.21477: variable 'omit' from source: magic vars 44071 1727204641.21517: variable 'omit' from source: magic vars 44071 1727204641.21597: variable 'profile' from source: play vars 44071 1727204641.21601: variable 'interface' from source: play vars 44071 1727204641.21653: variable 'interface' from source: play vars 44071 1727204641.21677: variable 'omit' from source: magic vars 44071 1727204641.21971: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204641.21976: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204641.21980: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204641.21983: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204641.21986: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204641.21990: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204641.21993: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204641.21996: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204641.21999: Set connection var ansible_connection to ssh 44071 1727204641.22026: Set connection var ansible_timeout to 10 44071 1727204641.22039: Set connection var ansible_pipelining to False 44071 1727204641.22053: Set connection var ansible_shell_type to sh 44071 1727204641.22063: Set connection var ansible_shell_executable to /bin/sh 44071 1727204641.22089: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204641.22135: variable 'ansible_shell_executable' from source: unknown 44071 1727204641.22161: variable 'ansible_connection' from source: unknown 44071 1727204641.22171: variable 'ansible_module_compression' from source: unknown 44071 1727204641.22178: variable 'ansible_shell_type' from source: unknown 44071 1727204641.22186: variable 'ansible_shell_executable' from source: unknown 44071 1727204641.22193: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204641.22200: variable 'ansible_pipelining' from source: unknown 44071 1727204641.22299: variable 'ansible_timeout' from source: unknown 44071 1727204641.22380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204641.22555: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204641.22578: variable 'omit' from source: magic vars 44071 1727204641.22589: starting attempt loop 44071 1727204641.22596: running the handler 44071 1727204641.22617: _low_level_execute_command(): starting 44071 1727204641.22630: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204641.23405: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204641.23410: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204641.23486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204641.23527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204641.23539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204641.23563: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204641.23669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204641.25481: stdout chunk (state=3): >>>/root <<< 44071 1727204641.25588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204641.25699: stderr chunk 
(state=3): >>><<< 44071 1727204641.25708: stdout chunk (state=3): >>><<< 44071 1727204641.25748: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204641.25775: _low_level_execute_command(): starting 44071 1727204641.25788: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204641.2575743-47133-188615515416218 `" && echo ansible-tmp-1727204641.2575743-47133-188615515416218="` echo /root/.ansible/tmp/ansible-tmp-1727204641.2575743-47133-188615515416218 `" ) && sleep 0' 44071 1727204641.26501: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204641.26560: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44071 1727204641.26668: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204641.26700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204641.26721: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204641.26747: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204641.26883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204641.28917: stdout chunk (state=3): >>>ansible-tmp-1727204641.2575743-47133-188615515416218=/root/.ansible/tmp/ansible-tmp-1727204641.2575743-47133-188615515416218 <<< 44071 1727204641.29276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
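A side note on the connection behavior visible in this exchange: ansible_pipelining was set to False for this connection (see the "Set connection var" entries above), so each module run costs a separate mkdir, sftp put, chmod, python exec and rm round-trip over the multiplexed SSH master. Where the managed host's sudoers configuration allows it, pipelining can be enabled per host or group; a hypothetical group_vars entry, not used by this run:

    # Hypothetical group_vars/all.yml setting; ansible_pipelining is the same
    # connection variable logged above as "Set connection var ansible_pipelining to False".
    ansible_pipelining: true

With pipelining on, the module payload is streamed to the remote interpreter's stdin and most of the temporary-directory traffic seen here disappears.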
44071 1727204641.29281: stdout chunk (state=3): >>><<< 44071 1727204641.29283: stderr chunk (state=3): >>><<< 44071 1727204641.29286: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204641.2575743-47133-188615515416218=/root/.ansible/tmp/ansible-tmp-1727204641.2575743-47133-188615515416218 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204641.29288: variable 'ansible_module_compression' from source: unknown 44071 1727204641.29291: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 44071 1727204641.29338: variable 'ansible_facts' from source: unknown 44071 1727204641.29447: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204641.2575743-47133-188615515416218/AnsiballZ_stat.py 44071 1727204641.29673: Sending initial data 44071 1727204641.29682: Sent initial data (153 bytes) 44071 1727204641.30449: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204641.30487: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204641.30595: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204641.30647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204641.30720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204641.32364: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 44071 1727204641.32378: stderr chunk 
(state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 44071 1727204641.32425: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 44071 1727204641.32429: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 44071 1727204641.32433: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204641.32506: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204641.32589: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmplmkcvzef /root/.ansible/tmp/ansible-tmp-1727204641.2575743-47133-188615515416218/AnsiballZ_stat.py <<< 44071 1727204641.32592: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204641.2575743-47133-188615515416218/AnsiballZ_stat.py" <<< 44071 1727204641.32770: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmplmkcvzef" to remote "/root/.ansible/tmp/ansible-tmp-1727204641.2575743-47133-188615515416218/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204641.2575743-47133-188615515416218/AnsiballZ_stat.py" <<< 44071 1727204641.34285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204641.34359: stderr chunk (state=3): >>><<< 44071 1727204641.34362: stdout chunk (state=3): >>><<< 44071 1727204641.34366: done transferring module to remote 44071 1727204641.34369: _low_level_execute_command(): starting 44071 1727204641.34496: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204641.2575743-47133-188615515416218/ /root/.ansible/tmp/ansible-tmp-1727204641.2575743-47133-188615515416218/AnsiballZ_stat.py && sleep 0' 44071 1727204641.35517: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204641.35602: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204641.35671: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' <<< 44071 1727204641.35704: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204641.35730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204641.35842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204641.37896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204641.37997: stderr chunk (state=3): >>><<< 44071 1727204641.38023: stdout chunk (state=3): >>><<< 44071 1727204641.38278: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204641.38282: _low_level_execute_command(): starting 44071 1727204641.38285: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204641.2575743-47133-188615515416218/AnsiballZ_stat.py && sleep 0' 44071 1727204641.39381: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204641.39388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204641.39414: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204641.39418: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204641.39592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204641.39637: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204641.56485: stdout chunk 
(state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 44071 1727204641.58081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204641.58085: stdout chunk (state=3): >>><<< 44071 1727204641.58088: stderr chunk (state=3): >>><<< 44071 1727204641.58091: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
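The module_args echoed in this result spell out what the "Stat profile file" task (get_profile_stat.yml:9) asked for, so the task can be read back from the log. A hedged sketch, assuming the literal "statebr" in the path is templated from the profile play var referenced earlier:

    # Sketch inferred from the logged module_args; the {{ profile }} templating
    # and the register name are assumptions based on the play vars read above and
    # the profile_stat variable consulted by the next task.
    - name: Stat profile file
      ansible.builtin.stat:
        path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat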
44071 1727204641.58094: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204641.2575743-47133-188615515416218/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204641.58107: _low_level_execute_command(): starting 44071 1727204641.58117: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204641.2575743-47133-188615515416218/ > /dev/null 2>&1 && sleep 0' 44071 1727204641.58826: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204641.58849: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204641.58947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204641.58983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204641.59010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204641.59034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204641.59156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204641.61209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204641.61232: stdout chunk (state=3): >>><<< 44071 1727204641.61248: stderr chunk (state=3): >>><<< 44071 1727204641.61473: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204641.61477: handler run complete 44071 1727204641.61479: attempt loop complete, returning result 44071 1727204641.61482: _execute() done 44071 1727204641.61484: dumping result to json 44071 1727204641.61486: done dumping result, returning 44071 1727204641.61488: done running TaskExecutor() for managed-node2/TASK: Stat profile file [127b8e07-fff9-c964-7471-000000000f14] 44071 1727204641.61490: sending task result for task 127b8e07-fff9-c964-7471-000000000f14 44071 1727204641.61575: done sending task result for task 127b8e07-fff9-c964-7471-000000000f14 44071 1727204641.61579: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 44071 1727204641.61652: no more pending results, returning what we have 44071 1727204641.61655: results queue empty 44071 1727204641.61656: checking for any_errors_fatal 44071 1727204641.61666: done checking for any_errors_fatal 44071 1727204641.61667: checking for max_fail_percentage 44071 1727204641.61669: done checking for max_fail_percentage 44071 1727204641.61670: checking to see if all hosts have failed and the running result is not ok 44071 1727204641.61671: done checking to see if all hosts have failed 44071 1727204641.61672: getting the remaining hosts for this loop 44071 1727204641.61674: done getting the remaining hosts for this loop 44071 1727204641.61680: getting the next task for host managed-node2 44071 1727204641.61691: done getting next task for host managed-node2 44071 1727204641.61694: ^ task is: TASK: Set NM profile exist flag based on the profile files 44071 1727204641.61701: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204641.61706: getting variables 44071 1727204641.61708: in VariableManager get_vars() 44071 1727204641.61750: Calling all_inventory to load vars for managed-node2 44071 1727204641.61753: Calling groups_inventory to load vars for managed-node2 44071 1727204641.61757: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204641.61886: Calling all_plugins_play to load vars for managed-node2 44071 1727204641.61891: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204641.61895: Calling groups_plugins_play to load vars for managed-node2 44071 1727204641.64307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204641.66744: done with get_vars() 44071 1727204641.66796: done getting variables 44071 1727204641.66870: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:04:01 -0400 (0:00:00.466) 0:00:53.985 ***** 44071 1727204641.66916: entering _queue_task() for managed-node2/set_fact 44071 1727204641.67440: worker is 1 (out of 1 available) 44071 1727204641.67456: exiting _queue_task() for managed-node2/set_fact 44071 1727204641.67550: done queuing things up, now waiting for results queue to drain 44071 1727204641.67552: waiting for pending results... 
44071 1727204641.67892: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 44071 1727204641.67917: in run() - task 127b8e07-fff9-c964-7471-000000000f15 44071 1727204641.67944: variable 'ansible_search_path' from source: unknown 44071 1727204641.67955: variable 'ansible_search_path' from source: unknown 44071 1727204641.68016: calling self._execute() 44071 1727204641.68207: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204641.68211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204641.68216: variable 'omit' from source: magic vars 44071 1727204641.68669: variable 'ansible_distribution_major_version' from source: facts 44071 1727204641.68691: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204641.68845: variable 'profile_stat' from source: set_fact 44071 1727204641.68877: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204641.68886: when evaluation is False, skipping this task 44071 1727204641.68893: _execute() done 44071 1727204641.68901: dumping result to json 44071 1727204641.68972: done dumping result, returning 44071 1727204641.68980: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [127b8e07-fff9-c964-7471-000000000f15] 44071 1727204641.68983: sending task result for task 127b8e07-fff9-c964-7471-000000000f15 44071 1727204641.69064: done sending task result for task 127b8e07-fff9-c964-7471-000000000f15 44071 1727204641.69172: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204641.69241: no more pending results, returning what we have 44071 1727204641.69248: results queue empty 44071 1727204641.69249: checking for any_errors_fatal 44071 1727204641.69263: done checking for any_errors_fatal 44071 1727204641.69264: checking for max_fail_percentage 44071 1727204641.69268: done checking for max_fail_percentage 44071 1727204641.69269: checking to see if all hosts have failed and the running result is not ok 44071 1727204641.69270: done checking to see if all hosts have failed 44071 1727204641.69271: getting the remaining hosts for this loop 44071 1727204641.69272: done getting the remaining hosts for this loop 44071 1727204641.69278: getting the next task for host managed-node2 44071 1727204641.69287: done getting next task for host managed-node2 44071 1727204641.69290: ^ task is: TASK: Get NM profile info 44071 1727204641.69475: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204641.69481: getting variables 44071 1727204641.69483: in VariableManager get_vars() 44071 1727204641.69516: Calling all_inventory to load vars for managed-node2 44071 1727204641.69519: Calling groups_inventory to load vars for managed-node2 44071 1727204641.69523: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204641.69535: Calling all_plugins_play to load vars for managed-node2 44071 1727204641.69537: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204641.69541: Calling groups_plugins_play to load vars for managed-node2 44071 1727204641.71583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204641.74254: done with get_vars() 44071 1727204641.74290: done getting variables 44071 1727204641.74374: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:04:01 -0400 (0:00:00.074) 0:00:54.060 ***** 44071 1727204641.74416: entering _queue_task() for managed-node2/shell 44071 1727204641.75023: worker is 1 (out of 1 available) 44071 1727204641.75038: exiting _queue_task() for managed-node2/shell 44071 1727204641.75053: done queuing things up, now waiting for results queue to drain 44071 1727204641.75055: waiting for pending results... 
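Before the executor output for "Get NM profile info" starts, note why the previous set_fact was skipped: its when: clause tests the registered stat result, and the initscripts-style ifcfg file does not exist on this host. A sketch consistent with the skip output above; the condition is taken verbatim from the logged false_condition, while the fact being set is an assumption inferred from the task name and the flags initialized earlier:

    # The when: condition is verbatim from the log; the fact name/value is assumed.
    - name: Set NM profile exist flag based on the profile files
      ansible.builtin.set_fact:
        lsr_net_profile_exists: true
      when: profile_stat.stat.exists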
44071 1727204641.75359: running TaskExecutor() for managed-node2/TASK: Get NM profile info 44071 1727204641.75440: in run() - task 127b8e07-fff9-c964-7471-000000000f16 44071 1727204641.75481: variable 'ansible_search_path' from source: unknown 44071 1727204641.75494: variable 'ansible_search_path' from source: unknown 44071 1727204641.75570: calling self._execute() 44071 1727204641.75673: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204641.75690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204641.75710: variable 'omit' from source: magic vars 44071 1727204641.76223: variable 'ansible_distribution_major_version' from source: facts 44071 1727204641.76227: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204641.76240: variable 'omit' from source: magic vars 44071 1727204641.76330: variable 'omit' from source: magic vars 44071 1727204641.76461: variable 'profile' from source: play vars 44071 1727204641.76475: variable 'interface' from source: play vars 44071 1727204641.76561: variable 'interface' from source: play vars 44071 1727204641.76590: variable 'omit' from source: magic vars 44071 1727204641.76645: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204641.76764: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204641.76768: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204641.76772: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204641.76774: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204641.76871: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204641.76875: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204641.76877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204641.76969: Set connection var ansible_connection to ssh 44071 1727204641.76984: Set connection var ansible_timeout to 10 44071 1727204641.77005: Set connection var ansible_pipelining to False 44071 1727204641.77017: Set connection var ansible_shell_type to sh 44071 1727204641.77028: Set connection var ansible_shell_executable to /bin/sh 44071 1727204641.77041: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204641.77077: variable 'ansible_shell_executable' from source: unknown 44071 1727204641.77111: variable 'ansible_connection' from source: unknown 44071 1727204641.77114: variable 'ansible_module_compression' from source: unknown 44071 1727204641.77116: variable 'ansible_shell_type' from source: unknown 44071 1727204641.77119: variable 'ansible_shell_executable' from source: unknown 44071 1727204641.77121: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204641.77219: variable 'ansible_pipelining' from source: unknown 44071 1727204641.77222: variable 'ansible_timeout' from source: unknown 44071 1727204641.77226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204641.77330: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204641.77357: variable 'omit' from source: magic vars 44071 1727204641.77369: starting attempt loop 44071 1727204641.77377: running the handler 44071 1727204641.77392: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204641.77423: _low_level_execute_command(): starting 44071 1727204641.77452: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204641.78396: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204641.78493: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204641.78546: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204641.78627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204641.80404: stdout chunk (state=3): >>>/root <<< 44071 1727204641.80622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204641.80627: stdout chunk (state=3): >>><<< 44071 1727204641.80629: stderr chunk (state=3): >>><<< 44071 1727204641.80659: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204641.80771: _low_level_execute_command(): starting 44071 1727204641.80776: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204641.8066852-47154-274401982810638 `" && echo ansible-tmp-1727204641.8066852-47154-274401982810638="` echo /root/.ansible/tmp/ansible-tmp-1727204641.8066852-47154-274401982810638 `" ) && sleep 0' 44071 1727204641.81451: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204641.81470: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204641.81571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204641.81614: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204641.81636: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204641.81659: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204641.81798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204641.83806: stdout chunk (state=3): >>>ansible-tmp-1727204641.8066852-47154-274401982810638=/root/.ansible/tmp/ansible-tmp-1727204641.8066852-47154-274401982810638 <<< 44071 1727204641.84026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204641.84031: stdout chunk (state=3): >>><<< 44071 1727204641.84033: stderr chunk (state=3): >>><<< 44071 1727204641.84151: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204641.8066852-47154-274401982810638=/root/.ansible/tmp/ansible-tmp-1727204641.8066852-47154-274401982810638 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204641.84155: variable 'ansible_module_compression' from source: unknown 44071 1727204641.84177: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44071 1727204641.84219: variable 'ansible_facts' from source: unknown 44071 1727204641.84317: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204641.8066852-47154-274401982810638/AnsiballZ_command.py 44071 1727204641.84506: Sending initial data 44071 1727204641.84517: Sent initial data (156 bytes) 44071 1727204641.85293: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204641.85313: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204641.85421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204641.85449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204641.85469: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204641.85494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204641.85610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204641.87232: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204641.87329: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204641.87456: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpa462ptsl /root/.ansible/tmp/ansible-tmp-1727204641.8066852-47154-274401982810638/AnsiballZ_command.py <<< 44071 1727204641.87460: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204641.8066852-47154-274401982810638/AnsiballZ_command.py" <<< 44071 1727204641.87527: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpa462ptsl" to remote "/root/.ansible/tmp/ansible-tmp-1727204641.8066852-47154-274401982810638/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204641.8066852-47154-274401982810638/AnsiballZ_command.py" <<< 44071 1727204641.88418: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204641.88469: stderr chunk (state=3): >>><<< 44071 1727204641.88473: stdout chunk (state=3): >>><<< 44071 1727204641.88494: done transferring module to remote 44071 1727204641.88505: _low_level_execute_command(): starting 44071 1727204641.88511: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204641.8066852-47154-274401982810638/ /root/.ansible/tmp/ansible-tmp-1727204641.8066852-47154-274401982810638/AnsiballZ_command.py && sleep 0' 44071 1727204641.89026: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204641.89032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204641.89039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204641.89089: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204641.89092: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204641.89095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204641.89177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204641.91244: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204641.91368: stderr chunk (state=3): >>><<< 44071 1727204641.91372: stdout chunk (state=3): >>><<< 44071 1727204641.91375: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204641.91378: _low_level_execute_command(): starting 44071 1727204641.91402: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204641.8066852-47154-274401982810638/AnsiballZ_command.py && sleep 0' 44071 1727204641.92146: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204641.92151: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204641.92211: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204641.92215: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204641.92259: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204641.92347: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204642.11012: stdout chunk (state=3): >>> {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-24 15:04:02.090227", "end": "2024-09-24 15:04:02.108582", "delta": "0:00:00.018355", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44071 1727204642.12622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204642.12687: stderr chunk (state=3): >>><<< 44071 1727204642.12691: stdout chunk (state=3): >>><<< 44071 1727204642.12710: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-24 15:04:02.090227", "end": "2024-09-24 15:04:02.108582", "delta": "0:00:00.018355", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
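The invocation dump above shows the "Get NM profile info" step running nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc through ansible.legacy.command with _uses_shell: true, and the next task's conditional (nm_profile_exists.rc == 0) implies the output is registered as nm_profile_exists. A minimal sketch of what that task in get_profile_stat.yml likely looks like; the register name, ignore_errors, and changed_when handling are inferences from this trace, not verbatim source:

- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show |grep {{ profile }} | grep /etc   # rendered here with profile=statebr
  register: nm_profile_exists   # inferred from the nm_profile_exists.rc == 0 conditional evaluated in the next task
  ignore_errors: true           # assumption: grep exits non-zero when no matching profile exists
  changed_when: false           # assumption: the task result below prints "changed": false although the command module itself reported a change
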
44071 1727204642.12744: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204641.8066852-47154-274401982810638/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204642.12753: _low_level_execute_command(): starting 44071 1727204642.12758: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204641.8066852-47154-274401982810638/ > /dev/null 2>&1 && sleep 0' 44071 1727204642.13237: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204642.13247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204642.13280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204642.13283: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204642.13286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204642.13288: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204642.13345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204642.13349: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204642.13351: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204642.13430: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204642.15333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204642.15395: stderr chunk (state=3): >>><<< 44071 1727204642.15399: stdout chunk (state=3): >>><<< 44071 1727204642.15413: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204642.15420: handler run complete 44071 1727204642.15440: Evaluated conditional (False): False 44071 1727204642.15456: attempt loop complete, returning result 44071 1727204642.15459: _execute() done 44071 1727204642.15462: dumping result to json 44071 1727204642.15468: done dumping result, returning 44071 1727204642.15477: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [127b8e07-fff9-c964-7471-000000000f16] 44071 1727204642.15481: sending task result for task 127b8e07-fff9-c964-7471-000000000f16 44071 1727204642.15587: done sending task result for task 127b8e07-fff9-c964-7471-000000000f16 44071 1727204642.15591: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.018355", "end": "2024-09-24 15:04:02.108582", "rc": 0, "start": "2024-09-24 15:04:02.090227" } STDOUT: statebr /etc/NetworkManager/system-connections/statebr.nmconnection 44071 1727204642.15668: no more pending results, returning what we have 44071 1727204642.15671: results queue empty 44071 1727204642.15672: checking for any_errors_fatal 44071 1727204642.15678: done checking for any_errors_fatal 44071 1727204642.15678: checking for max_fail_percentage 44071 1727204642.15680: done checking for max_fail_percentage 44071 1727204642.15681: checking to see if all hosts have failed and the running result is not ok 44071 1727204642.15682: done checking to see if all hosts have failed 44071 1727204642.15682: getting the remaining hosts for this loop 44071 1727204642.15684: done getting the remaining hosts for this loop 44071 1727204642.15689: getting the next task for host managed-node2 44071 1727204642.15697: done getting next task for host managed-node2 44071 1727204642.15700: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 44071 1727204642.15705: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204642.15708: getting variables 44071 1727204642.15710: in VariableManager get_vars() 44071 1727204642.15747: Calling all_inventory to load vars for managed-node2 44071 1727204642.15750: Calling groups_inventory to load vars for managed-node2 44071 1727204642.15753: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204642.15772: Calling all_plugins_play to load vars for managed-node2 44071 1727204642.15775: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204642.15779: Calling groups_plugins_play to load vars for managed-node2 44071 1727204642.16822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204642.18028: done with get_vars() 44071 1727204642.18061: done getting variables 44071 1727204642.18113: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 15:04:02 -0400 (0:00:00.437) 0:00:54.497 ***** 44071 1727204642.18139: entering _queue_task() for managed-node2/set_fact 44071 1727204642.18439: worker is 1 (out of 1 available) 44071 1727204642.18456: exiting _queue_task() for managed-node2/set_fact 44071 1727204642.18473: done queuing things up, now waiting for results queue to drain 44071 1727204642.18475: waiting for pending results... 
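The set_fact task just queued ("Set NM profile exist flag and ansible_managed flag true based on the nmcli output", get_profile_stat.yml:35) evaluates nm_profile_exists.rc == 0 and, as its result below shows, sets three flags. A minimal sketch consistent with that trace (the exact wording in the file may differ):

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0
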
44071 1727204642.18684: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 44071 1727204642.18786: in run() - task 127b8e07-fff9-c964-7471-000000000f17 44071 1727204642.18800: variable 'ansible_search_path' from source: unknown 44071 1727204642.18803: variable 'ansible_search_path' from source: unknown 44071 1727204642.18840: calling self._execute() 44071 1727204642.18924: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204642.18929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204642.18939: variable 'omit' from source: magic vars 44071 1727204642.19252: variable 'ansible_distribution_major_version' from source: facts 44071 1727204642.19264: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204642.19367: variable 'nm_profile_exists' from source: set_fact 44071 1727204642.19380: Evaluated conditional (nm_profile_exists.rc == 0): True 44071 1727204642.19383: variable 'omit' from source: magic vars 44071 1727204642.19432: variable 'omit' from source: magic vars 44071 1727204642.19469: variable 'omit' from source: magic vars 44071 1727204642.19505: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204642.19535: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204642.19556: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204642.19575: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204642.19589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204642.19615: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204642.19619: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204642.19622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204642.19705: Set connection var ansible_connection to ssh 44071 1727204642.19711: Set connection var ansible_timeout to 10 44071 1727204642.19717: Set connection var ansible_pipelining to False 44071 1727204642.19722: Set connection var ansible_shell_type to sh 44071 1727204642.19728: Set connection var ansible_shell_executable to /bin/sh 44071 1727204642.19735: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204642.19757: variable 'ansible_shell_executable' from source: unknown 44071 1727204642.19760: variable 'ansible_connection' from source: unknown 44071 1727204642.19762: variable 'ansible_module_compression' from source: unknown 44071 1727204642.19767: variable 'ansible_shell_type' from source: unknown 44071 1727204642.19769: variable 'ansible_shell_executable' from source: unknown 44071 1727204642.19772: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204642.19777: variable 'ansible_pipelining' from source: unknown 44071 1727204642.19779: variable 'ansible_timeout' from source: unknown 44071 1727204642.19784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204642.19904: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204642.19909: variable 'omit' from source: magic vars 44071 1727204642.19915: starting attempt loop 44071 1727204642.19918: running the handler 44071 1727204642.19931: handler run complete 44071 1727204642.19939: attempt loop complete, returning result 44071 1727204642.19942: _execute() done 44071 1727204642.19948: dumping result to json 44071 1727204642.19950: done dumping result, returning 44071 1727204642.19959: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [127b8e07-fff9-c964-7471-000000000f17] 44071 1727204642.19962: sending task result for task 127b8e07-fff9-c964-7471-000000000f17 44071 1727204642.20059: done sending task result for task 127b8e07-fff9-c964-7471-000000000f17 44071 1727204642.20063: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 44071 1727204642.20126: no more pending results, returning what we have 44071 1727204642.20129: results queue empty 44071 1727204642.20130: checking for any_errors_fatal 44071 1727204642.20139: done checking for any_errors_fatal 44071 1727204642.20140: checking for max_fail_percentage 44071 1727204642.20142: done checking for max_fail_percentage 44071 1727204642.20143: checking to see if all hosts have failed and the running result is not ok 44071 1727204642.20143: done checking to see if all hosts have failed 44071 1727204642.20144: getting the remaining hosts for this loop 44071 1727204642.20146: done getting the remaining hosts for this loop 44071 1727204642.20151: getting the next task for host managed-node2 44071 1727204642.20162: done getting next task for host managed-node2 44071 1727204642.20164: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 44071 1727204642.20175: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204642.20180: getting variables 44071 1727204642.20182: in VariableManager get_vars() 44071 1727204642.20217: Calling all_inventory to load vars for managed-node2 44071 1727204642.20220: Calling groups_inventory to load vars for managed-node2 44071 1727204642.20224: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204642.20235: Calling all_plugins_play to load vars for managed-node2 44071 1727204642.20238: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204642.20241: Calling groups_plugins_play to load vars for managed-node2 44071 1727204642.21451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204642.22658: done with get_vars() 44071 1727204642.22692: done getting variables 44071 1727204642.22743: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204642.22847: variable 'profile' from source: play vars 44071 1727204642.22851: variable 'interface' from source: play vars 44071 1727204642.22896: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 15:04:02 -0400 (0:00:00.047) 0:00:54.545 ***** 44071 1727204642.22921: entering _queue_task() for managed-node2/command 44071 1727204642.23216: worker is 1 (out of 1 available) 44071 1727204642.23233: exiting _queue_task() for managed-node2/command 44071 1727204642.23248: done queuing things up, now waiting for results queue to drain 44071 1727204642.23250: waiting for pending results... 
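The next two tasks in get_profile_stat.yml, "Get the ansible_managed comment in ifcfg-statebr" (line 49, traced below) and its companion "Verify the ansible_managed comment in ifcfg-statebr" (line 56), are both skipped on this NetworkManager-managed host because profile_stat.stat.exists is false. A hedged sketch of their shape; only the when: guard is visible in the trace, so the command body, register name, and the verify fact are illustrative guesses:

- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep "^# Ansible managed" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # hypothetical command body
  register: ifcfg_ansible_managed   # hypothetical register name
  when: profile_stat.stat.exists    # confirmed by the false_condition in the skip results below

- name: Verify the ansible_managed comment in ifcfg-{{ profile }}
  set_fact:
    lsr_net_profile_ansible_managed: true   # hypothetical body
  when: profile_stat.stat.exists
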
44071 1727204642.23455: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-statebr 44071 1727204642.23543: in run() - task 127b8e07-fff9-c964-7471-000000000f19 44071 1727204642.23558: variable 'ansible_search_path' from source: unknown 44071 1727204642.23562: variable 'ansible_search_path' from source: unknown 44071 1727204642.23600: calling self._execute() 44071 1727204642.23687: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204642.23693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204642.23707: variable 'omit' from source: magic vars 44071 1727204642.24015: variable 'ansible_distribution_major_version' from source: facts 44071 1727204642.24029: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204642.24120: variable 'profile_stat' from source: set_fact 44071 1727204642.24130: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204642.24133: when evaluation is False, skipping this task 44071 1727204642.24136: _execute() done 44071 1727204642.24141: dumping result to json 44071 1727204642.24146: done dumping result, returning 44071 1727204642.24149: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-statebr [127b8e07-fff9-c964-7471-000000000f19] 44071 1727204642.24157: sending task result for task 127b8e07-fff9-c964-7471-000000000f19 44071 1727204642.24252: done sending task result for task 127b8e07-fff9-c964-7471-000000000f19 44071 1727204642.24255: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204642.24318: no more pending results, returning what we have 44071 1727204642.24322: results queue empty 44071 1727204642.24323: checking for any_errors_fatal 44071 1727204642.24333: done checking for any_errors_fatal 44071 1727204642.24334: checking for max_fail_percentage 44071 1727204642.24335: done checking for max_fail_percentage 44071 1727204642.24336: checking to see if all hosts have failed and the running result is not ok 44071 1727204642.24337: done checking to see if all hosts have failed 44071 1727204642.24338: getting the remaining hosts for this loop 44071 1727204642.24340: done getting the remaining hosts for this loop 44071 1727204642.24347: getting the next task for host managed-node2 44071 1727204642.24355: done getting next task for host managed-node2 44071 1727204642.24358: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 44071 1727204642.24363: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204642.24374: getting variables 44071 1727204642.24376: in VariableManager get_vars() 44071 1727204642.24413: Calling all_inventory to load vars for managed-node2 44071 1727204642.24416: Calling groups_inventory to load vars for managed-node2 44071 1727204642.24420: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204642.24433: Calling all_plugins_play to load vars for managed-node2 44071 1727204642.24436: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204642.24439: Calling groups_plugins_play to load vars for managed-node2 44071 1727204642.25497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204642.26852: done with get_vars() 44071 1727204642.26878: done getting variables 44071 1727204642.26927: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204642.27021: variable 'profile' from source: play vars 44071 1727204642.27025: variable 'interface' from source: play vars 44071 1727204642.27071: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 15:04:02 -0400 (0:00:00.041) 0:00:54.587 ***** 44071 1727204642.27099: entering _queue_task() for managed-node2/set_fact 44071 1727204642.27396: worker is 1 (out of 1 available) 44071 1727204642.27412: exiting _queue_task() for managed-node2/set_fact 44071 1727204642.27424: done queuing things up, now waiting for results queue to drain 44071 1727204642.27426: waiting for pending results... 
44071 1727204642.27631: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-statebr 44071 1727204642.27735: in run() - task 127b8e07-fff9-c964-7471-000000000f1a 44071 1727204642.27748: variable 'ansible_search_path' from source: unknown 44071 1727204642.27752: variable 'ansible_search_path' from source: unknown 44071 1727204642.27790: calling self._execute() 44071 1727204642.27869: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204642.27873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204642.27889: variable 'omit' from source: magic vars 44071 1727204642.28187: variable 'ansible_distribution_major_version' from source: facts 44071 1727204642.28198: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204642.28294: variable 'profile_stat' from source: set_fact 44071 1727204642.28304: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204642.28307: when evaluation is False, skipping this task 44071 1727204642.28310: _execute() done 44071 1727204642.28314: dumping result to json 44071 1727204642.28317: done dumping result, returning 44071 1727204642.28328: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [127b8e07-fff9-c964-7471-000000000f1a] 44071 1727204642.28331: sending task result for task 127b8e07-fff9-c964-7471-000000000f1a 44071 1727204642.28437: done sending task result for task 127b8e07-fff9-c964-7471-000000000f1a 44071 1727204642.28441: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204642.28496: no more pending results, returning what we have 44071 1727204642.28500: results queue empty 44071 1727204642.28501: checking for any_errors_fatal 44071 1727204642.28508: done checking for any_errors_fatal 44071 1727204642.28509: checking for max_fail_percentage 44071 1727204642.28511: done checking for max_fail_percentage 44071 1727204642.28512: checking to see if all hosts have failed and the running result is not ok 44071 1727204642.28512: done checking to see if all hosts have failed 44071 1727204642.28513: getting the remaining hosts for this loop 44071 1727204642.28515: done getting the remaining hosts for this loop 44071 1727204642.28519: getting the next task for host managed-node2 44071 1727204642.28529: done getting next task for host managed-node2 44071 1727204642.28532: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 44071 1727204642.28539: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204642.28545: getting variables 44071 1727204642.28547: in VariableManager get_vars() 44071 1727204642.28585: Calling all_inventory to load vars for managed-node2 44071 1727204642.28588: Calling groups_inventory to load vars for managed-node2 44071 1727204642.28592: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204642.28605: Calling all_plugins_play to load vars for managed-node2 44071 1727204642.28608: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204642.28610: Calling groups_plugins_play to load vars for managed-node2 44071 1727204642.29675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204642.30899: done with get_vars() 44071 1727204642.30930: done getting variables 44071 1727204642.30986: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204642.31083: variable 'profile' from source: play vars 44071 1727204642.31086: variable 'interface' from source: play vars 44071 1727204642.31128: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 15:04:02 -0400 (0:00:00.040) 0:00:54.628 ***** 44071 1727204642.31157: entering _queue_task() for managed-node2/command 44071 1727204642.31458: worker is 1 (out of 1 available) 44071 1727204642.31475: exiting _queue_task() for managed-node2/command 44071 1727204642.31490: done queuing things up, now waiting for results queue to drain 44071 1727204642.31492: waiting for pending results... 
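The same profile_stat.stat.exists guard skips the fingerprint pair that follows ("Get the fingerprint comment in ifcfg-statebr" at line 62 and "Verify the fingerprint comment in ifcfg-statebr" at line 69). The repeated guard implies an earlier stat task in get_profile_stat.yml that registers profile_stat against the initscripts ifcfg file; a minimal sketch under that assumption (the path is a guess, only the registered name profile_stat is visible in the trace):

- name: Get the ifcfg file stat
  stat:
    path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # hypothetical path
  register: profile_stat
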
44071 1727204642.31694: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-statebr 44071 1727204642.31797: in run() - task 127b8e07-fff9-c964-7471-000000000f1b 44071 1727204642.31810: variable 'ansible_search_path' from source: unknown 44071 1727204642.31814: variable 'ansible_search_path' from source: unknown 44071 1727204642.31849: calling self._execute() 44071 1727204642.31928: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204642.31934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204642.31950: variable 'omit' from source: magic vars 44071 1727204642.32251: variable 'ansible_distribution_major_version' from source: facts 44071 1727204642.32264: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204642.32359: variable 'profile_stat' from source: set_fact 44071 1727204642.32370: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204642.32373: when evaluation is False, skipping this task 44071 1727204642.32378: _execute() done 44071 1727204642.32381: dumping result to json 44071 1727204642.32384: done dumping result, returning 44071 1727204642.32394: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-statebr [127b8e07-fff9-c964-7471-000000000f1b] 44071 1727204642.32397: sending task result for task 127b8e07-fff9-c964-7471-000000000f1b skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204642.32558: no more pending results, returning what we have 44071 1727204642.32563: results queue empty 44071 1727204642.32564: checking for any_errors_fatal 44071 1727204642.32575: done checking for any_errors_fatal 44071 1727204642.32575: checking for max_fail_percentage 44071 1727204642.32577: done checking for max_fail_percentage 44071 1727204642.32578: checking to see if all hosts have failed and the running result is not ok 44071 1727204642.32579: done checking to see if all hosts have failed 44071 1727204642.32579: getting the remaining hosts for this loop 44071 1727204642.32581: done getting the remaining hosts for this loop 44071 1727204642.32586: getting the next task for host managed-node2 44071 1727204642.32594: done getting next task for host managed-node2 44071 1727204642.32597: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 44071 1727204642.32603: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204642.32607: getting variables 44071 1727204642.32609: in VariableManager get_vars() 44071 1727204642.32647: Calling all_inventory to load vars for managed-node2 44071 1727204642.32650: Calling groups_inventory to load vars for managed-node2 44071 1727204642.32653: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204642.32676: Calling all_plugins_play to load vars for managed-node2 44071 1727204642.32680: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204642.32686: done sending task result for task 127b8e07-fff9-c964-7471-000000000f1b 44071 1727204642.32689: WORKER PROCESS EXITING 44071 1727204642.32693: Calling groups_plugins_play to load vars for managed-node2 44071 1727204642.33907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204642.35105: done with get_vars() 44071 1727204642.35133: done getting variables 44071 1727204642.35189: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204642.35282: variable 'profile' from source: play vars 44071 1727204642.35286: variable 'interface' from source: play vars 44071 1727204642.35327: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 15:04:02 -0400 (0:00:00.041) 0:00:54.670 ***** 44071 1727204642.35355: entering _queue_task() for managed-node2/set_fact 44071 1727204642.35657: worker is 1 (out of 1 available) 44071 1727204642.35676: exiting _queue_task() for managed-node2/set_fact 44071 1727204642.35691: done queuing things up, now waiting for results queue to drain 44071 1727204642.35693: waiting for pending results... 
44071 1727204642.35899: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-statebr 44071 1727204642.36006: in run() - task 127b8e07-fff9-c964-7471-000000000f1c 44071 1727204642.36017: variable 'ansible_search_path' from source: unknown 44071 1727204642.36023: variable 'ansible_search_path' from source: unknown 44071 1727204642.36060: calling self._execute() 44071 1727204642.36147: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204642.36151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204642.36158: variable 'omit' from source: magic vars 44071 1727204642.36452: variable 'ansible_distribution_major_version' from source: facts 44071 1727204642.36465: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204642.36560: variable 'profile_stat' from source: set_fact 44071 1727204642.36575: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204642.36578: when evaluation is False, skipping this task 44071 1727204642.36581: _execute() done 44071 1727204642.36587: dumping result to json 44071 1727204642.36589: done dumping result, returning 44071 1727204642.36598: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-statebr [127b8e07-fff9-c964-7471-000000000f1c] 44071 1727204642.36601: sending task result for task 127b8e07-fff9-c964-7471-000000000f1c 44071 1727204642.36703: done sending task result for task 127b8e07-fff9-c964-7471-000000000f1c 44071 1727204642.36706: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204642.36757: no more pending results, returning what we have 44071 1727204642.36762: results queue empty 44071 1727204642.36763: checking for any_errors_fatal 44071 1727204642.36774: done checking for any_errors_fatal 44071 1727204642.36775: checking for max_fail_percentage 44071 1727204642.36776: done checking for max_fail_percentage 44071 1727204642.36777: checking to see if all hosts have failed and the running result is not ok 44071 1727204642.36778: done checking to see if all hosts have failed 44071 1727204642.36779: getting the remaining hosts for this loop 44071 1727204642.36780: done getting the remaining hosts for this loop 44071 1727204642.36785: getting the next task for host managed-node2 44071 1727204642.36795: done getting next task for host managed-node2 44071 1727204642.36799: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 44071 1727204642.36803: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204642.36809: getting variables 44071 1727204642.36811: in VariableManager get_vars() 44071 1727204642.36849: Calling all_inventory to load vars for managed-node2 44071 1727204642.36852: Calling groups_inventory to load vars for managed-node2 44071 1727204642.36856: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204642.36875: Calling all_plugins_play to load vars for managed-node2 44071 1727204642.36878: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204642.36882: Calling groups_plugins_play to load vars for managed-node2 44071 1727204642.37924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204642.39233: done with get_vars() 44071 1727204642.39261: done getting variables 44071 1727204642.39312: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204642.39408: variable 'profile' from source: play vars 44071 1727204642.39412: variable 'interface' from source: play vars 44071 1727204642.39454: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'statebr'] ************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 15:04:02 -0400 (0:00:00.041) 0:00:54.711 ***** 44071 1727204642.39484: entering _queue_task() for managed-node2/assert 44071 1727204642.39863: worker is 1 (out of 1 available) 44071 1727204642.40081: exiting _queue_task() for managed-node2/assert 44071 1727204642.40094: done queuing things up, now waiting for results queue to drain 44071 1727204642.40096: waiting for pending results... 
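The assert just queued ("Assert that the profile is present - 'statebr'", assert_profile_present.yml:5) checks the lsr_net_profile_exists flag set earlier and passes, as the evaluation below shows. A minimal sketch consistent with the evaluated conditional (any fail_msg is not visible in this trace):

- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists
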
44071 1727204642.40288: running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'statebr' 44071 1727204642.40408: in run() - task 127b8e07-fff9-c964-7471-000000000e8c 44071 1727204642.40473: variable 'ansible_search_path' from source: unknown 44071 1727204642.40478: variable 'ansible_search_path' from source: unknown 44071 1727204642.40506: calling self._execute() 44071 1727204642.40628: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204642.40651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204642.40761: variable 'omit' from source: magic vars 44071 1727204642.41112: variable 'ansible_distribution_major_version' from source: facts 44071 1727204642.41135: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204642.41150: variable 'omit' from source: magic vars 44071 1727204642.41227: variable 'omit' from source: magic vars 44071 1727204642.41347: variable 'profile' from source: play vars 44071 1727204642.41357: variable 'interface' from source: play vars 44071 1727204642.41433: variable 'interface' from source: play vars 44071 1727204642.41461: variable 'omit' from source: magic vars 44071 1727204642.41519: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204642.41563: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204642.41595: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204642.41623: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204642.41647: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204642.41740: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204642.41743: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204642.41746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204642.41820: Set connection var ansible_connection to ssh 44071 1727204642.41834: Set connection var ansible_timeout to 10 44071 1727204642.41850: Set connection var ansible_pipelining to False 44071 1727204642.41862: Set connection var ansible_shell_type to sh 44071 1727204642.41875: Set connection var ansible_shell_executable to /bin/sh 44071 1727204642.41889: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204642.41921: variable 'ansible_shell_executable' from source: unknown 44071 1727204642.41929: variable 'ansible_connection' from source: unknown 44071 1727204642.41935: variable 'ansible_module_compression' from source: unknown 44071 1727204642.41957: variable 'ansible_shell_type' from source: unknown 44071 1727204642.41960: variable 'ansible_shell_executable' from source: unknown 44071 1727204642.41962: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204642.41964: variable 'ansible_pipelining' from source: unknown 44071 1727204642.42066: variable 'ansible_timeout' from source: unknown 44071 1727204642.42072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204642.42161: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204642.42188: variable 'omit' from source: magic vars 44071 1727204642.42201: starting attempt loop 44071 1727204642.42209: running the handler 44071 1727204642.42362: variable 'lsr_net_profile_exists' from source: set_fact 44071 1727204642.42378: Evaluated conditional (lsr_net_profile_exists): True 44071 1727204642.42395: handler run complete 44071 1727204642.42420: attempt loop complete, returning result 44071 1727204642.42430: _execute() done 44071 1727204642.42439: dumping result to json 44071 1727204642.42449: done dumping result, returning 44071 1727204642.42464: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'statebr' [127b8e07-fff9-c964-7471-000000000e8c] 44071 1727204642.42478: sending task result for task 127b8e07-fff9-c964-7471-000000000e8c ok: [managed-node2] => { "changed": false } MSG: All assertions passed 44071 1727204642.42664: no more pending results, returning what we have 44071 1727204642.42670: results queue empty 44071 1727204642.42672: checking for any_errors_fatal 44071 1727204642.42680: done checking for any_errors_fatal 44071 1727204642.42681: checking for max_fail_percentage 44071 1727204642.42683: done checking for max_fail_percentage 44071 1727204642.42685: checking to see if all hosts have failed and the running result is not ok 44071 1727204642.42685: done checking to see if all hosts have failed 44071 1727204642.42686: getting the remaining hosts for this loop 44071 1727204642.42688: done getting the remaining hosts for this loop 44071 1727204642.42693: getting the next task for host managed-node2 44071 1727204642.42704: done getting next task for host managed-node2 44071 1727204642.42708: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 44071 1727204642.42712: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204642.42718: getting variables 44071 1727204642.42720: in VariableManager get_vars() 44071 1727204642.42763: Calling all_inventory to load vars for managed-node2 44071 1727204642.42973: Calling groups_inventory to load vars for managed-node2 44071 1727204642.42979: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204642.42994: Calling all_plugins_play to load vars for managed-node2 44071 1727204642.42998: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204642.43002: Calling groups_plugins_play to load vars for managed-node2 44071 1727204642.48559: done sending task result for task 127b8e07-fff9-c964-7471-000000000e8c 44071 1727204642.48564: WORKER PROCESS EXITING 44071 1727204642.49262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204642.50819: done with get_vars() 44071 1727204642.50858: done getting variables 44071 1727204642.50914: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204642.51022: variable 'profile' from source: play vars 44071 1727204642.51026: variable 'interface' from source: play vars 44071 1727204642.51091: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'statebr'] ********* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 15:04:02 -0400 (0:00:00.116) 0:00:54.827 ***** 44071 1727204642.51125: entering _queue_task() for managed-node2/assert 44071 1727204642.51531: worker is 1 (out of 1 available) 44071 1727204642.51546: exiting _queue_task() for managed-node2/assert 44071 1727204642.51561: done queuing things up, now waiting for results queue to drain 44071 1727204642.51564: waiting for pending results... 
44071 1727204642.51786: running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'statebr' 44071 1727204642.51888: in run() - task 127b8e07-fff9-c964-7471-000000000e8d 44071 1727204642.51902: variable 'ansible_search_path' from source: unknown 44071 1727204642.51907: variable 'ansible_search_path' from source: unknown 44071 1727204642.51942: calling self._execute() 44071 1727204642.52035: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204642.52039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204642.52052: variable 'omit' from source: magic vars 44071 1727204642.52376: variable 'ansible_distribution_major_version' from source: facts 44071 1727204642.52388: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204642.52394: variable 'omit' from source: magic vars 44071 1727204642.52436: variable 'omit' from source: magic vars 44071 1727204642.52525: variable 'profile' from source: play vars 44071 1727204642.52528: variable 'interface' from source: play vars 44071 1727204642.52584: variable 'interface' from source: play vars 44071 1727204642.52602: variable 'omit' from source: magic vars 44071 1727204642.52637: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204642.52670: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204642.52691: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204642.52708: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204642.52718: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204642.52744: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204642.52751: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204642.52754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204642.52835: Set connection var ansible_connection to ssh 44071 1727204642.52841: Set connection var ansible_timeout to 10 44071 1727204642.52849: Set connection var ansible_pipelining to False 44071 1727204642.52855: Set connection var ansible_shell_type to sh 44071 1727204642.52860: Set connection var ansible_shell_executable to /bin/sh 44071 1727204642.52869: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204642.52887: variable 'ansible_shell_executable' from source: unknown 44071 1727204642.52890: variable 'ansible_connection' from source: unknown 44071 1727204642.52895: variable 'ansible_module_compression' from source: unknown 44071 1727204642.52898: variable 'ansible_shell_type' from source: unknown 44071 1727204642.52900: variable 'ansible_shell_executable' from source: unknown 44071 1727204642.52903: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204642.52905: variable 'ansible_pipelining' from source: unknown 44071 1727204642.52909: variable 'ansible_timeout' from source: unknown 44071 1727204642.52914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204642.53034: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204642.53038: variable 'omit' from source: magic vars 44071 1727204642.53041: starting attempt loop 44071 1727204642.53047: running the handler 44071 1727204642.53134: variable 'lsr_net_profile_ansible_managed' from source: set_fact 44071 1727204642.53138: Evaluated conditional (lsr_net_profile_ansible_managed): True 44071 1727204642.53148: handler run complete 44071 1727204642.53161: attempt loop complete, returning result 44071 1727204642.53164: _execute() done 44071 1727204642.53168: dumping result to json 44071 1727204642.53171: done dumping result, returning 44071 1727204642.53178: done running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'statebr' [127b8e07-fff9-c964-7471-000000000e8d] 44071 1727204642.53183: sending task result for task 127b8e07-fff9-c964-7471-000000000e8d ok: [managed-node2] => { "changed": false } MSG: All assertions passed 44071 1727204642.53340: no more pending results, returning what we have 44071 1727204642.53344: results queue empty 44071 1727204642.53345: checking for any_errors_fatal 44071 1727204642.53359: done checking for any_errors_fatal 44071 1727204642.53360: checking for max_fail_percentage 44071 1727204642.53361: done checking for max_fail_percentage 44071 1727204642.53362: checking to see if all hosts have failed and the running result is not ok 44071 1727204642.53363: done checking to see if all hosts have failed 44071 1727204642.53364: getting the remaining hosts for this loop 44071 1727204642.53365: done getting the remaining hosts for this loop 44071 1727204642.53372: getting the next task for host managed-node2 44071 1727204642.53380: done getting next task for host managed-node2 44071 1727204642.53383: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 44071 1727204642.53387: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204642.53393: getting variables 44071 1727204642.53394: in VariableManager get_vars() 44071 1727204642.53433: Calling all_inventory to load vars for managed-node2 44071 1727204642.53436: Calling groups_inventory to load vars for managed-node2 44071 1727204642.53439: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204642.53453: Calling all_plugins_play to load vars for managed-node2 44071 1727204642.53456: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204642.53459: Calling groups_plugins_play to load vars for managed-node2 44071 1727204642.53477: done sending task result for task 127b8e07-fff9-c964-7471-000000000e8d 44071 1727204642.53480: WORKER PROCESS EXITING 44071 1727204642.54623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204642.56699: done with get_vars() 44071 1727204642.56740: done getting variables 44071 1727204642.56811: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204642.56940: variable 'profile' from source: play vars 44071 1727204642.56947: variable 'interface' from source: play vars 44071 1727204642.57019: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in statebr] *************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 15:04:02 -0400 (0:00:00.059) 0:00:54.887 ***** 44071 1727204642.57057: entering _queue_task() for managed-node2/assert 44071 1727204642.57471: worker is 1 (out of 1 available) 44071 1727204642.57488: exiting _queue_task() for managed-node2/assert 44071 1727204642.57501: done queuing things up, now waiting for results queue to drain 44071 1727204642.57503: waiting for pending results... 
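The fingerprint check queued here is structurally the same: its handler evaluates only the boolean fact 'lsr_net_profile_fingerprint'. A comparable sketch of the task at assert_profile_present.yml:15, under the same assumptions as the previous sketch:

  - name: Assert that the fingerprint comment is present in {{ profile }}
    assert:
      that:
        - lsr_net_profile_fingerprint        # boolean fact set earlier via set_fact, per the trace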
44071 1727204642.57742: running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in statebr 44071 1727204642.57895: in run() - task 127b8e07-fff9-c964-7471-000000000e8e 44071 1727204642.57971: variable 'ansible_search_path' from source: unknown 44071 1727204642.57976: variable 'ansible_search_path' from source: unknown 44071 1727204642.57980: calling self._execute() 44071 1727204642.58092: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204642.58108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204642.58122: variable 'omit' from source: magic vars 44071 1727204642.58551: variable 'ansible_distribution_major_version' from source: facts 44071 1727204642.58574: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204642.58618: variable 'omit' from source: magic vars 44071 1727204642.58647: variable 'omit' from source: magic vars 44071 1727204642.58767: variable 'profile' from source: play vars 44071 1727204642.58778: variable 'interface' from source: play vars 44071 1727204642.58858: variable 'interface' from source: play vars 44071 1727204642.58944: variable 'omit' from source: magic vars 44071 1727204642.58947: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204642.58984: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204642.59012: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204642.59035: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204642.59057: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204642.59095: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204642.59103: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204642.59111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204642.59227: Set connection var ansible_connection to ssh 44071 1727204642.59239: Set connection var ansible_timeout to 10 44071 1727204642.59249: Set connection var ansible_pipelining to False 44071 1727204642.59268: Set connection var ansible_shell_type to sh 44071 1727204642.59271: Set connection var ansible_shell_executable to /bin/sh 44071 1727204642.59371: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204642.59374: variable 'ansible_shell_executable' from source: unknown 44071 1727204642.59378: variable 'ansible_connection' from source: unknown 44071 1727204642.59381: variable 'ansible_module_compression' from source: unknown 44071 1727204642.59383: variable 'ansible_shell_type' from source: unknown 44071 1727204642.59385: variable 'ansible_shell_executable' from source: unknown 44071 1727204642.59387: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204642.59389: variable 'ansible_pipelining' from source: unknown 44071 1727204642.59391: variable 'ansible_timeout' from source: unknown 44071 1727204642.59393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204642.59573: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204642.59576: variable 'omit' from source: magic vars 44071 1727204642.59579: starting attempt loop 44071 1727204642.59581: running the handler 44071 1727204642.59693: variable 'lsr_net_profile_fingerprint' from source: set_fact 44071 1727204642.59704: Evaluated conditional (lsr_net_profile_fingerprint): True 44071 1727204642.59715: handler run complete 44071 1727204642.59740: attempt loop complete, returning result 44071 1727204642.59835: _execute() done 44071 1727204642.59839: dumping result to json 44071 1727204642.59841: done dumping result, returning 44071 1727204642.59844: done running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in statebr [127b8e07-fff9-c964-7471-000000000e8e] 44071 1727204642.59846: sending task result for task 127b8e07-fff9-c964-7471-000000000e8e 44071 1727204642.59926: done sending task result for task 127b8e07-fff9-c964-7471-000000000e8e 44071 1727204642.59930: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 44071 1727204642.59988: no more pending results, returning what we have 44071 1727204642.59991: results queue empty 44071 1727204642.59992: checking for any_errors_fatal 44071 1727204642.60000: done checking for any_errors_fatal 44071 1727204642.60000: checking for max_fail_percentage 44071 1727204642.60002: done checking for max_fail_percentage 44071 1727204642.60003: checking to see if all hosts have failed and the running result is not ok 44071 1727204642.60004: done checking to see if all hosts have failed 44071 1727204642.60005: getting the remaining hosts for this loop 44071 1727204642.60006: done getting the remaining hosts for this loop 44071 1727204642.60011: getting the next task for host managed-node2 44071 1727204642.60021: done getting next task for host managed-node2 44071 1727204642.60026: ^ task is: TASK: Conditional asserts 44071 1727204642.60029: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204642.60033: getting variables 44071 1727204642.60035: in VariableManager get_vars() 44071 1727204642.60077: Calling all_inventory to load vars for managed-node2 44071 1727204642.60080: Calling groups_inventory to load vars for managed-node2 44071 1727204642.60084: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204642.60097: Calling all_plugins_play to load vars for managed-node2 44071 1727204642.60100: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204642.60103: Calling groups_plugins_play to load vars for managed-node2 44071 1727204642.61941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204642.64456: done with get_vars() 44071 1727204642.64488: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Tuesday 24 September 2024 15:04:02 -0400 (0:00:00.075) 0:00:54.962 ***** 44071 1727204642.64595: entering _queue_task() for managed-node2/include_tasks 44071 1727204642.65342: worker is 1 (out of 1 available) 44071 1727204642.65360: exiting _queue_task() for managed-node2/include_tasks 44071 1727204642.65508: done queuing things up, now waiting for results queue to drain 44071 1727204642.65510: waiting for pending results... 44071 1727204642.66064: running TaskExecutor() for managed-node2/TASK: Conditional asserts 44071 1727204642.66317: in run() - task 127b8e07-fff9-c964-7471-000000000a4f 44071 1727204642.66538: variable 'ansible_search_path' from source: unknown 44071 1727204642.66544: variable 'ansible_search_path' from source: unknown 44071 1727204642.67177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204642.70679: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204642.70683: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204642.70898: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204642.70945: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204642.71034: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204642.71336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204642.71378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204642.71411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204642.71595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204642.71615: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204642.71912: dumping result to json 44071 1727204642.71980: done dumping result, returning 44071 1727204642.71998: done running TaskExecutor() for managed-node2/TASK: Conditional asserts [127b8e07-fff9-c964-7471-000000000a4f] 44071 1727204642.72009: sending task result for task 127b8e07-fff9-c964-7471-000000000a4f 44071 1727204642.72298: done sending task result for task 127b8e07-fff9-c964-7471-000000000a4f 44071 1727204642.72302: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } 44071 1727204642.72370: no more pending results, returning what we have 44071 1727204642.72374: results queue empty 44071 1727204642.72376: checking for any_errors_fatal 44071 1727204642.72384: done checking for any_errors_fatal 44071 1727204642.72385: checking for max_fail_percentage 44071 1727204642.72387: done checking for max_fail_percentage 44071 1727204642.72388: checking to see if all hosts have failed and the running result is not ok 44071 1727204642.72389: done checking to see if all hosts have failed 44071 1727204642.72390: getting the remaining hosts for this loop 44071 1727204642.72391: done getting the remaining hosts for this loop 44071 1727204642.72397: getting the next task for host managed-node2 44071 1727204642.72406: done getting next task for host managed-node2 44071 1727204642.72410: ^ task is: TASK: Success in test '{{ lsr_description }}' 44071 1727204642.72413: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204642.72423: getting variables 44071 1727204642.72426: in VariableManager get_vars() 44071 1727204642.72671: Calling all_inventory to load vars for managed-node2 44071 1727204642.72675: Calling groups_inventory to load vars for managed-node2 44071 1727204642.72679: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204642.72692: Calling all_plugins_play to load vars for managed-node2 44071 1727204642.72696: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204642.72699: Calling groups_plugins_play to load vars for managed-node2 44071 1727204642.77157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204642.79925: done with get_vars() 44071 1727204642.79977: done getting variables 44071 1727204642.80047: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204642.80226: variable 'lsr_description' from source: include params TASK [Success in test 'I can activate an existing profile'] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Tuesday 24 September 2024 15:04:02 -0400 (0:00:00.156) 0:00:55.119 ***** 44071 1727204642.80261: entering _queue_task() for managed-node2/debug 44071 1727204642.81032: worker is 1 (out of 1 available) 44071 1727204642.81049: exiting _queue_task() for managed-node2/debug 44071 1727204642.81188: done queuing things up, now waiting for results queue to drain 44071 1727204642.81191: waiting for pending results... 
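The debug task queued here at run_test.yml:47 prints the success banner that appears in the result further down. A minimal sketch, assuming the message is templated directly from 'lsr_description' (the rendered string matches the MSG emitted below):

  - name: Success in test '{{ lsr_description }}'
    debug:
      msg: "+++++ Success in test '{{ lsr_description }}' +++++"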
44071 1727204642.81792: running TaskExecutor() for managed-node2/TASK: Success in test 'I can activate an existing profile' 44071 1727204642.81973: in run() - task 127b8e07-fff9-c964-7471-000000000a50 44071 1727204642.81979: variable 'ansible_search_path' from source: unknown 44071 1727204642.81982: variable 'ansible_search_path' from source: unknown 44071 1727204642.82017: calling self._execute() 44071 1727204642.82157: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204642.82210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204642.82214: variable 'omit' from source: magic vars 44071 1727204642.82656: variable 'ansible_distribution_major_version' from source: facts 44071 1727204642.82679: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204642.82690: variable 'omit' from source: magic vars 44071 1727204642.82751: variable 'omit' from source: magic vars 44071 1727204642.82877: variable 'lsr_description' from source: include params 44071 1727204642.82916: variable 'omit' from source: magic vars 44071 1727204642.82972: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204642.83010: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204642.83133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204642.83136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204642.83139: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204642.83145: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204642.83148: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204642.83150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204642.83261: Set connection var ansible_connection to ssh 44071 1727204642.83275: Set connection var ansible_timeout to 10 44071 1727204642.83286: Set connection var ansible_pipelining to False 44071 1727204642.83295: Set connection var ansible_shell_type to sh 44071 1727204642.83305: Set connection var ansible_shell_executable to /bin/sh 44071 1727204642.83316: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204642.83353: variable 'ansible_shell_executable' from source: unknown 44071 1727204642.83361: variable 'ansible_connection' from source: unknown 44071 1727204642.83370: variable 'ansible_module_compression' from source: unknown 44071 1727204642.83376: variable 'ansible_shell_type' from source: unknown 44071 1727204642.83383: variable 'ansible_shell_executable' from source: unknown 44071 1727204642.83389: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204642.83397: variable 'ansible_pipelining' from source: unknown 44071 1727204642.83403: variable 'ansible_timeout' from source: unknown 44071 1727204642.83411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204642.83585: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 44071 1727204642.83673: variable 'omit' from source: magic vars 44071 1727204642.83676: starting attempt loop 44071 1727204642.83678: running the handler 44071 1727204642.83682: handler run complete 44071 1727204642.83703: attempt loop complete, returning result 44071 1727204642.83709: _execute() done 44071 1727204642.83717: dumping result to json 44071 1727204642.83723: done dumping result, returning 44071 1727204642.83734: done running TaskExecutor() for managed-node2/TASK: Success in test 'I can activate an existing profile' [127b8e07-fff9-c964-7471-000000000a50] 44071 1727204642.83745: sending task result for task 127b8e07-fff9-c964-7471-000000000a50 44071 1727204642.84263: done sending task result for task 127b8e07-fff9-c964-7471-000000000a50 44071 1727204642.84274: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: +++++ Success in test 'I can activate an existing profile' +++++ 44071 1727204642.84334: no more pending results, returning what we have 44071 1727204642.84338: results queue empty 44071 1727204642.84338: checking for any_errors_fatal 44071 1727204642.84345: done checking for any_errors_fatal 44071 1727204642.84346: checking for max_fail_percentage 44071 1727204642.84347: done checking for max_fail_percentage 44071 1727204642.84349: checking to see if all hosts have failed and the running result is not ok 44071 1727204642.84349: done checking to see if all hosts have failed 44071 1727204642.84350: getting the remaining hosts for this loop 44071 1727204642.84352: done getting the remaining hosts for this loop 44071 1727204642.84357: getting the next task for host managed-node2 44071 1727204642.84367: done getting next task for host managed-node2 44071 1727204642.84371: ^ task is: TASK: Cleanup 44071 1727204642.84374: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204642.84380: getting variables 44071 1727204642.84381: in VariableManager get_vars() 44071 1727204642.84425: Calling all_inventory to load vars for managed-node2 44071 1727204642.84429: Calling groups_inventory to load vars for managed-node2 44071 1727204642.84433: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204642.84449: Calling all_plugins_play to load vars for managed-node2 44071 1727204642.84453: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204642.84457: Calling groups_plugins_play to load vars for managed-node2 44071 1727204642.86720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204642.88895: done with get_vars() 44071 1727204642.88941: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Tuesday 24 September 2024 15:04:02 -0400 (0:00:00.088) 0:00:55.207 ***** 44071 1727204642.89144: entering _queue_task() for managed-node2/include_tasks 44071 1727204642.89605: worker is 1 (out of 1 available) 44071 1727204642.89618: exiting _queue_task() for managed-node2/include_tasks 44071 1727204642.89635: done queuing things up, now waiting for results queue to drain 44071 1727204642.89636: waiting for pending results... 44071 1727204642.90068: running TaskExecutor() for managed-node2/TASK: Cleanup 44071 1727204642.90120: in run() - task 127b8e07-fff9-c964-7471-000000000a54 44071 1727204642.90147: variable 'ansible_search_path' from source: unknown 44071 1727204642.90155: variable 'ansible_search_path' from source: unknown 44071 1727204642.90219: variable 'lsr_cleanup' from source: include params 44071 1727204642.90466: variable 'lsr_cleanup' from source: include params 44071 1727204642.90549: variable 'omit' from source: magic vars 44071 1727204642.90733: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204642.90755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204642.90776: variable 'omit' from source: magic vars 44071 1727204642.91058: variable 'ansible_distribution_major_version' from source: facts 44071 1727204642.91166: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204642.91173: variable 'item' from source: unknown 44071 1727204642.91176: variable 'item' from source: unknown 44071 1727204642.91212: variable 'item' from source: unknown 44071 1727204642.91282: variable 'item' from source: unknown 44071 1727204642.91664: dumping result to json 44071 1727204642.91670: done dumping result, returning 44071 1727204642.91672: done running TaskExecutor() for managed-node2/TASK: Cleanup [127b8e07-fff9-c964-7471-000000000a54] 44071 1727204642.91675: sending task result for task 127b8e07-fff9-c964-7471-000000000a54 44071 1727204642.91729: done sending task result for task 127b8e07-fff9-c964-7471-000000000a54 44071 1727204642.91732: WORKER PROCESS EXITING 44071 1727204642.91760: no more pending results, returning what we have 44071 1727204642.91767: in VariableManager get_vars() 44071 1727204642.91818: Calling all_inventory to load vars for managed-node2 44071 1727204642.91822: Calling groups_inventory to load vars for managed-node2 44071 1727204642.91827: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204642.91847: Calling all_plugins_play to load vars for 
managed-node2 44071 1727204642.91852: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204642.91856: Calling groups_plugins_play to load vars for managed-node2 44071 1727204642.94692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204642.96025: done with get_vars() 44071 1727204642.96044: variable 'ansible_search_path' from source: unknown 44071 1727204642.96046: variable 'ansible_search_path' from source: unknown 44071 1727204642.96085: we have included files to process 44071 1727204642.96086: generating all_blocks data 44071 1727204642.96088: done generating all_blocks data 44071 1727204642.96096: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 44071 1727204642.96097: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 44071 1727204642.96100: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 44071 1727204642.96369: done processing included file 44071 1727204642.96372: iterating over new_blocks loaded from include file 44071 1727204642.96374: in VariableManager get_vars() 44071 1727204642.96396: done with get_vars() 44071 1727204642.96415: filtering new block on tags 44071 1727204642.96450: done filtering new block on tags 44071 1727204642.96453: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed-node2 => (item=tasks/cleanup_profile+device.yml) 44071 1727204642.96459: extending task lists for all hosts with included blocks 44071 1727204642.98351: done extending task lists 44071 1727204642.98352: done processing included files 44071 1727204642.98353: results queue empty 44071 1727204642.98353: checking for any_errors_fatal 44071 1727204642.98357: done checking for any_errors_fatal 44071 1727204642.98358: checking for max_fail_percentage 44071 1727204642.98358: done checking for max_fail_percentage 44071 1727204642.98359: checking to see if all hosts have failed and the running result is not ok 44071 1727204642.98360: done checking to see if all hosts have failed 44071 1727204642.98360: getting the remaining hosts for this loop 44071 1727204642.98361: done getting the remaining hosts for this loop 44071 1727204642.98363: getting the next task for host managed-node2 44071 1727204642.98368: done getting next task for host managed-node2 44071 1727204642.98370: ^ task is: TASK: Cleanup profile and device 44071 1727204642.98372: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204642.98374: getting variables 44071 1727204642.98375: in VariableManager get_vars() 44071 1727204642.98387: Calling all_inventory to load vars for managed-node2 44071 1727204642.98389: Calling groups_inventory to load vars for managed-node2 44071 1727204642.98391: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204642.98396: Calling all_plugins_play to load vars for managed-node2 44071 1727204642.98398: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204642.98400: Calling groups_plugins_play to load vars for managed-node2 44071 1727204642.99330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204643.00796: done with get_vars() 44071 1727204643.00841: done getting variables 44071 1727204643.00897: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Tuesday 24 September 2024 15:04:03 -0400 (0:00:00.117) 0:00:55.325 ***** 44071 1727204643.00932: entering _queue_task() for managed-node2/shell 44071 1727204643.01345: worker is 1 (out of 1 available) 44071 1727204643.01362: exiting _queue_task() for managed-node2/shell 44071 1727204643.01379: done queuing things up, now waiting for results queue to drain 44071 1727204643.01381: waiting for pending results... 
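The 'Cleanup' include_tasks traced above looped over 'lsr_cleanup' and pulled in tasks/cleanup_profile+device.yml; its first task, queued here, is a shell action whose exact commands appear in the module invocation later in the trace. A minimal sketch of that task at cleanup_profile+device.yml:3, assuming the device name is templated from the 'interface' play var (the rendered commands target 'statebr'):

  - name: Cleanup profile and device
    shell: |
      nmcli con delete {{ interface }}
      nmcli con load /etc/sysconfig/network-scripts/ifcfg-{{ interface }}
      rm -f /etc/sysconfig/network-scripts/ifcfg-{{ interface }}
      ip link del {{ interface }}

Note that the rc of 0 reported in the result below comes from the last command in the script; the intermediate 'nmcli con load' step writes a 'Could not load file' error to stderr without failing the task.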
44071 1727204643.01791: running TaskExecutor() for managed-node2/TASK: Cleanup profile and device 44071 1727204643.01802: in run() - task 127b8e07-fff9-c964-7471-000000000f6d 44071 1727204643.01820: variable 'ansible_search_path' from source: unknown 44071 1727204643.01824: variable 'ansible_search_path' from source: unknown 44071 1727204643.01905: calling self._execute() 44071 1727204643.02059: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.02064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.02071: variable 'omit' from source: magic vars 44071 1727204643.02444: variable 'ansible_distribution_major_version' from source: facts 44071 1727204643.02458: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204643.02464: variable 'omit' from source: magic vars 44071 1727204643.02507: variable 'omit' from source: magic vars 44071 1727204643.02626: variable 'interface' from source: play vars 44071 1727204643.02644: variable 'omit' from source: magic vars 44071 1727204643.02685: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204643.02719: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204643.02737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204643.02756: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204643.02768: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204643.02793: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204643.02796: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.02799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.02880: Set connection var ansible_connection to ssh 44071 1727204643.02886: Set connection var ansible_timeout to 10 44071 1727204643.02892: Set connection var ansible_pipelining to False 44071 1727204643.02897: Set connection var ansible_shell_type to sh 44071 1727204643.02903: Set connection var ansible_shell_executable to /bin/sh 44071 1727204643.02910: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204643.02931: variable 'ansible_shell_executable' from source: unknown 44071 1727204643.02935: variable 'ansible_connection' from source: unknown 44071 1727204643.02938: variable 'ansible_module_compression' from source: unknown 44071 1727204643.02940: variable 'ansible_shell_type' from source: unknown 44071 1727204643.02943: variable 'ansible_shell_executable' from source: unknown 44071 1727204643.02947: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.02952: variable 'ansible_pipelining' from source: unknown 44071 1727204643.02954: variable 'ansible_timeout' from source: unknown 44071 1727204643.02959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.03081: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 
1727204643.03091: variable 'omit' from source: magic vars 44071 1727204643.03096: starting attempt loop 44071 1727204643.03099: running the handler 44071 1727204643.03109: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204643.03126: _low_level_execute_command(): starting 44071 1727204643.03133: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204643.03706: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204643.03710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44071 1727204643.03714: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204643.03755: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204643.03771: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204643.03857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204643.05633: stdout chunk (state=3): >>>/root <<< 44071 1727204643.05850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204643.05854: stdout chunk (state=3): >>><<< 44071 1727204643.05857: stderr chunk (state=3): >>><<< 44071 1727204643.05885: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 44071 1727204643.05908: _low_level_execute_command(): starting 44071 1727204643.06010: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204643.0589256-47198-131413961254461 `" && echo ansible-tmp-1727204643.0589256-47198-131413961254461="` echo /root/.ansible/tmp/ansible-tmp-1727204643.0589256-47198-131413961254461 `" ) && sleep 0' 44071 1727204643.06543: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204643.06569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204643.06622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204643.06635: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204643.06711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204643.08724: stdout chunk (state=3): >>>ansible-tmp-1727204643.0589256-47198-131413961254461=/root/.ansible/tmp/ansible-tmp-1727204643.0589256-47198-131413961254461 <<< 44071 1727204643.08935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204643.08939: stdout chunk (state=3): >>><<< 44071 1727204643.08944: stderr chunk (state=3): >>><<< 44071 1727204643.08969: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204643.0589256-47198-131413961254461=/root/.ansible/tmp/ansible-tmp-1727204643.0589256-47198-131413961254461 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 
4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204643.09105: variable 'ansible_module_compression' from source: unknown 44071 1727204643.09114: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44071 1727204643.09176: variable 'ansible_facts' from source: unknown 44071 1727204643.09235: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204643.0589256-47198-131413961254461/AnsiballZ_command.py 44071 1727204643.09354: Sending initial data 44071 1727204643.09358: Sent initial data (156 bytes) 44071 1727204643.09837: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204643.09842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204643.09881: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204643.09885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204643.09935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204643.09939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204643.09946: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204643.10018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204643.11628: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204643.11699: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204643.11763: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpcpap7jaf /root/.ansible/tmp/ansible-tmp-1727204643.0589256-47198-131413961254461/AnsiballZ_command.py <<< 44071 1727204643.11772: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204643.0589256-47198-131413961254461/AnsiballZ_command.py" <<< 44071 1727204643.11839: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpcpap7jaf" to remote "/root/.ansible/tmp/ansible-tmp-1727204643.0589256-47198-131413961254461/AnsiballZ_command.py" <<< 44071 1727204643.11842: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204643.0589256-47198-131413961254461/AnsiballZ_command.py" <<< 44071 1727204643.12511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204643.12593: stderr chunk (state=3): >>><<< 44071 1727204643.12597: stdout chunk (state=3): >>><<< 44071 1727204643.12617: done transferring module to remote 44071 1727204643.12628: _low_level_execute_command(): starting 44071 1727204643.12633: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204643.0589256-47198-131413961254461/ /root/.ansible/tmp/ansible-tmp-1727204643.0589256-47198-131413961254461/AnsiballZ_command.py && sleep 0' 44071 1727204643.13138: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204643.13145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204643.13148: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204643.13153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204643.13198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204643.13201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204643.13204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204643.13285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204643.15108: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204643.15170: stderr chunk (state=3): >>><<< 44071 1727204643.15174: stdout chunk (state=3): >>><<< 44071 1727204643.15192: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204643.15196: _low_level_execute_command(): starting 44071 1727204643.15200: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204643.0589256-47198-131413961254461/AnsiballZ_command.py && sleep 0' 44071 1727204643.15698: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204643.15702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204643.15705: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204643.15707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204643.15757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204643.15761: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204643.15777: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204643.15855: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204643.37810: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (8a139112-7ef3-44ae-a404-065d84fc2b3c) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'", "rc": 0, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-24 15:04:03.322535", "end": "2024-09-24 15:04:03.373805", "delta": "0:00:00.051270", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, 
"expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44071 1727204643.40973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204643.40978: stdout chunk (state=3): >>><<< 44071 1727204643.40981: stderr chunk (state=3): >>><<< 44071 1727204643.40985: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Connection 'statebr' (8a139112-7ef3-44ae-a404-065d84fc2b3c) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'", "rc": 0, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-24 15:04:03.322535", "end": "2024-09-24 15:04:03.373805", "delta": "0:00:00.051270", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
44071 1727204643.40987: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204643.0589256-47198-131413961254461/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204643.40989: _low_level_execute_command(): starting 44071 1727204643.40991: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204643.0589256-47198-131413961254461/ > /dev/null 2>&1 && sleep 0' 44071 1727204643.41611: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204643.41622: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204643.41644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204643.41658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204643.41677: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204643.41689: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204643.41752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204643.41800: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204643.41815: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204643.41833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204643.41942: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204643.43954: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204643.43958: stdout chunk (state=3): >>><<< 44071 1727204643.43968: stderr chunk (state=3): >>><<< 44071 1727204643.43999: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204643.44029: handler run complete 44071 1727204643.44034: Evaluated conditional (False): False 44071 1727204643.44048: attempt loop complete, returning result 44071 1727204643.44051: _execute() done 44071 1727204643.44076: dumping result to json 44071 1727204643.44078: done dumping result, returning 44071 1727204643.44081: done running TaskExecutor() for managed-node2/TASK: Cleanup profile and device [127b8e07-fff9-c964-7471-000000000f6d] 44071 1727204643.44083: sending task result for task 127b8e07-fff9-c964-7471-000000000f6d 44071 1727204643.44312: done sending task result for task 127b8e07-fff9-c964-7471-000000000f6d 44071 1727204643.44316: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.051270", "end": "2024-09-24 15:04:03.373805", "rc": 0, "start": "2024-09-24 15:04:03.322535" } STDOUT: Connection 'statebr' (8a139112-7ef3-44ae-a404-065d84fc2b3c) successfully deleted. STDERR: Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' 44071 1727204643.44400: no more pending results, returning what we have 44071 1727204643.44404: results queue empty 44071 1727204643.44405: checking for any_errors_fatal 44071 1727204643.44407: done checking for any_errors_fatal 44071 1727204643.44408: checking for max_fail_percentage 44071 1727204643.44410: done checking for max_fail_percentage 44071 1727204643.44411: checking to see if all hosts have failed and the running result is not ok 44071 1727204643.44412: done checking to see if all hosts have failed 44071 1727204643.44412: getting the remaining hosts for this loop 44071 1727204643.44414: done getting the remaining hosts for this loop 44071 1727204643.44418: getting the next task for host managed-node2 44071 1727204643.44431: done getting next task for host managed-node2 44071 1727204643.44434: ^ task is: TASK: Include the task 'run_test.yml' 44071 1727204643.44436: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204643.44440: getting variables 44071 1727204643.44442: in VariableManager get_vars() 44071 1727204643.44482: Calling all_inventory to load vars for managed-node2 44071 1727204643.44485: Calling groups_inventory to load vars for managed-node2 44071 1727204643.44489: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204643.44693: Calling all_plugins_play to load vars for managed-node2 44071 1727204643.44697: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204643.44702: Calling groups_plugins_play to load vars for managed-node2 44071 1727204643.46897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204643.50343: done with get_vars() 44071 1727204643.50387: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:83 Tuesday 24 September 2024 15:04:03 -0400 (0:00:00.498) 0:00:55.824 ***** 44071 1727204643.50779: entering _queue_task() for managed-node2/include_tasks 44071 1727204643.51361: worker is 1 (out of 1 available) 44071 1727204643.51377: exiting _queue_task() for managed-node2/include_tasks 44071 1727204643.51390: done queuing things up, now waiting for results queue to drain 44071 1727204643.51392: waiting for pending results... 44071 1727204643.51785: running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' 44071 1727204643.51793: in run() - task 127b8e07-fff9-c964-7471-000000000013 44071 1727204643.51797: variable 'ansible_search_path' from source: unknown 44071 1727204643.51981: calling self._execute() 44071 1727204643.51985: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.51989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.51993: variable 'omit' from source: magic vars 44071 1727204643.52573: variable 'ansible_distribution_major_version' from source: facts 44071 1727204643.52577: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204643.52580: _execute() done 44071 1727204643.52583: dumping result to json 44071 1727204643.52585: done dumping result, returning 44071 1727204643.52587: done running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' [127b8e07-fff9-c964-7471-000000000013] 44071 1727204643.52588: sending task result for task 127b8e07-fff9-c964-7471-000000000013 44071 1727204643.52682: done sending task result for task 127b8e07-fff9-c964-7471-000000000013 44071 1727204643.52686: WORKER PROCESS EXITING 44071 1727204643.52719: no more pending results, returning what we have 44071 1727204643.52725: in VariableManager get_vars() 44071 1727204643.52779: Calling all_inventory to load vars for managed-node2 44071 1727204643.52783: Calling groups_inventory to load vars for managed-node2 44071 1727204643.52787: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204643.52804: Calling all_plugins_play to load vars for managed-node2 44071 1727204643.52809: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204643.52813: Calling groups_plugins_play to load vars for managed-node2 44071 1727204643.54959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204643.57347: done with get_vars() 44071 1727204643.57382: variable 
'ansible_search_path' from source: unknown 44071 1727204643.57399: we have included files to process 44071 1727204643.57400: generating all_blocks data 44071 1727204643.57403: done generating all_blocks data 44071 1727204643.57409: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 44071 1727204643.57410: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 44071 1727204643.57413: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 44071 1727204643.57899: in VariableManager get_vars() 44071 1727204643.57921: done with get_vars() 44071 1727204643.57970: in VariableManager get_vars() 44071 1727204643.57995: done with get_vars() 44071 1727204643.58043: in VariableManager get_vars() 44071 1727204643.58061: done with get_vars() 44071 1727204643.58112: in VariableManager get_vars() 44071 1727204643.58131: done with get_vars() 44071 1727204643.58179: in VariableManager get_vars() 44071 1727204643.58196: done with get_vars() 44071 1727204643.58646: in VariableManager get_vars() 44071 1727204643.58665: done with get_vars() 44071 1727204643.58681: done processing included file 44071 1727204643.58683: iterating over new_blocks loaded from include file 44071 1727204643.58684: in VariableManager get_vars() 44071 1727204643.58696: done with get_vars() 44071 1727204643.58697: filtering new block on tags 44071 1727204643.58814: done filtering new block on tags 44071 1727204643.58817: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed-node2 44071 1727204643.58823: extending task lists for all hosts with included blocks 44071 1727204643.58868: done extending task lists 44071 1727204643.58870: done processing included files 44071 1727204643.58870: results queue empty 44071 1727204643.58871: checking for any_errors_fatal 44071 1727204643.58877: done checking for any_errors_fatal 44071 1727204643.58878: checking for max_fail_percentage 44071 1727204643.58879: done checking for max_fail_percentage 44071 1727204643.58880: checking to see if all hosts have failed and the running result is not ok 44071 1727204643.58880: done checking to see if all hosts have failed 44071 1727204643.58881: getting the remaining hosts for this loop 44071 1727204643.58882: done getting the remaining hosts for this loop 44071 1727204643.58885: getting the next task for host managed-node2 44071 1727204643.58889: done getting next task for host managed-node2 44071 1727204643.58891: ^ task is: TASK: TEST: {{ lsr_description }} 44071 1727204643.58894: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204643.58896: getting variables 44071 1727204643.58897: in VariableManager get_vars() 44071 1727204643.58906: Calling all_inventory to load vars for managed-node2 44071 1727204643.58909: Calling groups_inventory to load vars for managed-node2 44071 1727204643.58911: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204643.58917: Calling all_plugins_play to load vars for managed-node2 44071 1727204643.58919: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204643.58921: Calling groups_plugins_play to load vars for managed-node2 44071 1727204643.60524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204643.62761: done with get_vars() 44071 1727204643.62807: done getting variables 44071 1727204643.62861: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204643.62997: variable 'lsr_description' from source: include params TASK [TEST: I can remove an existing profile without taking it down] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Tuesday 24 September 2024 15:04:03 -0400 (0:00:00.122) 0:00:55.946 ***** 44071 1727204643.63029: entering _queue_task() for managed-node2/debug 44071 1727204643.63458: worker is 1 (out of 1 available) 44071 1727204643.63476: exiting _queue_task() for managed-node2/debug 44071 1727204643.63490: done queuing things up, now waiting for results queue to drain 44071 1727204643.63492: waiting for pending results... 
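The include processed just above (tests_states.yml:83) hands a set of lsr_* parameters to the generic run_test.yml harness. Judging from the values that the 'Show item' task prints further below, it is roughly equivalent to the following; the values are copied from this log, but the exact layout in tests_states.yml (and whether the distribution-version conditional sits on the task itself or is inherited from an enclosing block) is an assumption.

- name: Include the task 'run_test.yml'
  include_tasks: tasks/run_test.yml
  vars:
    lsr_description: I can remove an existing profile without taking it down
    lsr_setup:
      - tasks/create_bridge_profile.yml
      - tasks/activate_profile.yml
    lsr_test:
      - tasks/remove_profile.yml
    lsr_assert:
      - tasks/assert_device_present.yml
      - tasks/assert_profile_absent.yml
    lsr_cleanup:
      - tasks/cleanup_profile+device.yml
  when: ansible_distribution_major_version != '6'   # conditional seen in the log; possibly inherited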
44071 1727204643.63787: running TaskExecutor() for managed-node2/TASK: TEST: I can remove an existing profile without taking it down 44071 1727204643.63923: in run() - task 127b8e07-fff9-c964-7471-000000001005 44071 1727204643.64071: variable 'ansible_search_path' from source: unknown 44071 1727204643.64075: variable 'ansible_search_path' from source: unknown 44071 1727204643.64078: calling self._execute() 44071 1727204643.64115: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.64128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.64145: variable 'omit' from source: magic vars 44071 1727204643.64556: variable 'ansible_distribution_major_version' from source: facts 44071 1727204643.64577: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204643.64589: variable 'omit' from source: magic vars 44071 1727204643.64638: variable 'omit' from source: magic vars 44071 1727204643.64756: variable 'lsr_description' from source: include params 44071 1727204643.64782: variable 'omit' from source: magic vars 44071 1727204643.64833: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204643.64883: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204643.64911: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204643.64934: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204643.64955: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204643.64992: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204643.65000: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.65008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.65122: Set connection var ansible_connection to ssh 44071 1727204643.65134: Set connection var ansible_timeout to 10 44071 1727204643.65271: Set connection var ansible_pipelining to False 44071 1727204643.65275: Set connection var ansible_shell_type to sh 44071 1727204643.65277: Set connection var ansible_shell_executable to /bin/sh 44071 1727204643.65280: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204643.65286: variable 'ansible_shell_executable' from source: unknown 44071 1727204643.65289: variable 'ansible_connection' from source: unknown 44071 1727204643.65292: variable 'ansible_module_compression' from source: unknown 44071 1727204643.65294: variable 'ansible_shell_type' from source: unknown 44071 1727204643.65296: variable 'ansible_shell_executable' from source: unknown 44071 1727204643.65298: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.65300: variable 'ansible_pipelining' from source: unknown 44071 1727204643.65302: variable 'ansible_timeout' from source: unknown 44071 1727204643.65304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.65414: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 44071 1727204643.65431: variable 'omit' from source: magic vars 44071 1727204643.65444: starting attempt loop 44071 1727204643.65451: running the handler 44071 1727204643.65506: handler run complete 44071 1727204643.65529: attempt loop complete, returning result 44071 1727204643.65540: _execute() done 44071 1727204643.65548: dumping result to json 44071 1727204643.65555: done dumping result, returning 44071 1727204643.65571: done running TaskExecutor() for managed-node2/TASK: TEST: I can remove an existing profile without taking it down [127b8e07-fff9-c964-7471-000000001005] 44071 1727204643.65582: sending task result for task 127b8e07-fff9-c964-7471-000000001005 ok: [managed-node2] => {} MSG: ########## I can remove an existing profile without taking it down ########## 44071 1727204643.65756: no more pending results, returning what we have 44071 1727204643.65760: results queue empty 44071 1727204643.65761: checking for any_errors_fatal 44071 1727204643.65762: done checking for any_errors_fatal 44071 1727204643.65763: checking for max_fail_percentage 44071 1727204643.65767: done checking for max_fail_percentage 44071 1727204643.65768: checking to see if all hosts have failed and the running result is not ok 44071 1727204643.65769: done checking to see if all hosts have failed 44071 1727204643.65770: getting the remaining hosts for this loop 44071 1727204643.65772: done getting the remaining hosts for this loop 44071 1727204643.65777: getting the next task for host managed-node2 44071 1727204643.65786: done getting next task for host managed-node2 44071 1727204643.65788: ^ task is: TASK: Show item 44071 1727204643.65791: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204643.65970: getting variables 44071 1727204643.65972: in VariableManager get_vars() 44071 1727204643.66006: Calling all_inventory to load vars for managed-node2 44071 1727204643.66009: Calling groups_inventory to load vars for managed-node2 44071 1727204643.66012: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204643.66025: Calling all_plugins_play to load vars for managed-node2 44071 1727204643.66028: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204643.66032: Calling groups_plugins_play to load vars for managed-node2 44071 1727204643.66585: done sending task result for task 127b8e07-fff9-c964-7471-000000001005 44071 1727204643.66590: WORKER PROCESS EXITING 44071 1727204643.68006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204643.70347: done with get_vars() 44071 1727204643.70398: done getting variables 44071 1727204643.70473: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Tuesday 24 September 2024 15:04:03 -0400 (0:00:00.074) 0:00:56.021 ***** 44071 1727204643.70511: entering _queue_task() for managed-node2/debug 44071 1727204643.70944: worker is 1 (out of 1 available) 44071 1727204643.70959: exiting _queue_task() for managed-node2/debug 44071 1727204643.70979: done queuing things up, now waiting for results queue to drain 44071 1727204643.70981: waiting for pending results... 
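The 'Show item' task queued next loops over the lsr_* variable names and prints each one. The per-item output that follows (ansible_loop_var, the item key, a key named after the variable, and the 'VARIABLE IS NOT DEFINED!' placeholder for the undefined lsr_assert_when) is exactly what debug's var option produces inside a loop, so run_test.yml:9 is presumably close to this sketch; the task file itself is not shown in the log.

- name: Show item
  debug:
    var: "{{ item }}"
  loop:
    - lsr_description
    - lsr_setup
    - lsr_test
    - lsr_assert
    - lsr_assert_when
    - lsr_fail_debug
    - lsr_cleanup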
44071 1727204643.71588: running TaskExecutor() for managed-node2/TASK: Show item 44071 1727204643.71689: in run() - task 127b8e07-fff9-c964-7471-000000001006 44071 1727204643.71706: variable 'ansible_search_path' from source: unknown 44071 1727204643.71710: variable 'ansible_search_path' from source: unknown 44071 1727204643.71770: variable 'omit' from source: magic vars 44071 1727204643.71944: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.71954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.71967: variable 'omit' from source: magic vars 44071 1727204643.72375: variable 'ansible_distribution_major_version' from source: facts 44071 1727204643.72395: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204643.72406: variable 'omit' from source: magic vars 44071 1727204643.72455: variable 'omit' from source: magic vars 44071 1727204643.72516: variable 'item' from source: unknown 44071 1727204643.72609: variable 'item' from source: unknown 44071 1727204643.72632: variable 'omit' from source: magic vars 44071 1727204643.72689: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204643.72732: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204643.72763: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204643.72793: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204643.72810: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204643.72848: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204643.72857: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.72865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.72988: Set connection var ansible_connection to ssh 44071 1727204643.73000: Set connection var ansible_timeout to 10 44071 1727204643.73015: Set connection var ansible_pipelining to False 44071 1727204643.73025: Set connection var ansible_shell_type to sh 44071 1727204643.73036: Set connection var ansible_shell_executable to /bin/sh 44071 1727204643.73118: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204643.73121: variable 'ansible_shell_executable' from source: unknown 44071 1727204643.73124: variable 'ansible_connection' from source: unknown 44071 1727204643.73126: variable 'ansible_module_compression' from source: unknown 44071 1727204643.73128: variable 'ansible_shell_type' from source: unknown 44071 1727204643.73130: variable 'ansible_shell_executable' from source: unknown 44071 1727204643.73132: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.73137: variable 'ansible_pipelining' from source: unknown 44071 1727204643.73139: variable 'ansible_timeout' from source: unknown 44071 1727204643.73141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.73292: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204643.73309: variable 'omit' from source: magic vars 44071 1727204643.73320: starting attempt loop 44071 1727204643.73327: running the handler 44071 1727204643.73388: variable 'lsr_description' from source: include params 44071 1727204643.73482: variable 'lsr_description' from source: include params 44071 1727204643.73501: handler run complete 44071 1727204643.73559: attempt loop complete, returning result 44071 1727204643.73577: variable 'item' from source: unknown 44071 1727204643.73820: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can remove an existing profile without taking it down" } 44071 1727204643.74339: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.74342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.74345: variable 'omit' from source: magic vars 44071 1727204643.74460: variable 'ansible_distribution_major_version' from source: facts 44071 1727204643.74464: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204643.74468: variable 'omit' from source: magic vars 44071 1727204643.74470: variable 'omit' from source: magic vars 44071 1727204643.74472: variable 'item' from source: unknown 44071 1727204643.74550: variable 'item' from source: unknown 44071 1727204643.74577: variable 'omit' from source: magic vars 44071 1727204643.74603: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204643.74617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204643.74629: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204643.74651: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204643.74658: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.74668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.74755: Set connection var ansible_connection to ssh 44071 1727204643.74767: Set connection var ansible_timeout to 10 44071 1727204643.74779: Set connection var ansible_pipelining to False 44071 1727204643.74791: Set connection var ansible_shell_type to sh 44071 1727204643.74800: Set connection var ansible_shell_executable to /bin/sh 44071 1727204643.74894: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204643.74898: variable 'ansible_shell_executable' from source: unknown 44071 1727204643.74900: variable 'ansible_connection' from source: unknown 44071 1727204643.74903: variable 'ansible_module_compression' from source: unknown 44071 1727204643.74905: variable 'ansible_shell_type' from source: unknown 44071 1727204643.74907: variable 'ansible_shell_executable' from source: unknown 44071 1727204643.74909: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.74911: variable 'ansible_pipelining' from source: unknown 44071 1727204643.74913: variable 'ansible_timeout' from source: unknown 44071 1727204643.74915: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.74992: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204643.75011: variable 'omit' from source: magic vars 44071 1727204643.75020: starting attempt loop 44071 1727204643.75026: running the handler 44071 1727204643.75057: variable 'lsr_setup' from source: include params 44071 1727204643.75151: variable 'lsr_setup' from source: include params 44071 1727204643.75207: handler run complete 44071 1727204643.75232: attempt loop complete, returning result 44071 1727204643.75256: variable 'item' from source: unknown 44071 1727204643.75345: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ "tasks/create_bridge_profile.yml", "tasks/activate_profile.yml" ] } 44071 1727204643.75657: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.75660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.75663: variable 'omit' from source: magic vars 44071 1727204643.75749: variable 'ansible_distribution_major_version' from source: facts 44071 1727204643.75762: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204643.75777: variable 'omit' from source: magic vars 44071 1727204643.75796: variable 'omit' from source: magic vars 44071 1727204643.75849: variable 'item' from source: unknown 44071 1727204643.75983: variable 'item' from source: unknown 44071 1727204643.75986: variable 'omit' from source: magic vars 44071 1727204643.75992: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204643.76005: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204643.76017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204643.76037: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204643.76044: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.76051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.76381: Set connection var ansible_connection to ssh 44071 1727204643.76386: Set connection var ansible_timeout to 10 44071 1727204643.76389: Set connection var ansible_pipelining to False 44071 1727204643.76391: Set connection var ansible_shell_type to sh 44071 1727204643.76394: Set connection var ansible_shell_executable to /bin/sh 44071 1727204643.76396: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204643.76398: variable 'ansible_shell_executable' from source: unknown 44071 1727204643.76400: variable 'ansible_connection' from source: unknown 44071 1727204643.76402: variable 'ansible_module_compression' from source: unknown 44071 1727204643.76404: variable 'ansible_shell_type' from source: unknown 44071 1727204643.76406: variable 'ansible_shell_executable' from source: unknown 44071 1727204643.76408: variable 'ansible_host' from source: host vars for 
'managed-node2' 44071 1727204643.76410: variable 'ansible_pipelining' from source: unknown 44071 1727204643.76413: variable 'ansible_timeout' from source: unknown 44071 1727204643.76415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.76432: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204643.76448: variable 'omit' from source: magic vars 44071 1727204643.76456: starting attempt loop 44071 1727204643.76463: running the handler 44071 1727204643.76491: variable 'lsr_test' from source: include params 44071 1727204643.76582: variable 'lsr_test' from source: include params 44071 1727204643.76607: handler run complete 44071 1727204643.76626: attempt loop complete, returning result 44071 1727204643.76656: variable 'item' from source: unknown 44071 1727204643.76732: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/remove_profile.yml" ] } 44071 1727204643.77073: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.77076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.77079: variable 'omit' from source: magic vars 44071 1727204643.77141: variable 'ansible_distribution_major_version' from source: facts 44071 1727204643.77154: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204643.77164: variable 'omit' from source: magic vars 44071 1727204643.77191: variable 'omit' from source: magic vars 44071 1727204643.77240: variable 'item' from source: unknown 44071 1727204643.77320: variable 'item' from source: unknown 44071 1727204643.77343: variable 'omit' from source: magic vars 44071 1727204643.77371: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204643.77383: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204643.77392: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204643.77414: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204643.77420: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.77426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.77512: Set connection var ansible_connection to ssh 44071 1727204643.77570: Set connection var ansible_timeout to 10 44071 1727204643.77573: Set connection var ansible_pipelining to False 44071 1727204643.77575: Set connection var ansible_shell_type to sh 44071 1727204643.77577: Set connection var ansible_shell_executable to /bin/sh 44071 1727204643.77579: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204643.77581: variable 'ansible_shell_executable' from source: unknown 44071 1727204643.77587: variable 'ansible_connection' from source: unknown 44071 1727204643.77592: variable 'ansible_module_compression' from source: unknown 44071 1727204643.77597: variable 'ansible_shell_type' from source: unknown 44071 
1727204643.77602: variable 'ansible_shell_executable' from source: unknown 44071 1727204643.77607: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.77619: variable 'ansible_pipelining' from source: unknown 44071 1727204643.77625: variable 'ansible_timeout' from source: unknown 44071 1727204643.77631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.78054: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204643.78058: variable 'omit' from source: magic vars 44071 1727204643.78061: starting attempt loop 44071 1727204643.78063: running the handler 44071 1727204643.78067: variable 'lsr_assert' from source: include params 44071 1727204643.78104: variable 'lsr_assert' from source: include params 44071 1727204643.78132: handler run complete 44071 1727204643.78179: attempt loop complete, returning result 44071 1727204643.78473: variable 'item' from source: unknown 44071 1727204643.78478: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_device_present.yml", "tasks/assert_profile_absent.yml" ] } 44071 1727204643.78871: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.78875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.78877: variable 'omit' from source: magic vars 44071 1727204643.79251: variable 'ansible_distribution_major_version' from source: facts 44071 1727204643.79371: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204643.79381: variable 'omit' from source: magic vars 44071 1727204643.79406: variable 'omit' from source: magic vars 44071 1727204643.79460: variable 'item' from source: unknown 44071 1727204643.79683: variable 'item' from source: unknown 44071 1727204643.79706: variable 'omit' from source: magic vars 44071 1727204643.79900: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204643.79905: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204643.79907: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204643.79909: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204643.79912: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.79914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.80038: Set connection var ansible_connection to ssh 44071 1727204643.80226: Set connection var ansible_timeout to 10 44071 1727204643.80229: Set connection var ansible_pipelining to False 44071 1727204643.80231: Set connection var ansible_shell_type to sh 44071 1727204643.80236: Set connection var ansible_shell_executable to /bin/sh 44071 1727204643.80238: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204643.80240: variable 'ansible_shell_executable' from source: unknown 44071 1727204643.80242: variable 
'ansible_connection' from source: unknown 44071 1727204643.80244: variable 'ansible_module_compression' from source: unknown 44071 1727204643.80246: variable 'ansible_shell_type' from source: unknown 44071 1727204643.80248: variable 'ansible_shell_executable' from source: unknown 44071 1727204643.80250: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.80251: variable 'ansible_pipelining' from source: unknown 44071 1727204643.80253: variable 'ansible_timeout' from source: unknown 44071 1727204643.80255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.80554: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204643.80573: variable 'omit' from source: magic vars 44071 1727204643.80583: starting attempt loop 44071 1727204643.80589: running the handler 44071 1727204643.80842: handler run complete 44071 1727204643.80900: attempt loop complete, returning result 44071 1727204643.80924: variable 'item' from source: unknown 44071 1727204643.81069: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 44071 1727204643.81491: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.81494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.81497: variable 'omit' from source: magic vars 44071 1727204643.81672: variable 'ansible_distribution_major_version' from source: facts 44071 1727204643.81675: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204643.81678: variable 'omit' from source: magic vars 44071 1727204643.81680: variable 'omit' from source: magic vars 44071 1727204643.81709: variable 'item' from source: unknown 44071 1727204643.81796: variable 'item' from source: unknown 44071 1727204643.81818: variable 'omit' from source: magic vars 44071 1727204643.81856: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204643.81943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204643.81946: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204643.81949: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204643.81951: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.81953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.82010: Set connection var ansible_connection to ssh 44071 1727204643.82024: Set connection var ansible_timeout to 10 44071 1727204643.82038: Set connection var ansible_pipelining to False 44071 1727204643.82058: Set connection var ansible_shell_type to sh 44071 1727204643.82071: Set connection var ansible_shell_executable to /bin/sh 44071 1727204643.82085: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204643.82113: variable 'ansible_shell_executable' 
from source: unknown 44071 1727204643.82122: variable 'ansible_connection' from source: unknown 44071 1727204643.82129: variable 'ansible_module_compression' from source: unknown 44071 1727204643.82140: variable 'ansible_shell_type' from source: unknown 44071 1727204643.82158: variable 'ansible_shell_executable' from source: unknown 44071 1727204643.82161: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.82268: variable 'ansible_pipelining' from source: unknown 44071 1727204643.82273: variable 'ansible_timeout' from source: unknown 44071 1727204643.82275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.82307: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204643.82322: variable 'omit' from source: magic vars 44071 1727204643.82331: starting attempt loop 44071 1727204643.82343: running the handler 44071 1727204643.82376: variable 'lsr_fail_debug' from source: play vars 44071 1727204643.82462: variable 'lsr_fail_debug' from source: play vars 44071 1727204643.82494: handler run complete 44071 1727204643.82512: attempt loop complete, returning result 44071 1727204643.82532: variable 'item' from source: unknown 44071 1727204643.82614: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 44071 1727204643.82806: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.82820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.82838: variable 'omit' from source: magic vars 44071 1727204643.83131: variable 'ansible_distribution_major_version' from source: facts 44071 1727204643.83136: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204643.83138: variable 'omit' from source: magic vars 44071 1727204643.83140: variable 'omit' from source: magic vars 44071 1727204643.83142: variable 'item' from source: unknown 44071 1727204643.83184: variable 'item' from source: unknown 44071 1727204643.83203: variable 'omit' from source: magic vars 44071 1727204643.83226: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204643.83253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204643.83283: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204643.83302: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204643.83310: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.83318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.83408: Set connection var ansible_connection to ssh 44071 1727204643.83418: Set connection var ansible_timeout to 10 44071 1727204643.83426: Set connection var ansible_pipelining to False 44071 1727204643.83438: Set connection var ansible_shell_type to sh 44071 1727204643.83475: Set connection var 
ansible_shell_executable to /bin/sh 44071 1727204643.83488: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204643.83514: variable 'ansible_shell_executable' from source: unknown 44071 1727204643.83522: variable 'ansible_connection' from source: unknown 44071 1727204643.83529: variable 'ansible_module_compression' from source: unknown 44071 1727204643.83564: variable 'ansible_shell_type' from source: unknown 44071 1727204643.83569: variable 'ansible_shell_executable' from source: unknown 44071 1727204643.83572: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.83574: variable 'ansible_pipelining' from source: unknown 44071 1727204643.83576: variable 'ansible_timeout' from source: unknown 44071 1727204643.83578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.83786: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204643.83789: variable 'omit' from source: magic vars 44071 1727204643.83791: starting attempt loop 44071 1727204643.83794: running the handler 44071 1727204643.83796: variable 'lsr_cleanup' from source: include params 44071 1727204643.83826: variable 'lsr_cleanup' from source: include params 44071 1727204643.83854: handler run complete 44071 1727204643.83877: attempt loop complete, returning result 44071 1727204643.83906: variable 'item' from source: unknown 44071 1727204643.83983: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 44071 1727204643.84172: dumping result to json 44071 1727204643.84176: done dumping result, returning 44071 1727204643.84178: done running TaskExecutor() for managed-node2/TASK: Show item [127b8e07-fff9-c964-7471-000000001006] 44071 1727204643.84181: sending task result for task 127b8e07-fff9-c964-7471-000000001006 44071 1727204643.84279: done sending task result for task 127b8e07-fff9-c964-7471-000000001006 44071 1727204643.84282: WORKER PROCESS EXITING 44071 1727204643.84384: no more pending results, returning what we have 44071 1727204643.84388: results queue empty 44071 1727204643.84389: checking for any_errors_fatal 44071 1727204643.84401: done checking for any_errors_fatal 44071 1727204643.84402: checking for max_fail_percentage 44071 1727204643.84403: done checking for max_fail_percentage 44071 1727204643.84404: checking to see if all hosts have failed and the running result is not ok 44071 1727204643.84405: done checking to see if all hosts have failed 44071 1727204643.84406: getting the remaining hosts for this loop 44071 1727204643.84407: done getting the remaining hosts for this loop 44071 1727204643.84413: getting the next task for host managed-node2 44071 1727204643.84421: done getting next task for host managed-node2 44071 1727204643.84424: ^ task is: TASK: Include the task 'show_interfaces.yml' 44071 1727204643.84427: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204643.84431: getting variables 44071 1727204643.84435: in VariableManager get_vars() 44071 1727204643.84476: Calling all_inventory to load vars for managed-node2 44071 1727204643.84480: Calling groups_inventory to load vars for managed-node2 44071 1727204643.84484: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204643.84499: Calling all_plugins_play to load vars for managed-node2 44071 1727204643.84503: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204643.84507: Calling groups_plugins_play to load vars for managed-node2 44071 1727204643.86864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204643.90228: done with get_vars() 44071 1727204643.90276: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Tuesday 24 September 2024 15:04:03 -0400 (0:00:00.200) 0:00:56.222 ***** 44071 1727204643.90564: entering _queue_task() for managed-node2/include_tasks 44071 1727204643.91547: worker is 1 (out of 1 available) 44071 1727204643.91563: exiting _queue_task() for managed-node2/include_tasks 44071 1727204643.91580: done queuing things up, now waiting for results queue to drain 44071 1727204643.91582: waiting for pending results... 
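The include being queued here (run_test.yml:21) pulls in a nested chain of helper task files: show_interfaces.yml, which in turn includes get_current_interfaces.yml, as the log shows below. Only the file paths and task names come from the log; the body of show_interfaces.yml is an assumed minimal structure, and the current_interfaces variable name is a guess.

# tasks/show_interfaces.yml (assumed structure; only the include at line 3 is confirmed by the log)
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml

- name: Show current_interfaces        # assumed follow-up task, not visible in this excerpt
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"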
44071 1727204643.92086: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 44071 1727204643.92289: in run() - task 127b8e07-fff9-c964-7471-000000001007 44071 1727204643.92375: variable 'ansible_search_path' from source: unknown 44071 1727204643.92379: variable 'ansible_search_path' from source: unknown 44071 1727204643.92467: calling self._execute() 44071 1727204643.92646: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204643.92651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204643.92665: variable 'omit' from source: magic vars 44071 1727204643.93625: variable 'ansible_distribution_major_version' from source: facts 44071 1727204643.93702: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204643.93706: _execute() done 44071 1727204643.93709: dumping result to json 44071 1727204643.93712: done dumping result, returning 44071 1727204643.93715: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [127b8e07-fff9-c964-7471-000000001007] 44071 1727204643.93717: sending task result for task 127b8e07-fff9-c964-7471-000000001007 44071 1727204643.93890: done sending task result for task 127b8e07-fff9-c964-7471-000000001007 44071 1727204643.93894: WORKER PROCESS EXITING 44071 1727204643.93946: no more pending results, returning what we have 44071 1727204643.93952: in VariableManager get_vars() 44071 1727204643.94000: Calling all_inventory to load vars for managed-node2 44071 1727204643.94003: Calling groups_inventory to load vars for managed-node2 44071 1727204643.94007: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204643.94138: Calling all_plugins_play to load vars for managed-node2 44071 1727204643.94143: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204643.94148: Calling groups_plugins_play to load vars for managed-node2 44071 1727204643.97924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204644.01748: done with get_vars() 44071 1727204644.01781: variable 'ansible_search_path' from source: unknown 44071 1727204644.01783: variable 'ansible_search_path' from source: unknown 44071 1727204644.01830: we have included files to process 44071 1727204644.01831: generating all_blocks data 44071 1727204644.01834: done generating all_blocks data 44071 1727204644.01841: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44071 1727204644.01842: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44071 1727204644.01845: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44071 1727204644.01960: in VariableManager get_vars() 44071 1727204644.01984: done with get_vars() 44071 1727204644.02157: done processing included file 44071 1727204644.02160: iterating over new_blocks loaded from include file 44071 1727204644.02161: in VariableManager get_vars() 44071 1727204644.02291: done with get_vars() 44071 1727204644.02293: filtering new block on tags 44071 1727204644.02334: done filtering new block on tags 44071 1727204644.02338: done iterating over new_blocks loaded from include file included: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 44071 1727204644.02343: extending task lists for all hosts with included blocks 44071 1727204644.03102: done extending task lists 44071 1727204644.03105: done processing included files 44071 1727204644.03105: results queue empty 44071 1727204644.03106: checking for any_errors_fatal 44071 1727204644.03114: done checking for any_errors_fatal 44071 1727204644.03115: checking for max_fail_percentage 44071 1727204644.03116: done checking for max_fail_percentage 44071 1727204644.03117: checking to see if all hosts have failed and the running result is not ok 44071 1727204644.03118: done checking to see if all hosts have failed 44071 1727204644.03119: getting the remaining hosts for this loop 44071 1727204644.03120: done getting the remaining hosts for this loop 44071 1727204644.03123: getting the next task for host managed-node2 44071 1727204644.03128: done getting next task for host managed-node2 44071 1727204644.03130: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 44071 1727204644.03133: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204644.03136: getting variables 44071 1727204644.03137: in VariableManager get_vars() 44071 1727204644.03152: Calling all_inventory to load vars for managed-node2 44071 1727204644.03155: Calling groups_inventory to load vars for managed-node2 44071 1727204644.03157: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204644.03164: Calling all_plugins_play to load vars for managed-node2 44071 1727204644.03168: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204644.03172: Calling groups_plugins_play to load vars for managed-node2 44071 1727204644.04955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204644.07867: done with get_vars() 44071 1727204644.08117: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:04:04 -0400 (0:00:00.176) 0:00:56.398 ***** 44071 1727204644.08212: entering _queue_task() for managed-node2/include_tasks 44071 1727204644.09038: worker is 1 (out of 1 available) 44071 1727204644.09053: exiting _queue_task() for managed-node2/include_tasks 44071 1727204644.09274: done queuing things up, now waiting for results queue to drain 44071 1727204644.09276: waiting for pending results... 
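This include loads get_current_interfaces.yml, whose first task, 'Gather current interface info', appears at the very end of this excerpt. The commands it actually runs are not visible in this part of the log, so the sketch below is purely illustrative: one plausible way to gather interface names on the managed node and expose them as a fact (the register and fact names are invented for the example).

# tasks/get_current_interfaces.yml (illustrative only; real contents not shown in this log excerpt)
- name: Gather current interface info
  command: ls -1 /sys/class/net
  register: _current_interfaces_output   # hypothetical register name
  changed_when: false

- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces_output.stdout_lines }}"   # hypothetical fact name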
44071 1727204644.09929: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 44071 1727204644.10175: in run() - task 127b8e07-fff9-c964-7471-00000000102e 44071 1727204644.10190: variable 'ansible_search_path' from source: unknown 44071 1727204644.10209: variable 'ansible_search_path' from source: unknown 44071 1727204644.10313: calling self._execute() 44071 1727204644.10574: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204644.10578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204644.10581: variable 'omit' from source: magic vars 44071 1727204644.12157: variable 'ansible_distribution_major_version' from source: facts 44071 1727204644.12180: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204644.12192: _execute() done 44071 1727204644.12201: dumping result to json 44071 1727204644.12250: done dumping result, returning 44071 1727204644.12264: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [127b8e07-fff9-c964-7471-00000000102e] 44071 1727204644.12276: sending task result for task 127b8e07-fff9-c964-7471-00000000102e 44071 1727204644.12434: no more pending results, returning what we have 44071 1727204644.12440: in VariableManager get_vars() 44071 1727204644.12491: Calling all_inventory to load vars for managed-node2 44071 1727204644.12494: Calling groups_inventory to load vars for managed-node2 44071 1727204644.12498: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204644.12514: Calling all_plugins_play to load vars for managed-node2 44071 1727204644.12516: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204644.12519: Calling groups_plugins_play to load vars for managed-node2 44071 1727204644.13186: done sending task result for task 127b8e07-fff9-c964-7471-00000000102e 44071 1727204644.13192: WORKER PROCESS EXITING 44071 1727204644.16426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204644.20837: done with get_vars() 44071 1727204644.20879: variable 'ansible_search_path' from source: unknown 44071 1727204644.20881: variable 'ansible_search_path' from source: unknown 44071 1727204644.20925: we have included files to process 44071 1727204644.20926: generating all_blocks data 44071 1727204644.20928: done generating all_blocks data 44071 1727204644.20929: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44071 1727204644.20930: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44071 1727204644.20933: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44071 1727204644.21652: done processing included file 44071 1727204644.21655: iterating over new_blocks loaded from include file 44071 1727204644.21657: in VariableManager get_vars() 44071 1727204644.21681: done with get_vars() 44071 1727204644.21683: filtering new block on tags 44071 1727204644.21725: done filtering new block on tags 44071 1727204644.21727: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed-node2 44071 1727204644.21733: extending task lists for all hosts with included blocks 44071 1727204644.22124: done extending task lists 44071 1727204644.22126: done processing included files 44071 1727204644.22126: results queue empty 44071 1727204644.22127: checking for any_errors_fatal 44071 1727204644.22131: done checking for any_errors_fatal 44071 1727204644.22131: checking for max_fail_percentage 44071 1727204644.22133: done checking for max_fail_percentage 44071 1727204644.22134: checking to see if all hosts have failed and the running result is not ok 44071 1727204644.22134: done checking to see if all hosts have failed 44071 1727204644.22135: getting the remaining hosts for this loop 44071 1727204644.22137: done getting the remaining hosts for this loop 44071 1727204644.22139: getting the next task for host managed-node2 44071 1727204644.22144: done getting next task for host managed-node2 44071 1727204644.22147: ^ task is: TASK: Gather current interface info 44071 1727204644.22151: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204644.22153: getting variables 44071 1727204644.22154: in VariableManager get_vars() 44071 1727204644.22373: Calling all_inventory to load vars for managed-node2 44071 1727204644.22376: Calling groups_inventory to load vars for managed-node2 44071 1727204644.22379: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204644.22387: Calling all_plugins_play to load vars for managed-node2 44071 1727204644.22390: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204644.22393: Calling groups_plugins_play to load vars for managed-node2 44071 1727204644.25602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204644.43995: done with get_vars() 44071 1727204644.44039: done getting variables 44071 1727204644.44095: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:04:04 -0400 (0:00:00.359) 0:00:56.757 ***** 44071 1727204644.44129: entering _queue_task() for managed-node2/command 44071 1727204644.44948: worker is 1 (out of 1 available) 44071 1727204644.45168: exiting _queue_task() for managed-node2/command 44071 1727204644.45184: done queuing things up, now waiting for results queue to drain 44071 1727204644.45187: waiting for pending results... 
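[Editor's sketch] The TASK header above corresponds to get_current_interfaces.yml:3. The module_args recorded further down in this trace (chdir=/sys/class/net, _raw_params="ls -1", _uses_shell=false) identify it as a plain command-module call that lists the kernel's network interface directory. A minimal sketch of such a task follows; the register name is an assumption inferred from the '_current_interfaces' variable the next task reads, so treat this as a reconstruction rather than the verbatim file.

    # get_current_interfaces.yml:3 (sketch inferred from the module_args below)
    - name: Gather current interface info
      command:
        cmd: ls -1
        chdir: /sys/class/net
      register: _current_interfaces   # assumed name, based on the later Set current_interfaces task

Using the command module rather than shell (which is what _uses_shell=false indicates in the module_args) avoids invoking a shell on the target; the task only needs the directory listing of /sys/class/net.
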
44071 1727204644.45651: running TaskExecutor() for managed-node2/TASK: Gather current interface info 44071 1727204644.46178: in run() - task 127b8e07-fff9-c964-7471-000000001069 44071 1727204644.46184: variable 'ansible_search_path' from source: unknown 44071 1727204644.46188: variable 'ansible_search_path' from source: unknown 44071 1727204644.46197: calling self._execute() 44071 1727204644.46480: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204644.46505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204644.46523: variable 'omit' from source: magic vars 44071 1727204644.47534: variable 'ansible_distribution_major_version' from source: facts 44071 1727204644.47538: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204644.47541: variable 'omit' from source: magic vars 44071 1727204644.47654: variable 'omit' from source: magic vars 44071 1727204644.47877: variable 'omit' from source: magic vars 44071 1727204644.47924: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204644.47976: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204644.48104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204644.48136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204644.48253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204644.48362: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204644.48367: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204644.48370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204644.48641: Set connection var ansible_connection to ssh 44071 1727204644.48663: Set connection var ansible_timeout to 10 44071 1727204644.48677: Set connection var ansible_pipelining to False 44071 1727204644.48701: Set connection var ansible_shell_type to sh 44071 1727204644.48801: Set connection var ansible_shell_executable to /bin/sh 44071 1727204644.48805: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204644.48871: variable 'ansible_shell_executable' from source: unknown 44071 1727204644.48875: variable 'ansible_connection' from source: unknown 44071 1727204644.48878: variable 'ansible_module_compression' from source: unknown 44071 1727204644.48886: variable 'ansible_shell_type' from source: unknown 44071 1727204644.48894: variable 'ansible_shell_executable' from source: unknown 44071 1727204644.48904: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204644.48916: variable 'ansible_pipelining' from source: unknown 44071 1727204644.49021: variable 'ansible_timeout' from source: unknown 44071 1727204644.49025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204644.49320: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204644.49458: variable 'omit' from source: magic vars 44071 
1727204644.49461: starting attempt loop 44071 1727204644.49464: running the handler 44071 1727204644.49470: _low_level_execute_command(): starting 44071 1727204644.49566: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204644.51101: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204644.51121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204644.51287: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204644.51322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204644.51390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204644.51484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204644.53252: stdout chunk (state=3): >>>/root <<< 44071 1727204644.53445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204644.53521: stderr chunk (state=3): >>><<< 44071 1727204644.53555: stdout chunk (state=3): >>><<< 44071 1727204644.53656: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204644.53681: _low_level_execute_command(): starting 44071 1727204644.53688: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204644.536597-47233-143986741160089 `" && echo 
ansible-tmp-1727204644.536597-47233-143986741160089="` echo /root/.ansible/tmp/ansible-tmp-1727204644.536597-47233-143986741160089 `" ) && sleep 0' 44071 1727204644.54383: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204644.54416: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204644.54420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204644.54423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204644.54526: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204644.54530: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204644.54546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204644.54549: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204644.54552: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204644.54572: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204644.54671: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204644.54677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204644.54851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204644.56900: stdout chunk (state=3): >>>ansible-tmp-1727204644.536597-47233-143986741160089=/root/.ansible/tmp/ansible-tmp-1727204644.536597-47233-143986741160089 <<< 44071 1727204644.57027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204644.57034: stdout chunk (state=3): >>><<< 44071 1727204644.57047: stderr chunk (state=3): >>><<< 44071 1727204644.57075: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204644.536597-47233-143986741160089=/root/.ansible/tmp/ansible-tmp-1727204644.536597-47233-143986741160089 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204644.57172: variable 'ansible_module_compression' from source: unknown 44071 1727204644.57393: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44071 1727204644.57440: variable 'ansible_facts' from source: unknown 44071 1727204644.57726: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204644.536597-47233-143986741160089/AnsiballZ_command.py 44071 1727204644.58152: Sending initial data 44071 1727204644.58155: Sent initial data (155 bytes) 44071 1727204644.59912: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204644.60003: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204644.60146: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204644.60150: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204644.60177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204644.61801: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 44071 1727204644.61843: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204644.61920: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204644.62019: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpd_l6csjf /root/.ansible/tmp/ansible-tmp-1727204644.536597-47233-143986741160089/AnsiballZ_command.py <<< 44071 1727204644.62029: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204644.536597-47233-143986741160089/AnsiballZ_command.py" <<< 44071 1727204644.62082: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpd_l6csjf" to remote "/root/.ansible/tmp/ansible-tmp-1727204644.536597-47233-143986741160089/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204644.536597-47233-143986741160089/AnsiballZ_command.py" <<< 44071 1727204644.63438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204644.63573: stderr chunk (state=3): >>><<< 44071 1727204644.63583: stdout chunk (state=3): >>><<< 44071 1727204644.63587: done transferring module to remote 44071 1727204644.63594: _low_level_execute_command(): starting 44071 1727204644.63597: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204644.536597-47233-143986741160089/ /root/.ansible/tmp/ansible-tmp-1727204644.536597-47233-143986741160089/AnsiballZ_command.py && sleep 0' 44071 1727204644.64075: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204644.64079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204644.64082: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204644.64089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204644.64135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204644.64140: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204644.64154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204644.64229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204644.74678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204644.74733: stderr chunk (state=3): >>><<< 44071 1727204644.74737: stdout chunk (state=3): >>><<< 44071 1727204644.74751: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204644.74754: _low_level_execute_command(): starting 44071 1727204644.74757: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204644.536597-47233-143986741160089/AnsiballZ_command.py && sleep 0' 44071 1727204644.75238: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204644.75242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204644.75278: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204644.75282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204644.75284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204644.75286: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204644.75345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204644.75349: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204644.75352: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204644.75434: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204644.92572: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:04:04.920367", "end": "2024-09-24 15:04:04.924246", "delta": "0:00:00.003879", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44071 1727204644.94152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204644.94217: stderr chunk (state=3): >>><<< 44071 1727204644.94221: stdout chunk (state=3): >>><<< 44071 1727204644.94240: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:04:04.920367", "end": "2024-09-24 15:04:04.924246", "delta": "0:00:00.003879", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
44071 1727204644.94282: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204644.536597-47233-143986741160089/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204644.94289: _low_level_execute_command(): starting 44071 1727204644.94295: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204644.536597-47233-143986741160089/ > /dev/null 2>&1 && sleep 0' 44071 1727204644.94769: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204644.94774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204644.94803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204644.94807: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204644.94809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204644.94874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204644.94878: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204644.94884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204644.94955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204644.96901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204644.96962: stderr chunk (state=3): >>><<< 44071 1727204644.96966: stdout chunk (state=3): >>><<< 44071 1727204644.96983: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204644.96990: handler run complete 44071 1727204644.97011: Evaluated conditional (False): False 44071 1727204644.97020: attempt loop complete, returning result 44071 1727204644.97023: _execute() done 44071 1727204644.97026: dumping result to json 44071 1727204644.97034: done dumping result, returning 44071 1727204644.97042: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [127b8e07-fff9-c964-7471-000000001069] 44071 1727204644.97047: sending task result for task 127b8e07-fff9-c964-7471-000000001069 44071 1727204644.97171: done sending task result for task 127b8e07-fff9-c964-7471-000000001069 44071 1727204644.97174: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003879", "end": "2024-09-24 15:04:04.924246", "rc": 0, "start": "2024-09-24 15:04:04.920367" } STDOUT: bonding_masters eth0 lo 44071 1727204644.97257: no more pending results, returning what we have 44071 1727204644.97259: results queue empty 44071 1727204644.97261: checking for any_errors_fatal 44071 1727204644.97262: done checking for any_errors_fatal 44071 1727204644.97263: checking for max_fail_percentage 44071 1727204644.97265: done checking for max_fail_percentage 44071 1727204644.97274: checking to see if all hosts have failed and the running result is not ok 44071 1727204644.97275: done checking to see if all hosts have failed 44071 1727204644.97276: getting the remaining hosts for this loop 44071 1727204644.97278: done getting the remaining hosts for this loop 44071 1727204644.97283: getting the next task for host managed-node2 44071 1727204644.97291: done getting next task for host managed-node2 44071 1727204644.97294: ^ task is: TASK: Set current_interfaces 44071 1727204644.97300: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204644.97304: getting variables 44071 1727204644.97305: in VariableManager get_vars() 44071 1727204644.97342: Calling all_inventory to load vars for managed-node2 44071 1727204644.97345: Calling groups_inventory to load vars for managed-node2 44071 1727204644.97348: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204644.97359: Calling all_plugins_play to load vars for managed-node2 44071 1727204644.97362: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204644.97368: Calling groups_plugins_play to load vars for managed-node2 44071 1727204644.98399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204644.99651: done with get_vars() 44071 1727204644.99684: done getting variables 44071 1727204644.99734: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:04:04 -0400 (0:00:00.556) 0:00:57.314 ***** 44071 1727204644.99760: entering _queue_task() for managed-node2/set_fact 44071 1727204645.00060: worker is 1 (out of 1 available) 44071 1727204645.00075: exiting _queue_task() for managed-node2/set_fact 44071 1727204645.00090: done queuing things up, now waiting for results queue to drain 44071 1727204645.00092: waiting for pending results... 
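[Editor's sketch] The Set current_interfaces task at get_current_interfaces.yml:9 turns the registered command output ("bonding_masters\neth0\nlo") into the list fact shown in the result below. The exact Jinja expression is not visible in this trace; stdout_lines is simply the most direct mapping that yields that three-element list, so the snippet is a plausible sketch, not the verbatim file.

    # get_current_interfaces.yml:9 (sketch; expression assumed)
    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"
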
44071 1727204645.00303: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 44071 1727204645.00415: in run() - task 127b8e07-fff9-c964-7471-00000000106a 44071 1727204645.00429: variable 'ansible_search_path' from source: unknown 44071 1727204645.00435: variable 'ansible_search_path' from source: unknown 44071 1727204645.00474: calling self._execute() 44071 1727204645.00572: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204645.00578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204645.00588: variable 'omit' from source: magic vars 44071 1727204645.00908: variable 'ansible_distribution_major_version' from source: facts 44071 1727204645.00919: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204645.00925: variable 'omit' from source: magic vars 44071 1727204645.00974: variable 'omit' from source: magic vars 44071 1727204645.01063: variable '_current_interfaces' from source: set_fact 44071 1727204645.01117: variable 'omit' from source: magic vars 44071 1727204645.01155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204645.01187: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204645.01207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204645.01222: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204645.01233: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204645.01261: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204645.01267: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204645.01270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204645.01349: Set connection var ansible_connection to ssh 44071 1727204645.01354: Set connection var ansible_timeout to 10 44071 1727204645.01360: Set connection var ansible_pipelining to False 44071 1727204645.01367: Set connection var ansible_shell_type to sh 44071 1727204645.01373: Set connection var ansible_shell_executable to /bin/sh 44071 1727204645.01380: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204645.01398: variable 'ansible_shell_executable' from source: unknown 44071 1727204645.01402: variable 'ansible_connection' from source: unknown 44071 1727204645.01405: variable 'ansible_module_compression' from source: unknown 44071 1727204645.01408: variable 'ansible_shell_type' from source: unknown 44071 1727204645.01413: variable 'ansible_shell_executable' from source: unknown 44071 1727204645.01415: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204645.01418: variable 'ansible_pipelining' from source: unknown 44071 1727204645.01421: variable 'ansible_timeout' from source: unknown 44071 1727204645.01423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204645.01546: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 44071 1727204645.01550: variable 'omit' from source: magic vars 44071 1727204645.01557: starting attempt loop 44071 1727204645.01561: running the handler 44071 1727204645.01574: handler run complete 44071 1727204645.01583: attempt loop complete, returning result 44071 1727204645.01586: _execute() done 44071 1727204645.01588: dumping result to json 44071 1727204645.01593: done dumping result, returning 44071 1727204645.01600: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [127b8e07-fff9-c964-7471-00000000106a] 44071 1727204645.01604: sending task result for task 127b8e07-fff9-c964-7471-00000000106a ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 44071 1727204645.01774: no more pending results, returning what we have 44071 1727204645.01777: results queue empty 44071 1727204645.01778: checking for any_errors_fatal 44071 1727204645.01792: done checking for any_errors_fatal 44071 1727204645.01792: checking for max_fail_percentage 44071 1727204645.01794: done checking for max_fail_percentage 44071 1727204645.01795: checking to see if all hosts have failed and the running result is not ok 44071 1727204645.01796: done checking to see if all hosts have failed 44071 1727204645.01797: getting the remaining hosts for this loop 44071 1727204645.01799: done getting the remaining hosts for this loop 44071 1727204645.01803: getting the next task for host managed-node2 44071 1727204645.01812: done getting next task for host managed-node2 44071 1727204645.01814: ^ task is: TASK: Show current_interfaces 44071 1727204645.01818: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204645.01822: getting variables 44071 1727204645.01824: in VariableManager get_vars() 44071 1727204645.01859: Calling all_inventory to load vars for managed-node2 44071 1727204645.01862: Calling groups_inventory to load vars for managed-node2 44071 1727204645.01873: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204645.01880: done sending task result for task 127b8e07-fff9-c964-7471-00000000106a 44071 1727204645.01882: WORKER PROCESS EXITING 44071 1727204645.01893: Calling all_plugins_play to load vars for managed-node2 44071 1727204645.01896: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204645.01899: Calling groups_plugins_play to load vars for managed-node2 44071 1727204645.03080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204645.04860: done with get_vars() 44071 1727204645.04893: done getting variables 44071 1727204645.04944: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:04:05 -0400 (0:00:00.052) 0:00:57.366 ***** 44071 1727204645.04972: entering _queue_task() for managed-node2/debug 44071 1727204645.05267: worker is 1 (out of 1 available) 44071 1727204645.05284: exiting _queue_task() for managed-node2/debug 44071 1727204645.05298: done queuing things up, now waiting for results queue to drain 44071 1727204645.05300: waiting for pending results... 
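[Editor's sketch] The Show current_interfaces task at show_interfaces.yml:5 is a debug call whose rendered message appears in the result below ("current_interfaces: ['bonding_masters', 'eth0', 'lo']"). A sketch matching that output format; the exact msg template is assumed from the rendered text, not copied from the source file.

    # show_interfaces.yml:5 (sketch matching the MSG text in the result below)
    - name: Show current_interfaces
      debug:
        msg: "current_interfaces: {{ current_interfaces }}"
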
44071 1727204645.05531: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 44071 1727204645.05641: in run() - task 127b8e07-fff9-c964-7471-00000000102f 44071 1727204645.05657: variable 'ansible_search_path' from source: unknown 44071 1727204645.05661: variable 'ansible_search_path' from source: unknown 44071 1727204645.05701: calling self._execute() 44071 1727204645.05800: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204645.05807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204645.05816: variable 'omit' from source: magic vars 44071 1727204645.06135: variable 'ansible_distribution_major_version' from source: facts 44071 1727204645.06149: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204645.06155: variable 'omit' from source: magic vars 44071 1727204645.06195: variable 'omit' from source: magic vars 44071 1727204645.06274: variable 'current_interfaces' from source: set_fact 44071 1727204645.06300: variable 'omit' from source: magic vars 44071 1727204645.06341: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204645.06372: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204645.06391: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204645.06406: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204645.06421: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204645.06448: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204645.06451: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204645.06455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204645.06529: Set connection var ansible_connection to ssh 44071 1727204645.06539: Set connection var ansible_timeout to 10 44071 1727204645.06545: Set connection var ansible_pipelining to False 44071 1727204645.06550: Set connection var ansible_shell_type to sh 44071 1727204645.06556: Set connection var ansible_shell_executable to /bin/sh 44071 1727204645.06564: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204645.06584: variable 'ansible_shell_executable' from source: unknown 44071 1727204645.06588: variable 'ansible_connection' from source: unknown 44071 1727204645.06590: variable 'ansible_module_compression' from source: unknown 44071 1727204645.06593: variable 'ansible_shell_type' from source: unknown 44071 1727204645.06595: variable 'ansible_shell_executable' from source: unknown 44071 1727204645.06598: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204645.06602: variable 'ansible_pipelining' from source: unknown 44071 1727204645.06605: variable 'ansible_timeout' from source: unknown 44071 1727204645.06609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204645.06725: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 
44071 1727204645.06737: variable 'omit' from source: magic vars 44071 1727204645.06742: starting attempt loop 44071 1727204645.06745: running the handler 44071 1727204645.06790: handler run complete 44071 1727204645.06802: attempt loop complete, returning result 44071 1727204645.06805: _execute() done 44071 1727204645.06807: dumping result to json 44071 1727204645.06810: done dumping result, returning 44071 1727204645.06819: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [127b8e07-fff9-c964-7471-00000000102f] 44071 1727204645.06822: sending task result for task 127b8e07-fff9-c964-7471-00000000102f 44071 1727204645.06917: done sending task result for task 127b8e07-fff9-c964-7471-00000000102f 44071 1727204645.06920: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 44071 1727204645.06973: no more pending results, returning what we have 44071 1727204645.06976: results queue empty 44071 1727204645.06977: checking for any_errors_fatal 44071 1727204645.06988: done checking for any_errors_fatal 44071 1727204645.06988: checking for max_fail_percentage 44071 1727204645.06990: done checking for max_fail_percentage 44071 1727204645.06991: checking to see if all hosts have failed and the running result is not ok 44071 1727204645.06992: done checking to see if all hosts have failed 44071 1727204645.06992: getting the remaining hosts for this loop 44071 1727204645.06994: done getting the remaining hosts for this loop 44071 1727204645.06999: getting the next task for host managed-node2 44071 1727204645.07007: done getting next task for host managed-node2 44071 1727204645.07011: ^ task is: TASK: Setup 44071 1727204645.07014: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204645.07018: getting variables 44071 1727204645.07020: in VariableManager get_vars() 44071 1727204645.07058: Calling all_inventory to load vars for managed-node2 44071 1727204645.07061: Calling groups_inventory to load vars for managed-node2 44071 1727204645.07064: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204645.07084: Calling all_plugins_play to load vars for managed-node2 44071 1727204645.07088: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204645.07090: Calling groups_plugins_play to load vars for managed-node2 44071 1727204645.08325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204645.10412: done with get_vars() 44071 1727204645.10453: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Tuesday 24 September 2024 15:04:05 -0400 (0:00:00.055) 0:00:57.422 ***** 44071 1727204645.10562: entering _queue_task() for managed-node2/include_tasks 44071 1727204645.10953: worker is 1 (out of 1 available) 44071 1727204645.11170: exiting _queue_task() for managed-node2/include_tasks 44071 1727204645.11182: done queuing things up, now waiting for results queue to drain 44071 1727204645.11184: waiting for pending results... 44071 1727204645.11498: running TaskExecutor() for managed-node2/TASK: Setup 44071 1727204645.11503: in run() - task 127b8e07-fff9-c964-7471-000000001008 44071 1727204645.11506: variable 'ansible_search_path' from source: unknown 44071 1727204645.11509: variable 'ansible_search_path' from source: unknown 44071 1727204645.11517: variable 'lsr_setup' from source: include params 44071 1727204645.11755: variable 'lsr_setup' from source: include params 44071 1727204645.11839: variable 'omit' from source: magic vars 44071 1727204645.11997: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204645.12014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204645.12034: variable 'omit' from source: magic vars 44071 1727204645.12310: variable 'ansible_distribution_major_version' from source: facts 44071 1727204645.12327: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204645.12337: variable 'item' from source: unknown 44071 1727204645.12417: variable 'item' from source: unknown 44071 1727204645.12462: variable 'item' from source: unknown 44071 1727204645.12531: variable 'item' from source: unknown 44071 1727204645.12894: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204645.12898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204645.12901: variable 'omit' from source: magic vars 44071 1727204645.12968: variable 'ansible_distribution_major_version' from source: facts 44071 1727204645.12980: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204645.12990: variable 'item' from source: unknown 44071 1727204645.13059: variable 'item' from source: unknown 44071 1727204645.13098: variable 'item' from source: unknown 44071 1727204645.13170: variable 'item' from source: unknown 44071 1727204645.13416: dumping result to json 44071 1727204645.13420: done dumping result, returning 44071 1727204645.13423: done running TaskExecutor() for managed-node2/TASK: Setup 
[127b8e07-fff9-c964-7471-000000001008] 44071 1727204645.13425: sending task result for task 127b8e07-fff9-c964-7471-000000001008 44071 1727204645.13474: done sending task result for task 127b8e07-fff9-c964-7471-000000001008 44071 1727204645.13478: WORKER PROCESS EXITING 44071 1727204645.13506: no more pending results, returning what we have 44071 1727204645.13511: in VariableManager get_vars() 44071 1727204645.13556: Calling all_inventory to load vars for managed-node2 44071 1727204645.13562: Calling groups_inventory to load vars for managed-node2 44071 1727204645.13567: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204645.13587: Calling all_plugins_play to load vars for managed-node2 44071 1727204645.13591: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204645.13595: Calling groups_plugins_play to load vars for managed-node2 44071 1727204645.15550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204645.17695: done with get_vars() 44071 1727204645.17727: variable 'ansible_search_path' from source: unknown 44071 1727204645.17729: variable 'ansible_search_path' from source: unknown 44071 1727204645.17778: variable 'ansible_search_path' from source: unknown 44071 1727204645.17780: variable 'ansible_search_path' from source: unknown 44071 1727204645.17810: we have included files to process 44071 1727204645.17811: generating all_blocks data 44071 1727204645.17813: done generating all_blocks data 44071 1727204645.17818: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 44071 1727204645.17820: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 44071 1727204645.17822: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 44071 1727204645.18068: done processing included file 44071 1727204645.18070: iterating over new_blocks loaded from include file 44071 1727204645.18072: in VariableManager get_vars() 44071 1727204645.18088: done with get_vars() 44071 1727204645.18089: filtering new block on tags 44071 1727204645.18125: done filtering new block on tags 44071 1727204645.18128: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed-node2 => (item=tasks/create_bridge_profile.yml) 44071 1727204645.18133: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 44071 1727204645.18134: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 44071 1727204645.18138: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 44071 1727204645.18238: done processing included file 44071 1727204645.18240: iterating over new_blocks loaded from include file 44071 1727204645.18241: in VariableManager get_vars() 44071 1727204645.18258: done with get_vars() 44071 1727204645.18260: filtering new block on tags 44071 1727204645.18284: done filtering new block on tags 44071 1727204645.18286: done iterating over new_blocks loaded from 
include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed-node2 => (item=tasks/activate_profile.yml) 44071 1727204645.18290: extending task lists for all hosts with included blocks 44071 1727204645.18951: done extending task lists 44071 1727204645.18953: done processing included files 44071 1727204645.18954: results queue empty 44071 1727204645.18954: checking for any_errors_fatal 44071 1727204645.18959: done checking for any_errors_fatal 44071 1727204645.18960: checking for max_fail_percentage 44071 1727204645.18961: done checking for max_fail_percentage 44071 1727204645.18962: checking to see if all hosts have failed and the running result is not ok 44071 1727204645.18962: done checking to see if all hosts have failed 44071 1727204645.18963: getting the remaining hosts for this loop 44071 1727204645.18965: done getting the remaining hosts for this loop 44071 1727204645.18969: getting the next task for host managed-node2 44071 1727204645.18974: done getting next task for host managed-node2 44071 1727204645.18977: ^ task is: TASK: Include network role 44071 1727204645.18980: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204645.18983: getting variables 44071 1727204645.18984: in VariableManager get_vars() 44071 1727204645.18996: Calling all_inventory to load vars for managed-node2 44071 1727204645.19005: Calling groups_inventory to load vars for managed-node2 44071 1727204645.19008: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204645.19015: Calling all_plugins_play to load vars for managed-node2 44071 1727204645.19018: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204645.19021: Calling groups_plugins_play to load vars for managed-node2 44071 1727204645.20630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204645.22808: done with get_vars() 44071 1727204645.22848: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Tuesday 24 September 2024 15:04:05 -0400 (0:00:00.123) 0:00:57.545 ***** 44071 1727204645.22939: entering _queue_task() for managed-node2/include_role 44071 1727204645.23340: worker is 1 (out of 1 available) 44071 1727204645.23358: exiting _queue_task() for managed-node2/include_role 44071 1727204645.23375: done queuing things up, now waiting for results queue to drain 44071 1727204645.23377: waiting for pending results... 
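The two task files resolved above (tasks/create_bridge_profile.yml and tasks/activate_profile.yml) come from the lsr_setup list that the Setup task at run_test.yml:24 loops over, and the first thing create_bridge_profile.yml does (line 3, queued here) is pull in the role itself. A minimal sketch of that wrapper task, reconstructed from the trace rather than copied from the collection; the role variables that actually define the bridge profile are not visible in this part of the log and are omitted:

# Sketch of tasks/create_bridge_profile.yml:3 as implied by the trace above.
# The real file also sets role variables (for example network_connections),
# which do not appear in this portion of the log.
- name: Include network role
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network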
44071 1727204645.23792: running TaskExecutor() for managed-node2/TASK: Include network role 44071 1727204645.23803: in run() - task 127b8e07-fff9-c964-7471-00000000108f 44071 1727204645.23822: variable 'ansible_search_path' from source: unknown 44071 1727204645.23826: variable 'ansible_search_path' from source: unknown 44071 1727204645.23912: calling self._execute() 44071 1727204645.23989: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204645.23995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204645.24007: variable 'omit' from source: magic vars 44071 1727204645.24455: variable 'ansible_distribution_major_version' from source: facts 44071 1727204645.24563: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204645.24567: _execute() done 44071 1727204645.24571: dumping result to json 44071 1727204645.24573: done dumping result, returning 44071 1727204645.24575: done running TaskExecutor() for managed-node2/TASK: Include network role [127b8e07-fff9-c964-7471-00000000108f] 44071 1727204645.24577: sending task result for task 127b8e07-fff9-c964-7471-00000000108f 44071 1727204645.24867: done sending task result for task 127b8e07-fff9-c964-7471-00000000108f 44071 1727204645.24874: WORKER PROCESS EXITING 44071 1727204645.24899: no more pending results, returning what we have 44071 1727204645.24903: in VariableManager get_vars() 44071 1727204645.24941: Calling all_inventory to load vars for managed-node2 44071 1727204645.24944: Calling groups_inventory to load vars for managed-node2 44071 1727204645.24947: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204645.24958: Calling all_plugins_play to load vars for managed-node2 44071 1727204645.24962: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204645.24967: Calling groups_plugins_play to load vars for managed-node2 44071 1727204645.26823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204645.28948: done with get_vars() 44071 1727204645.28986: variable 'ansible_search_path' from source: unknown 44071 1727204645.28988: variable 'ansible_search_path' from source: unknown 44071 1727204645.29200: variable 'omit' from source: magic vars 44071 1727204645.29251: variable 'omit' from source: magic vars 44071 1727204645.29270: variable 'omit' from source: magic vars 44071 1727204645.29274: we have included files to process 44071 1727204645.29275: generating all_blocks data 44071 1727204645.29278: done generating all_blocks data 44071 1727204645.29279: processing included file: fedora.linux_system_roles.network 44071 1727204645.29302: in VariableManager get_vars() 44071 1727204645.29318: done with get_vars() 44071 1727204645.29349: in VariableManager get_vars() 44071 1727204645.29368: done with get_vars() 44071 1727204645.29413: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 44071 1727204645.29545: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 44071 1727204645.29634: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 44071 1727204645.30159: in VariableManager get_vars() 44071 1727204645.30185: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204645.32253: iterating over new_blocks loaded from 
include file 44071 1727204645.32255: in VariableManager get_vars() 44071 1727204645.32278: done with get_vars() 44071 1727204645.32280: filtering new block on tags 44071 1727204645.32595: done filtering new block on tags 44071 1727204645.32600: in VariableManager get_vars() 44071 1727204645.32617: done with get_vars() 44071 1727204645.32619: filtering new block on tags 44071 1727204645.32638: done filtering new block on tags 44071 1727204645.32641: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 44071 1727204645.32646: extending task lists for all hosts with included blocks 44071 1727204645.32829: done extending task lists 44071 1727204645.32831: done processing included files 44071 1727204645.32832: results queue empty 44071 1727204645.32833: checking for any_errors_fatal 44071 1727204645.32837: done checking for any_errors_fatal 44071 1727204645.32838: checking for max_fail_percentage 44071 1727204645.32839: done checking for max_fail_percentage 44071 1727204645.32840: checking to see if all hosts have failed and the running result is not ok 44071 1727204645.32841: done checking to see if all hosts have failed 44071 1727204645.32842: getting the remaining hosts for this loop 44071 1727204645.32843: done getting the remaining hosts for this loop 44071 1727204645.32846: getting the next task for host managed-node2 44071 1727204645.32851: done getting next task for host managed-node2 44071 1727204645.32854: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204645.32857: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204645.32872: getting variables 44071 1727204645.32873: in VariableManager get_vars() 44071 1727204645.32888: Calling all_inventory to load vars for managed-node2 44071 1727204645.32891: Calling groups_inventory to load vars for managed-node2 44071 1727204645.32893: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204645.32899: Calling all_plugins_play to load vars for managed-node2 44071 1727204645.32902: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204645.32905: Calling groups_plugins_play to load vars for managed-node2 44071 1727204645.34595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204645.36764: done with get_vars() 44071 1727204645.36807: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:04:05 -0400 (0:00:00.139) 0:00:57.685 ***** 44071 1727204645.36910: entering _queue_task() for managed-node2/include_tasks 44071 1727204645.37337: worker is 1 (out of 1 available) 44071 1727204645.37354: exiting _queue_task() for managed-node2/include_tasks 44071 1727204645.37371: done queuing things up, now waiting for results queue to drain 44071 1727204645.37373: waiting for pending results... 44071 1727204645.37661: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204645.37842: in run() - task 127b8e07-fff9-c964-7471-0000000010f5 44071 1727204645.37868: variable 'ansible_search_path' from source: unknown 44071 1727204645.37878: variable 'ansible_search_path' from source: unknown 44071 1727204645.37932: calling self._execute() 44071 1727204645.38051: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204645.38068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204645.38085: variable 'omit' from source: magic vars 44071 1727204645.38521: variable 'ansible_distribution_major_version' from source: facts 44071 1727204645.38543: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204645.38559: _execute() done 44071 1727204645.38572: dumping result to json 44071 1727204645.38581: done dumping result, returning 44071 1727204645.38593: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-c964-7471-0000000010f5] 44071 1727204645.38603: sending task result for task 127b8e07-fff9-c964-7471-0000000010f5 44071 1727204645.38779: no more pending results, returning what we have 44071 1727204645.38786: in VariableManager get_vars() 44071 1727204645.38842: Calling all_inventory to load vars for managed-node2 44071 1727204645.38847: Calling groups_inventory to load vars for managed-node2 44071 1727204645.38849: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204645.38868: Calling all_plugins_play to load vars for managed-node2 44071 1727204645.38872: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204645.38876: Calling groups_plugins_play to load vars for managed-node2 44071 1727204645.39685: done sending task result for task 127b8e07-fff9-c964-7471-0000000010f5 44071 1727204645.39690: WORKER PROCESS EXITING 44071 1727204645.40991: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204645.43153: done with get_vars() 44071 1727204645.43192: variable 'ansible_search_path' from source: unknown 44071 1727204645.43194: variable 'ansible_search_path' from source: unknown 44071 1727204645.43243: we have included files to process 44071 1727204645.43244: generating all_blocks data 44071 1727204645.43246: done generating all_blocks data 44071 1727204645.43251: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204645.43252: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204645.43255: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204645.43973: done processing included file 44071 1727204645.43975: iterating over new_blocks loaded from include file 44071 1727204645.43977: in VariableManager get_vars() 44071 1727204645.44005: done with get_vars() 44071 1727204645.44007: filtering new block on tags 44071 1727204645.44041: done filtering new block on tags 44071 1727204645.44045: in VariableManager get_vars() 44071 1727204645.44072: done with get_vars() 44071 1727204645.44074: filtering new block on tags 44071 1727204645.44122: done filtering new block on tags 44071 1727204645.44125: in VariableManager get_vars() 44071 1727204645.44149: done with get_vars() 44071 1727204645.44151: filtering new block on tags 44071 1727204645.44200: done filtering new block on tags 44071 1727204645.44203: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 44071 1727204645.44209: extending task lists for all hosts with included blocks 44071 1727204645.46217: done extending task lists 44071 1727204645.46220: done processing included files 44071 1727204645.46221: results queue empty 44071 1727204645.46221: checking for any_errors_fatal 44071 1727204645.46225: done checking for any_errors_fatal 44071 1727204645.46226: checking for max_fail_percentage 44071 1727204645.46227: done checking for max_fail_percentage 44071 1727204645.46228: checking to see if all hosts have failed and the running result is not ok 44071 1727204645.46229: done checking to see if all hosts have failed 44071 1727204645.46230: getting the remaining hosts for this loop 44071 1727204645.46231: done getting the remaining hosts for this loop 44071 1727204645.46234: getting the next task for host managed-node2 44071 1727204645.46240: done getting next task for host managed-node2 44071 1727204645.46243: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204645.46248: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204645.46261: getting variables 44071 1727204645.46263: in VariableManager get_vars() 44071 1727204645.46282: Calling all_inventory to load vars for managed-node2 44071 1727204645.46285: Calling groups_inventory to load vars for managed-node2 44071 1727204645.46287: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204645.46294: Calling all_plugins_play to load vars for managed-node2 44071 1727204645.46297: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204645.46300: Calling groups_plugins_play to load vars for managed-node2 44071 1727204645.47842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204645.49983: done with get_vars() 44071 1727204645.50025: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:04:05 -0400 (0:00:00.132) 0:00:57.817 ***** 44071 1727204645.50125: entering _queue_task() for managed-node2/setup 44071 1727204645.50548: worker is 1 (out of 1 available) 44071 1727204645.50562: exiting _queue_task() for managed-node2/setup 44071 1727204645.50682: done queuing things up, now waiting for results queue to drain 44071 1727204645.50685: waiting for pending results... 
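The task queued here ("Ensure ansible_facts used by role are present", set_facts.yml:3) runs the setup module only when facts the role needs are missing; as the evaluation a little further down shows, the guard (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0) comes out False on this host, so the task is skipped and its output is censored by no_log. A minimal sketch of that guard pattern; the gather_subset value is an assumption, since only the conditional and the no_log behaviour are visible in the log:

# Sketch of the fact-gathering guard at set_facts.yml:3. The when: expression and
# no_log: true are taken from the trace; gather_subset is assumed.
- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
    gather_subset: min
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true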
44071 1727204645.50925: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204645.51125: in run() - task 127b8e07-fff9-c964-7471-000000001152 44071 1727204645.51193: variable 'ansible_search_path' from source: unknown 44071 1727204645.51197: variable 'ansible_search_path' from source: unknown 44071 1727204645.51209: calling self._execute() 44071 1727204645.51327: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204645.51342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204645.51358: variable 'omit' from source: magic vars 44071 1727204645.51788: variable 'ansible_distribution_major_version' from source: facts 44071 1727204645.51971: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204645.52055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204645.54523: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204645.54609: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204645.54654: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204645.54697: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204645.54731: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204645.54872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204645.54876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204645.54903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204645.54958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204645.54979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204645.55046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204645.55077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204645.55107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204645.55247: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204645.55251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204645.55372: variable '__network_required_facts' from source: role '' defaults 44071 1727204645.55387: variable 'ansible_facts' from source: unknown 44071 1727204645.56577: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 44071 1727204645.56587: when evaluation is False, skipping this task 44071 1727204645.56594: _execute() done 44071 1727204645.56602: dumping result to json 44071 1727204645.56609: done dumping result, returning 44071 1727204645.56621: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-c964-7471-000000001152] 44071 1727204645.56630: sending task result for task 127b8e07-fff9-c964-7471-000000001152 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204645.56814: no more pending results, returning what we have 44071 1727204645.56819: results queue empty 44071 1727204645.56820: checking for any_errors_fatal 44071 1727204645.56822: done checking for any_errors_fatal 44071 1727204645.56823: checking for max_fail_percentage 44071 1727204645.56824: done checking for max_fail_percentage 44071 1727204645.56826: checking to see if all hosts have failed and the running result is not ok 44071 1727204645.56826: done checking to see if all hosts have failed 44071 1727204645.56827: getting the remaining hosts for this loop 44071 1727204645.56829: done getting the remaining hosts for this loop 44071 1727204645.56835: getting the next task for host managed-node2 44071 1727204645.56847: done getting next task for host managed-node2 44071 1727204645.56852: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204645.56859: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204645.56883: getting variables 44071 1727204645.56885: in VariableManager get_vars() 44071 1727204645.56930: Calling all_inventory to load vars for managed-node2 44071 1727204645.56933: Calling groups_inventory to load vars for managed-node2 44071 1727204645.56935: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204645.56948: Calling all_plugins_play to load vars for managed-node2 44071 1727204645.56951: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204645.56954: Calling groups_plugins_play to load vars for managed-node2 44071 1727204645.57586: done sending task result for task 127b8e07-fff9-c964-7471-000000001152 44071 1727204645.57597: WORKER PROCESS EXITING 44071 1727204645.59137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204645.61588: done with get_vars() 44071 1727204645.61633: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:04:05 -0400 (0:00:00.116) 0:00:57.933 ***** 44071 1727204645.61754: entering _queue_task() for managed-node2/stat 44071 1727204645.62373: worker is 1 (out of 1 available) 44071 1727204645.62388: exiting _queue_task() for managed-node2/stat 44071 1727204645.62401: done queuing things up, now waiting for results queue to drain 44071 1727204645.62403: waiting for pending results... 44071 1727204645.62548: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204645.62775: in run() - task 127b8e07-fff9-c964-7471-000000001154 44071 1727204645.62805: variable 'ansible_search_path' from source: unknown 44071 1727204645.62815: variable 'ansible_search_path' from source: unknown 44071 1727204645.62871: calling self._execute() 44071 1727204645.62993: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204645.63013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204645.63026: variable 'omit' from source: magic vars 44071 1727204645.63467: variable 'ansible_distribution_major_version' from source: facts 44071 1727204645.63492: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204645.63710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204645.64053: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204645.64158: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204645.64274: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204645.64537: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204645.64542: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204645.64606: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204645.64700: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204645.64791: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204645.65034: variable '__network_is_ostree' from source: set_fact 44071 1727204645.65049: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204645.65058: when evaluation is False, skipping this task 44071 1727204645.65065: _execute() done 44071 1727204645.65273: dumping result to json 44071 1727204645.65276: done dumping result, returning 44071 1727204645.65279: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-c964-7471-000000001154] 44071 1727204645.65282: sending task result for task 127b8e07-fff9-c964-7471-000000001154 44071 1727204645.65603: done sending task result for task 127b8e07-fff9-c964-7471-000000001154 44071 1727204645.65606: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204645.65672: no more pending results, returning what we have 44071 1727204645.65676: results queue empty 44071 1727204645.65678: checking for any_errors_fatal 44071 1727204645.65691: done checking for any_errors_fatal 44071 1727204645.65692: checking for max_fail_percentage 44071 1727204645.65695: done checking for max_fail_percentage 44071 1727204645.65696: checking to see if all hosts have failed and the running result is not ok 44071 1727204645.65697: done checking to see if all hosts have failed 44071 1727204645.65697: getting the remaining hosts for this loop 44071 1727204645.65700: done getting the remaining hosts for this loop 44071 1727204645.65705: getting the next task for host managed-node2 44071 1727204645.65721: done getting next task for host managed-node2 44071 1727204645.65726: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204645.65733: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204645.65759: getting variables 44071 1727204645.65761: in VariableManager get_vars() 44071 1727204645.66090: Calling all_inventory to load vars for managed-node2 44071 1727204645.66094: Calling groups_inventory to load vars for managed-node2 44071 1727204645.66097: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204645.66108: Calling all_plugins_play to load vars for managed-node2 44071 1727204645.66111: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204645.66115: Calling groups_plugins_play to load vars for managed-node2 44071 1727204645.68868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204645.70978: done with get_vars() 44071 1727204645.71008: done getting variables 44071 1727204645.71060: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:04:05 -0400 (0:00:00.093) 0:00:58.027 ***** 44071 1727204645.71102: entering _queue_task() for managed-node2/set_fact 44071 1727204645.71407: worker is 1 (out of 1 available) 44071 1727204645.71421: exiting _queue_task() for managed-node2/set_fact 44071 1727204645.71438: done queuing things up, now waiting for results queue to drain 44071 1727204645.71440: waiting for pending results... 
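Both ostree tasks in this stretch of the trace ("Check if system is ostree" at set_facts.yml:12 and "Set flag to indicate system is ostree" at set_facts.yml:17) are guarded by not __network_is_ostree is defined, so once the flag exists (here it came from an earlier set_fact, per "variable '__network_is_ostree' from source: set_fact") both are skipped. A minimal sketch of that check-then-cache pattern; the stat path and the register name are assumptions, while the task names, actions and when: expressions are confirmed by the trace:

# Sketch of set_facts.yml:12 and :17. The /run/ostree-booted path and the
# __ostree_booted_stat register name are assumed; the guards match the trace.
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined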
44071 1727204645.71645: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204645.71762: in run() - task 127b8e07-fff9-c964-7471-000000001155 44071 1727204645.71778: variable 'ansible_search_path' from source: unknown 44071 1727204645.71783: variable 'ansible_search_path' from source: unknown 44071 1727204645.71816: calling self._execute() 44071 1727204645.71906: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204645.71912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204645.71921: variable 'omit' from source: magic vars 44071 1727204645.72237: variable 'ansible_distribution_major_version' from source: facts 44071 1727204645.72248: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204645.72384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204645.72605: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204645.72644: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204645.72676: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204645.72705: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204645.72781: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204645.72801: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204645.72821: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204645.72841: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204645.72938: variable '__network_is_ostree' from source: set_fact 44071 1727204645.72946: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204645.72949: when evaluation is False, skipping this task 44071 1727204645.72952: _execute() done 44071 1727204645.72954: dumping result to json 44071 1727204645.72958: done dumping result, returning 44071 1727204645.72969: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-c964-7471-000000001155] 44071 1727204645.72971: sending task result for task 127b8e07-fff9-c964-7471-000000001155 44071 1727204645.73247: done sending task result for task 127b8e07-fff9-c964-7471-000000001155 44071 1727204645.73250: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204645.73311: no more pending results, returning what we have 44071 1727204645.73314: results queue empty 44071 1727204645.73315: checking for any_errors_fatal 44071 1727204645.73321: done checking for any_errors_fatal 44071 
1727204645.73321: checking for max_fail_percentage 44071 1727204645.73323: done checking for max_fail_percentage 44071 1727204645.73324: checking to see if all hosts have failed and the running result is not ok 44071 1727204645.73325: done checking to see if all hosts have failed 44071 1727204645.73325: getting the remaining hosts for this loop 44071 1727204645.73327: done getting the remaining hosts for this loop 44071 1727204645.73333: getting the next task for host managed-node2 44071 1727204645.73343: done getting next task for host managed-node2 44071 1727204645.73347: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204645.73353: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204645.73375: getting variables 44071 1727204645.73377: in VariableManager get_vars() 44071 1727204645.73416: Calling all_inventory to load vars for managed-node2 44071 1727204645.73419: Calling groups_inventory to load vars for managed-node2 44071 1727204645.73422: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204645.73444: Calling all_plugins_play to load vars for managed-node2 44071 1727204645.73447: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204645.73451: Calling groups_plugins_play to load vars for managed-node2 44071 1727204645.75272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204645.77542: done with get_vars() 44071 1727204645.77586: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:04:05 -0400 (0:00:00.066) 0:00:58.093 ***** 44071 1727204645.77705: entering _queue_task() for managed-node2/service_facts 44071 1727204645.78119: worker is 1 (out of 1 available) 44071 1727204645.78135: exiting _queue_task() for managed-node2/service_facts 44071 1727204645.78152: done queuing things up, now waiting for results queue to drain 44071 1727204645.78154: waiting for pending results... 
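The task queued here ("Check which services are running", set_facts.yml:21) runs the service_facts module over SSH; the low-level trace that follows shows the temporary directory being created, the AnsiballZ_service_facts.py payload being transferred and executed, and the resulting ansible_facts.services dictionary arriving as the large JSON blob near the end of this excerpt. A minimal sketch of the task, plus a purely illustrative consumer that is not part of the role:

# Sketch of set_facts.yml:21. The gathered ansible_facts.services mapping has
# entries like "auditd.service": {state, status, source}, as in the JSON below.
- name: Check which services are running
  ansible.builtin.service_facts:

# Illustrative only, not in the role: read one entry out of the gathered facts.
- name: Record whether NetworkManager is currently running
  ansible.builtin.set_fact:
    nm_running: "{{ ansible_facts.services.get('NetworkManager.service', {}).get('state', '') == 'running' }}"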
44071 1727204645.78589: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204645.78686: in run() - task 127b8e07-fff9-c964-7471-000000001157 44071 1727204645.78692: variable 'ansible_search_path' from source: unknown 44071 1727204645.78696: variable 'ansible_search_path' from source: unknown 44071 1727204645.78699: calling self._execute() 44071 1727204645.78807: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204645.78815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204645.78826: variable 'omit' from source: magic vars 44071 1727204645.79260: variable 'ansible_distribution_major_version' from source: facts 44071 1727204645.79276: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204645.79338: variable 'omit' from source: magic vars 44071 1727204645.79387: variable 'omit' from source: magic vars 44071 1727204645.79426: variable 'omit' from source: magic vars 44071 1727204645.79479: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204645.79519: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204645.79541: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204645.79573: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204645.79586: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204645.79662: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204645.79667: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204645.79670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204645.79745: Set connection var ansible_connection to ssh 44071 1727204645.79752: Set connection var ansible_timeout to 10 44071 1727204645.79759: Set connection var ansible_pipelining to False 44071 1727204645.79768: Set connection var ansible_shell_type to sh 44071 1727204645.79777: Set connection var ansible_shell_executable to /bin/sh 44071 1727204645.79873: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204645.79878: variable 'ansible_shell_executable' from source: unknown 44071 1727204645.79881: variable 'ansible_connection' from source: unknown 44071 1727204645.79884: variable 'ansible_module_compression' from source: unknown 44071 1727204645.79886: variable 'ansible_shell_type' from source: unknown 44071 1727204645.79889: variable 'ansible_shell_executable' from source: unknown 44071 1727204645.79891: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204645.79893: variable 'ansible_pipelining' from source: unknown 44071 1727204645.79896: variable 'ansible_timeout' from source: unknown 44071 1727204645.79898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204645.80123: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204645.80128: variable 'omit' from source: magic vars 44071 
1727204645.80133: starting attempt loop 44071 1727204645.80136: running the handler 44071 1727204645.80138: _low_level_execute_command(): starting 44071 1727204645.80171: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204645.81069: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204645.81118: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204645.81204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204645.82999: stdout chunk (state=3): >>>/root <<< 44071 1727204645.83099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204645.83278: stderr chunk (state=3): >>><<< 44071 1727204645.83282: stdout chunk (state=3): >>><<< 44071 1727204645.83287: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204645.83290: _low_level_execute_command(): starting 44071 1727204645.83292: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204645.832163-47281-66269157662699 `" && echo ansible-tmp-1727204645.832163-47281-66269157662699="` echo /root/.ansible/tmp/ansible-tmp-1727204645.832163-47281-66269157662699 `" ) && sleep 0' 44071 1727204645.83902: stderr chunk (state=2): >>>OpenSSH_9.6p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204645.83909: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204645.83985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204645.84024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204645.84041: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204645.84070: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204645.84180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204645.86181: stdout chunk (state=3): >>>ansible-tmp-1727204645.832163-47281-66269157662699=/root/.ansible/tmp/ansible-tmp-1727204645.832163-47281-66269157662699 <<< 44071 1727204645.86406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204645.86410: stdout chunk (state=3): >>><<< 44071 1727204645.86414: stderr chunk (state=3): >>><<< 44071 1727204645.86473: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204645.832163-47281-66269157662699=/root/.ansible/tmp/ansible-tmp-1727204645.832163-47281-66269157662699 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204645.86507: variable 'ansible_module_compression' from source: unknown 44071 1727204645.86672: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 44071 1727204645.86675: variable 'ansible_facts' from source: unknown 44071 1727204645.86726: transferring module 
to remote /root/.ansible/tmp/ansible-tmp-1727204645.832163-47281-66269157662699/AnsiballZ_service_facts.py 44071 1727204645.86976: Sending initial data 44071 1727204645.86980: Sent initial data (160 bytes) 44071 1727204645.87530: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204645.87548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204645.87572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204645.87613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204645.87641: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204645.87711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204645.89333: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204645.89401: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204645.89523: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp24vsvptz /root/.ansible/tmp/ansible-tmp-1727204645.832163-47281-66269157662699/AnsiballZ_service_facts.py <<< 44071 1727204645.89527: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204645.832163-47281-66269157662699/AnsiballZ_service_facts.py" <<< 44071 1727204645.89610: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp24vsvptz" to remote "/root/.ansible/tmp/ansible-tmp-1727204645.832163-47281-66269157662699/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204645.832163-47281-66269157662699/AnsiballZ_service_facts.py" <<< 44071 1727204645.90575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204645.90583: stderr chunk (state=3): >>><<< 44071 1727204645.90588: stdout chunk (state=3): >>><<< 44071 1727204645.90613: done transferring module to remote 44071 1727204645.90626: _low_level_execute_command(): starting 44071 1727204645.90631: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204645.832163-47281-66269157662699/ /root/.ansible/tmp/ansible-tmp-1727204645.832163-47281-66269157662699/AnsiballZ_service_facts.py && sleep 0' 44071 1727204645.91271: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204645.91280: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204645.91317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204645.91324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204645.91333: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204645.91336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204645.91339: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204645.91393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204645.91396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204645.91464: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204645.93488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204645.93493: stdout chunk (state=3): >>><<< 44071 1727204645.93496: stderr chunk (state=3): >>><<< 44071 1727204645.93499: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204645.93502: _low_level_execute_command(): starting 44071 1727204645.93504: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204645.832163-47281-66269157662699/AnsiballZ_service_facts.py && sleep 0' 44071 1727204645.94102: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204645.94119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204645.94176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204645.94195: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204645.94267: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204648.16120: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped<<< 44071 1727204648.16186: stdout chunk (state=3): >>>", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": 
"unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 44071 1727204648.17802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204648.17896: stderr chunk (state=3): >>>Shared connection to 10.31.47.73 closed. <<< 44071 1727204648.17941: stderr chunk (state=3): >>><<< 44071 1727204648.17954: stdout chunk (state=3): >>><<< 44071 1727204648.18171: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": 
"running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": 
"active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.47.73 closed. 44071 1727204648.19491: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204645.832163-47281-66269157662699/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204648.19519: _low_level_execute_command(): starting 44071 1727204648.19528: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204645.832163-47281-66269157662699/ > /dev/null 2>&1 && sleep 0' 44071 1727204648.20230: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204648.20286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204648.20376: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204648.20396: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204648.20420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204648.20441: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204648.20468: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204648.20582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204648.22633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204648.22659: stdout chunk (state=3): >>><<< 44071 1727204648.22662: stderr chunk (state=3): >>><<< 44071 1727204648.22680: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204648.22872: handler run complete 44071 1727204648.22947: variable 'ansible_facts' from source: unknown 44071 1727204648.23174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204648.23812: variable 'ansible_facts' from source: unknown 44071 1727204648.23996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204648.24282: attempt loop complete, returning result 44071 1727204648.24294: _execute() done 44071 1727204648.24302: dumping result to json 44071 1727204648.24384: done dumping result, returning 44071 1727204648.24399: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-c964-7471-000000001157] 44071 1727204648.24409: sending task result for task 127b8e07-fff9-c964-7471-000000001157 44071 1727204648.26056: done sending task result for task 127b8e07-fff9-c964-7471-000000001157 44071 1727204648.26060: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204648.26173: no more pending results, returning what we have 44071 1727204648.26176: results queue empty 44071 1727204648.26177: checking for any_errors_fatal 44071 1727204648.26182: done checking for any_errors_fatal 44071 1727204648.26182: checking for max_fail_percentage 44071 1727204648.26184: done checking for max_fail_percentage 44071 1727204648.26185: checking to see if all hosts have failed and the running result is not ok 44071 1727204648.26185: done checking to see if all hosts have failed 44071 1727204648.26186: getting the remaining hosts for this loop 44071 1727204648.26187: done getting the remaining hosts for this loop 44071 1727204648.26191: getting the next task for host managed-node2 44071 1727204648.26198: done getting next task for host managed-node2 44071 1727204648.26201: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204648.26208: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204648.26219: getting variables 44071 1727204648.26221: in VariableManager get_vars() 44071 1727204648.26253: Calling all_inventory to load vars for managed-node2 44071 1727204648.26256: Calling groups_inventory to load vars for managed-node2 44071 1727204648.26258: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204648.26269: Calling all_plugins_play to load vars for managed-node2 44071 1727204648.26272: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204648.26276: Calling groups_plugins_play to load vars for managed-node2 44071 1727204648.27920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204648.30183: done with get_vars() 44071 1727204648.30220: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:04:08 -0400 (0:00:02.526) 0:01:00.619 ***** 44071 1727204648.30347: entering _queue_task() for managed-node2/package_facts 44071 1727204648.30794: worker is 1 (out of 1 available) 44071 1727204648.30811: exiting _queue_task() for managed-node2/package_facts 44071 1727204648.30828: done queuing things up, now waiting for results queue to drain 44071 1727204648.30833: waiting for pending results... 
44071 1727204648.31201: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204648.31476: in run() - task 127b8e07-fff9-c964-7471-000000001158 44071 1727204648.31480: variable 'ansible_search_path' from source: unknown 44071 1727204648.31483: variable 'ansible_search_path' from source: unknown 44071 1727204648.31502: calling self._execute() 44071 1727204648.31634: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204648.31649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204648.31667: variable 'omit' from source: magic vars 44071 1727204648.32112: variable 'ansible_distribution_major_version' from source: facts 44071 1727204648.32271: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204648.32275: variable 'omit' from source: magic vars 44071 1727204648.32278: variable 'omit' from source: magic vars 44071 1727204648.32292: variable 'omit' from source: magic vars 44071 1727204648.32344: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204648.32396: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204648.32427: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204648.32455: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204648.32476: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204648.32518: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204648.32528: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204648.32541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204648.32668: Set connection var ansible_connection to ssh 44071 1727204648.32683: Set connection var ansible_timeout to 10 44071 1727204648.32693: Set connection var ansible_pipelining to False 44071 1727204648.32703: Set connection var ansible_shell_type to sh 44071 1727204648.32780: Set connection var ansible_shell_executable to /bin/sh 44071 1727204648.32785: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204648.32787: variable 'ansible_shell_executable' from source: unknown 44071 1727204648.32789: variable 'ansible_connection' from source: unknown 44071 1727204648.32792: variable 'ansible_module_compression' from source: unknown 44071 1727204648.32795: variable 'ansible_shell_type' from source: unknown 44071 1727204648.32797: variable 'ansible_shell_executable' from source: unknown 44071 1727204648.32799: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204648.32802: variable 'ansible_pipelining' from source: unknown 44071 1727204648.32804: variable 'ansible_timeout' from source: unknown 44071 1727204648.32807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204648.33141: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204648.33146: variable 'omit' from source: magic vars 44071 
1727204648.33149: starting attempt loop 44071 1727204648.33152: running the handler 44071 1727204648.33154: _low_level_execute_command(): starting 44071 1727204648.33156: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204648.33889: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204648.33913: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204648.33928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204648.34022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204648.34055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204648.34073: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204648.34095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204648.34206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204648.35967: stdout chunk (state=3): >>>/root <<< 44071 1727204648.36196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204648.36214: stderr chunk (state=3): >>><<< 44071 1727204648.36218: stdout chunk (state=3): >>><<< 44071 1727204648.36260: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204648.36264: _low_level_execute_command(): starting 44071 1727204648.36276: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204648.362506-47339-120004382155046 `" && echo ansible-tmp-1727204648.362506-47339-120004382155046="` echo /root/.ansible/tmp/ansible-tmp-1727204648.362506-47339-120004382155046 `" ) && sleep 0' 44071 1727204648.36781: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204648.36784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204648.36796: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204648.36799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204648.36846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204648.36850: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204648.36857: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204648.36926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204648.38951: stdout chunk (state=3): >>>ansible-tmp-1727204648.362506-47339-120004382155046=/root/.ansible/tmp/ansible-tmp-1727204648.362506-47339-120004382155046 <<< 44071 1727204648.39077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204648.39163: stderr chunk (state=3): >>><<< 44071 1727204648.39176: stdout chunk (state=3): >>><<< 44071 1727204648.39179: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204648.362506-47339-120004382155046=/root/.ansible/tmp/ansible-tmp-1727204648.362506-47339-120004382155046 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 
1727204648.39187: variable 'ansible_module_compression' from source: unknown 44071 1727204648.39229: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 44071 1727204648.39288: variable 'ansible_facts' from source: unknown 44071 1727204648.39413: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204648.362506-47339-120004382155046/AnsiballZ_package_facts.py 44071 1727204648.39540: Sending initial data 44071 1727204648.39543: Sent initial data (161 bytes) 44071 1727204648.40053: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204648.40103: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204648.40107: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204648.40186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204648.41835: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204648.41914: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204648.42003: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmppg8tue07 /root/.ansible/tmp/ansible-tmp-1727204648.362506-47339-120004382155046/AnsiballZ_package_facts.py <<< 44071 1727204648.42007: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204648.362506-47339-120004382155046/AnsiballZ_package_facts.py" <<< 44071 1727204648.42100: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmppg8tue07" to remote "/root/.ansible/tmp/ansible-tmp-1727204648.362506-47339-120004382155046/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204648.362506-47339-120004382155046/AnsiballZ_package_facts.py" <<< 44071 1727204648.43336: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204648.43410: stderr chunk (state=3): >>><<< 44071 1727204648.43414: stdout chunk (state=3): >>><<< 44071 1727204648.43437: done transferring module to remote 44071 1727204648.43448: _low_level_execute_command(): starting 44071 1727204648.43454: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204648.362506-47339-120004382155046/ /root/.ansible/tmp/ansible-tmp-1727204648.362506-47339-120004382155046/AnsiballZ_package_facts.py && sleep 0' 44071 1727204648.43988: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204648.44085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204648.44094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204648.44149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204648.45989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204648.46053: stderr chunk (state=3): >>><<< 44071 1727204648.46057: stdout chunk (state=3): >>><<< 44071 1727204648.46071: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204648.46075: _low_level_execute_command(): starting 44071 1727204648.46082: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204648.362506-47339-120004382155046/AnsiballZ_package_facts.py && sleep 0' 44071 1727204648.46545: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204648.46549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204648.46583: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204648.46587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204648.46590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204648.46649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204648.46652: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204648.46737: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204649.09748: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", 
"release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", 
"release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": 
[{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", 
"version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", 
"version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": 
"nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "li<<< 44071 1727204649.09774: stdout chunk (state=3): >>>breport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-l<<< 44071 1727204649.09877: stdout chunk (state=3): >>>ibs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", 
"version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", 
"version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": 
"python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": 
"perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": 
"perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": 
"elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": 
"2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 44071 1727204649.11777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204649.11787: stderr chunk (state=3): >>><<< 44071 1727204649.11790: stdout chunk (state=3): >>><<< 44071 1727204649.11837: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": 
"libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", 
"release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": 
[{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": 
"3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", 
"version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", 
"version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": 
"1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": 
"wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": 
"python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204649.18873: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204648.362506-47339-120004382155046/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204649.18878: _low_level_execute_command(): starting 44071 1727204649.18881: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204648.362506-47339-120004382155046/ > /dev/null 2>&1 && sleep 0' 44071 1727204649.20191: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204649.20196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204649.20324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204649.20328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
44071 1727204649.20345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204649.20518: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204649.20574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204649.22640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204649.22869: stderr chunk (state=3): >>><<< 44071 1727204649.22873: stdout chunk (state=3): >>><<< 44071 1727204649.22891: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204649.22898: handler run complete 44071 1727204649.25507: variable 'ansible_facts' from source: unknown 44071 1727204649.36286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204649.38875: variable 'ansible_facts' from source: unknown 44071 1727204649.39209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204649.40198: attempt loop complete, returning result 44071 1727204649.40202: _execute() done 44071 1727204649.40205: dumping result to json 44071 1727204649.40414: done dumping result, returning 44071 1727204649.40422: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-c964-7471-000000001158] 44071 1727204649.40425: sending task result for task 127b8e07-fff9-c964-7471-000000001158 44071 1727204649.50471: done sending task result for task 127b8e07-fff9-c964-7471-000000001158 44071 1727204649.50475: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204649.50618: no more pending results, returning what we have 44071 1727204649.50621: results queue empty 44071 1727204649.50621: checking for any_errors_fatal 44071 1727204649.50627: done checking for any_errors_fatal 44071 1727204649.50628: checking for max_fail_percentage 44071 1727204649.50629: done checking for max_fail_percentage 44071 1727204649.50630: checking to see if all hosts have failed and the running result is not ok 44071 1727204649.50633: done checking to see if 
all hosts have failed 44071 1727204649.50633: getting the remaining hosts for this loop 44071 1727204649.50635: done getting the remaining hosts for this loop 44071 1727204649.50638: getting the next task for host managed-node2 44071 1727204649.50655: done getting next task for host managed-node2 44071 1727204649.50659: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204649.50673: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204649.50682: getting variables 44071 1727204649.50683: in VariableManager get_vars() 44071 1727204649.50699: Calling all_inventory to load vars for managed-node2 44071 1727204649.50701: Calling groups_inventory to load vars for managed-node2 44071 1727204649.50702: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204649.50707: Calling all_plugins_play to load vars for managed-node2 44071 1727204649.50709: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204649.50711: Calling groups_plugins_play to load vars for managed-node2 44071 1727204649.51625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204649.52853: done with get_vars() 44071 1727204649.52883: done getting variables 44071 1727204649.52925: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:04:09 -0400 (0:00:01.226) 0:01:01.846 ***** 44071 1727204649.52961: entering _queue_task() for managed-node2/debug 44071 1727204649.53267: worker is 1 (out of 1 available) 44071 1727204649.53284: exiting _queue_task() for managed-node2/debug 44071 1727204649.53298: done queuing things up, now waiting for results queue to drain 44071 1727204649.53300: waiting for pending results... 
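
The long package dump above is the (censored) return value of the role's "Check which packages are installed" step, whose invocation block records manager=["auto"] and strategy="first". What follows is a minimal sketch, not the role's actual source, of how such a package_facts call is typically written and how the resulting ansible_facts.packages dictionary can be queried afterwards; the host pattern, the follow-up task, and the choice of the "git" package (which does appear in the dump above) are illustrative assumptions.

---
# Minimal sketch under stated assumptions; not taken from the role's code.
- hosts: managed-node2
  gather_facts: false
  tasks:
    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto        # matches "manager": ["auto"] in the logged invocation
        strategy: first      # matches "strategy": "first" in the logged invocation
      no_log: true           # why the task result above is reported as censored

    - name: Report the installed git version from the gathered package facts
      ansible.builtin.debug:
        msg: "git {{ ansible_facts.packages['git'][0].version }} is installed"
      when: "'git' in ansible_facts.packages"
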
44071 1727204649.53508: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204649.53636: in run() - task 127b8e07-fff9-c964-7471-0000000010f6 44071 1727204649.53650: variable 'ansible_search_path' from source: unknown 44071 1727204649.53654: variable 'ansible_search_path' from source: unknown 44071 1727204649.53690: calling self._execute() 44071 1727204649.53788: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204649.53795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204649.53805: variable 'omit' from source: magic vars 44071 1727204649.54119: variable 'ansible_distribution_major_version' from source: facts 44071 1727204649.54133: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204649.54137: variable 'omit' from source: magic vars 44071 1727204649.54188: variable 'omit' from source: magic vars 44071 1727204649.54267: variable 'network_provider' from source: set_fact 44071 1727204649.54282: variable 'omit' from source: magic vars 44071 1727204649.54320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204649.54353: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204649.54372: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204649.54387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204649.54400: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204649.54428: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204649.54435: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204649.54438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204649.54514: Set connection var ansible_connection to ssh 44071 1727204649.54525: Set connection var ansible_timeout to 10 44071 1727204649.54529: Set connection var ansible_pipelining to False 44071 1727204649.54534: Set connection var ansible_shell_type to sh 44071 1727204649.54537: Set connection var ansible_shell_executable to /bin/sh 44071 1727204649.54544: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204649.54568: variable 'ansible_shell_executable' from source: unknown 44071 1727204649.54572: variable 'ansible_connection' from source: unknown 44071 1727204649.54575: variable 'ansible_module_compression' from source: unknown 44071 1727204649.54577: variable 'ansible_shell_type' from source: unknown 44071 1727204649.54580: variable 'ansible_shell_executable' from source: unknown 44071 1727204649.54583: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204649.54585: variable 'ansible_pipelining' from source: unknown 44071 1727204649.54587: variable 'ansible_timeout' from source: unknown 44071 1727204649.54593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204649.54708: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 44071 1727204649.54718: variable 'omit' from source: magic vars 44071 1727204649.54723: starting attempt loop 44071 1727204649.54728: running the handler 44071 1727204649.54770: handler run complete 44071 1727204649.54783: attempt loop complete, returning result 44071 1727204649.54786: _execute() done 44071 1727204649.54789: dumping result to json 44071 1727204649.54791: done dumping result, returning 44071 1727204649.54799: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-c964-7471-0000000010f6] 44071 1727204649.54804: sending task result for task 127b8e07-fff9-c964-7471-0000000010f6 44071 1727204649.54904: done sending task result for task 127b8e07-fff9-c964-7471-0000000010f6 44071 1727204649.54907: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 44071 1727204649.54991: no more pending results, returning what we have 44071 1727204649.54995: results queue empty 44071 1727204649.54995: checking for any_errors_fatal 44071 1727204649.55008: done checking for any_errors_fatal 44071 1727204649.55009: checking for max_fail_percentage 44071 1727204649.55010: done checking for max_fail_percentage 44071 1727204649.55011: checking to see if all hosts have failed and the running result is not ok 44071 1727204649.55012: done checking to see if all hosts have failed 44071 1727204649.55013: getting the remaining hosts for this loop 44071 1727204649.55014: done getting the remaining hosts for this loop 44071 1727204649.55026: getting the next task for host managed-node2 44071 1727204649.55037: done getting next task for host managed-node2 44071 1727204649.55041: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204649.55046: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204649.55059: getting variables 44071 1727204649.55061: in VariableManager get_vars() 44071 1727204649.55103: Calling all_inventory to load vars for managed-node2 44071 1727204649.55106: Calling groups_inventory to load vars for managed-node2 44071 1727204649.55109: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204649.55118: Calling all_plugins_play to load vars for managed-node2 44071 1727204649.55120: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204649.55123: Calling groups_plugins_play to load vars for managed-node2 44071 1727204649.56215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204649.57459: done with get_vars() 44071 1727204649.57493: done getting variables 44071 1727204649.57545: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:04:09 -0400 (0:00:00.046) 0:01:01.892 ***** 44071 1727204649.57583: entering _queue_task() for managed-node2/fail 44071 1727204649.57880: worker is 1 (out of 1 available) 44071 1727204649.57897: exiting _queue_task() for managed-node2/fail 44071 1727204649.57911: done queuing things up, now waiting for results queue to drain 44071 1727204649.57913: waiting for pending results... 
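
The task queued above is a fail action at roles/network/tasks/main.yml:11, and the only condition visible in its trace below is network_state != {} (evaluated False here, so the task is skipped). A rough sketch of the shape of such a guarded abort follows; the message text and the extra provider check are assumptions, since neither appears in this trace.

# Hedged sketch: only the task name, the fail action and the
# "network_state != {}" condition are confirmed by the log.
- name: >-
    Abort applying the network state configuration if using the `network_state`
    variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported with the initscripts provider.   # assumed wording
  when:
    - network_state != {}
    - network_provider == "initscripts"   # assumed companion check, not shown in the trace

Because when-lists are evaluated in order and short-circuit, the skip result in the log only reports the first condition that came back False.
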
44071 1727204649.58135: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204649.58255: in run() - task 127b8e07-fff9-c964-7471-0000000010f7 44071 1727204649.58275: variable 'ansible_search_path' from source: unknown 44071 1727204649.58279: variable 'ansible_search_path' from source: unknown 44071 1727204649.58313: calling self._execute() 44071 1727204649.58403: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204649.58410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204649.58419: variable 'omit' from source: magic vars 44071 1727204649.58730: variable 'ansible_distribution_major_version' from source: facts 44071 1727204649.58742: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204649.58837: variable 'network_state' from source: role '' defaults 44071 1727204649.58848: Evaluated conditional (network_state != {}): False 44071 1727204649.58851: when evaluation is False, skipping this task 44071 1727204649.58854: _execute() done 44071 1727204649.58857: dumping result to json 44071 1727204649.58860: done dumping result, returning 44071 1727204649.58869: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-c964-7471-0000000010f7] 44071 1727204649.58875: sending task result for task 127b8e07-fff9-c964-7471-0000000010f7 44071 1727204649.58983: done sending task result for task 127b8e07-fff9-c964-7471-0000000010f7 44071 1727204649.58987: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204649.59039: no more pending results, returning what we have 44071 1727204649.59046: results queue empty 44071 1727204649.59047: checking for any_errors_fatal 44071 1727204649.59056: done checking for any_errors_fatal 44071 1727204649.59057: checking for max_fail_percentage 44071 1727204649.59058: done checking for max_fail_percentage 44071 1727204649.59059: checking to see if all hosts have failed and the running result is not ok 44071 1727204649.59060: done checking to see if all hosts have failed 44071 1727204649.59061: getting the remaining hosts for this loop 44071 1727204649.59062: done getting the remaining hosts for this loop 44071 1727204649.59075: getting the next task for host managed-node2 44071 1727204649.59085: done getting next task for host managed-node2 44071 1727204649.59090: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204649.59095: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204649.59116: getting variables 44071 1727204649.59118: in VariableManager get_vars() 44071 1727204649.59157: Calling all_inventory to load vars for managed-node2 44071 1727204649.59160: Calling groups_inventory to load vars for managed-node2 44071 1727204649.59162: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204649.59173: Calling all_plugins_play to load vars for managed-node2 44071 1727204649.59175: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204649.59185: Calling groups_plugins_play to load vars for managed-node2 44071 1727204649.60341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204649.61592: done with get_vars() 44071 1727204649.61623: done getting variables 44071 1727204649.61680: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:04:09 -0400 (0:00:00.041) 0:01:01.933 ***** 44071 1727204649.61710: entering _queue_task() for managed-node2/fail 44071 1727204649.62015: worker is 1 (out of 1 available) 44071 1727204649.62029: exiting _queue_task() for managed-node2/fail 44071 1727204649.62048: done queuing things up, now waiting for results queue to drain 44071 1727204649.62050: waiting for pending results... 
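
The guard at main.yml:18 is traced next; it again skips on network_state != {} before any version comparison is reached. Assuming the version check is expressed with the ansible_distribution_major_version fact that the role keeps evaluating, it would roughly take the following shape; the exact expression and the message are assumed reconstructions.

# Illustrative only: the log confirms the fail action, the task name and the
# network_state gate; the version expression below is an assumption.
- name: >-
    Abort applying the network state configuration if the system version
    of the managed host is below 8
  ansible.builtin.fail:
    msg: >-
      Applying network_state requires a managed host at major version 8 or
      newer, found {{ ansible_distribution }} {{ ansible_distribution_version }}.
  when:
    - network_state != {}
    - ansible_distribution_major_version | int < 8   # assumed version check
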
44071 1727204649.62269: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204649.62405: in run() - task 127b8e07-fff9-c964-7471-0000000010f8 44071 1727204649.62418: variable 'ansible_search_path' from source: unknown 44071 1727204649.62421: variable 'ansible_search_path' from source: unknown 44071 1727204649.62464: calling self._execute() 44071 1727204649.62555: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204649.62561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204649.62571: variable 'omit' from source: magic vars 44071 1727204649.62900: variable 'ansible_distribution_major_version' from source: facts 44071 1727204649.62911: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204649.63009: variable 'network_state' from source: role '' defaults 44071 1727204649.63019: Evaluated conditional (network_state != {}): False 44071 1727204649.63024: when evaluation is False, skipping this task 44071 1727204649.63027: _execute() done 44071 1727204649.63030: dumping result to json 44071 1727204649.63033: done dumping result, returning 44071 1727204649.63046: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-c964-7471-0000000010f8] 44071 1727204649.63051: sending task result for task 127b8e07-fff9-c964-7471-0000000010f8 44071 1727204649.63155: done sending task result for task 127b8e07-fff9-c964-7471-0000000010f8 44071 1727204649.63158: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204649.63211: no more pending results, returning what we have 44071 1727204649.63216: results queue empty 44071 1727204649.63217: checking for any_errors_fatal 44071 1727204649.63227: done checking for any_errors_fatal 44071 1727204649.63228: checking for max_fail_percentage 44071 1727204649.63230: done checking for max_fail_percentage 44071 1727204649.63231: checking to see if all hosts have failed and the running result is not ok 44071 1727204649.63231: done checking to see if all hosts have failed 44071 1727204649.63232: getting the remaining hosts for this loop 44071 1727204649.63234: done getting the remaining hosts for this loop 44071 1727204649.63239: getting the next task for host managed-node2 44071 1727204649.63248: done getting next task for host managed-node2 44071 1727204649.63252: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204649.63259: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204649.63289: getting variables 44071 1727204649.63291: in VariableManager get_vars() 44071 1727204649.63330: Calling all_inventory to load vars for managed-node2 44071 1727204649.63333: Calling groups_inventory to load vars for managed-node2 44071 1727204649.63335: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204649.63346: Calling all_plugins_play to load vars for managed-node2 44071 1727204649.63348: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204649.63351: Calling groups_plugins_play to load vars for managed-node2 44071 1727204649.64383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204649.65610: done with get_vars() 44071 1727204649.65643: done getting variables 44071 1727204649.65698: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:04:09 -0400 (0:00:00.040) 0:01:01.973 ***** 44071 1727204649.65728: entering _queue_task() for managed-node2/fail 44071 1727204649.66030: worker is 1 (out of 1 available) 44071 1727204649.66046: exiting _queue_task() for managed-node2/fail 44071 1727204649.66062: done queuing things up, now waiting for results queue to drain 44071 1727204649.66064: waiting for pending results... 
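
For the teaming guard at main.yml:25 the trace below does show the two conditions and their order: ansible_distribution_major_version | int > 9 evaluates True on this fc40 host, then ansible_distribution in __network_rh_distros evaluates False and the task is skipped. The sketch below uses exactly those two confirmed conditions; the message text is an assumption, and any additional check for team connections in the real role is not visible in this trace and is therefore omitted.

# Both when-conditions are taken verbatim from the trace; the message is assumed.
- name: >-
    Abort applying teaming configuration if the system version of the
    managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later.   # assumed wording
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
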
44071 1727204649.66279: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204649.66408: in run() - task 127b8e07-fff9-c964-7471-0000000010f9 44071 1727204649.66418: variable 'ansible_search_path' from source: unknown 44071 1727204649.66422: variable 'ansible_search_path' from source: unknown 44071 1727204649.66458: calling self._execute() 44071 1727204649.66546: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204649.66552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204649.66563: variable 'omit' from source: magic vars 44071 1727204649.66883: variable 'ansible_distribution_major_version' from source: facts 44071 1727204649.66895: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204649.67035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204649.69131: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204649.69192: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204649.69223: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204649.69256: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204649.69280: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204649.69351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204649.69376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204649.69398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204649.69426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204649.69440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204649.69520: variable 'ansible_distribution_major_version' from source: facts 44071 1727204649.69537: Evaluated conditional (ansible_distribution_major_version | int > 9): True 44071 1727204649.69634: variable 'ansible_distribution' from source: facts 44071 1727204649.69640: variable '__network_rh_distros' from source: role '' defaults 44071 1727204649.69650: Evaluated conditional (ansible_distribution in __network_rh_distros): False 44071 1727204649.69652: when evaluation is False, skipping this task 44071 1727204649.69655: _execute() done 44071 1727204649.69658: dumping result to json 44071 1727204649.69664: done dumping result, returning 44071 1727204649.69675: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-c964-7471-0000000010f9] 44071 1727204649.69678: sending task result for task 127b8e07-fff9-c964-7471-0000000010f9 44071 1727204649.69783: done sending task result for task 127b8e07-fff9-c964-7471-0000000010f9 44071 1727204649.69786: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 44071 1727204649.69838: no more pending results, returning what we have 44071 1727204649.69842: results queue empty 44071 1727204649.69843: checking for any_errors_fatal 44071 1727204649.69850: done checking for any_errors_fatal 44071 1727204649.69851: checking for max_fail_percentage 44071 1727204649.69852: done checking for max_fail_percentage 44071 1727204649.69854: checking to see if all hosts have failed and the running result is not ok 44071 1727204649.69854: done checking to see if all hosts have failed 44071 1727204649.69855: getting the remaining hosts for this loop 44071 1727204649.69856: done getting the remaining hosts for this loop 44071 1727204649.69861: getting the next task for host managed-node2 44071 1727204649.69872: done getting next task for host managed-node2 44071 1727204649.69877: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204649.69883: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204649.69907: getting variables 44071 1727204649.69908: in VariableManager get_vars() 44071 1727204649.69950: Calling all_inventory to load vars for managed-node2 44071 1727204649.69953: Calling groups_inventory to load vars for managed-node2 44071 1727204649.69955: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204649.69974: Calling all_plugins_play to load vars for managed-node2 44071 1727204649.69977: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204649.69980: Calling groups_plugins_play to load vars for managed-node2 44071 1727204649.71177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204649.72406: done with get_vars() 44071 1727204649.72439: done getting variables 44071 1727204649.72500: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:04:09 -0400 (0:00:00.067) 0:01:02.041 ***** 44071 1727204649.72528: entering _queue_task() for managed-node2/dnf 44071 1727204649.72835: worker is 1 (out of 1 available) 44071 1727204649.72849: exiting _queue_task() for managed-node2/dnf 44071 1727204649.72864: done queuing things up, now waiting for results queue to drain 44071 1727204649.72867: waiting for pending results... 
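
The DNF check task at main.yml:36 uses the dnf action. Its trace below confirms two gates, ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7 (True) and __network_wireless_connections_defined or __network_team_connections_defined (False, hence the skip), but it does not print the module arguments. The sketch fills those in with illustrative values only: the package name and the use of check_mode are assumptions.

# Conditions are confirmed by the trace; the dnf arguments are illustrative assumptions.
- name: >-
    Check if updates for network packages are available through the DNF
    package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: NetworkManager     # assumed package of interest
    state: latest
  check_mode: true           # assumed: report only, never install from this check
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined
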
44071 1727204649.73087: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204649.73206: in run() - task 127b8e07-fff9-c964-7471-0000000010fa 44071 1727204649.73331: variable 'ansible_search_path' from source: unknown 44071 1727204649.73335: variable 'ansible_search_path' from source: unknown 44071 1727204649.73338: calling self._execute() 44071 1727204649.73355: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204649.73364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204649.73373: variable 'omit' from source: magic vars 44071 1727204649.73697: variable 'ansible_distribution_major_version' from source: facts 44071 1727204649.73708: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204649.73884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204649.75676: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204649.75729: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204649.75763: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204649.75794: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204649.75816: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204649.75889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204649.75922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204649.75945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204649.75980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204649.75991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204649.76093: variable 'ansible_distribution' from source: facts 44071 1727204649.76097: variable 'ansible_distribution_major_version' from source: facts 44071 1727204649.76105: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 44071 1727204649.76197: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204649.76296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204649.76314: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204649.76331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204649.76362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204649.76375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204649.76411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204649.76429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204649.76449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204649.76479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204649.76491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204649.76523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204649.76543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204649.76560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204649.76590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204649.76602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204649.76714: variable 'network_connections' from source: include params 44071 1727204649.76727: variable 'interface' from source: play vars 44071 1727204649.76777: variable 'interface' from source: play vars 44071 1727204649.76841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204649.76973: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204649.77004: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204649.77031: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204649.77058: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204649.77094: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204649.77112: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204649.77135: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204649.77158: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204649.77217: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204649.77772: variable 'network_connections' from source: include params 44071 1727204649.77972: variable 'interface' from source: play vars 44071 1727204649.77976: variable 'interface' from source: play vars 44071 1727204649.77979: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204649.77982: when evaluation is False, skipping this task 44071 1727204649.77983: _execute() done 44071 1727204649.77986: dumping result to json 44071 1727204649.77988: done dumping result, returning 44071 1727204649.77990: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-0000000010fa] 44071 1727204649.77993: sending task result for task 127b8e07-fff9-c964-7471-0000000010fa skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204649.78157: no more pending results, returning what we have 44071 1727204649.78161: results queue empty 44071 1727204649.78163: checking for any_errors_fatal 44071 1727204649.78250: done checking for any_errors_fatal 44071 1727204649.78252: checking for max_fail_percentage 44071 1727204649.78255: done checking for max_fail_percentage 44071 1727204649.78256: checking to see if all hosts have failed and the running result is not ok 44071 1727204649.78257: done checking to see if all hosts have failed 44071 1727204649.78257: getting the remaining hosts for this loop 44071 1727204649.78259: done getting the remaining hosts for this loop 44071 1727204649.78265: getting the next task for host managed-node2 44071 1727204649.78275: done getting next task for host managed-node2 44071 1727204649.78280: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204649.78286: ^ state is: 
HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204649.78313: getting variables 44071 1727204649.78314: in VariableManager get_vars() 44071 1727204649.78357: Calling all_inventory to load vars for managed-node2 44071 1727204649.78360: Calling groups_inventory to load vars for managed-node2 44071 1727204649.78362: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204649.78479: done sending task result for task 127b8e07-fff9-c964-7471-0000000010fa 44071 1727204649.78483: WORKER PROCESS EXITING 44071 1727204649.78561: Calling all_plugins_play to load vars for managed-node2 44071 1727204649.78568: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204649.78572: Calling groups_plugins_play to load vars for managed-node2 44071 1727204649.79804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204649.81338: done with get_vars() 44071 1727204649.81381: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204649.81464: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:04:09 -0400 (0:00:00.089) 0:01:02.131 ***** 44071 1727204649.81503: entering _queue_task() for managed-node2/yum 44071 1727204649.81910: worker is 1 (out of 1 available) 44071 1727204649.81925: exiting _queue_task() for managed-node2/yum 44071 1727204649.81943: done queuing things up, now waiting for results queue to drain 44071 1727204649.81945: waiting for pending results... 
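The YUM variant of the same check (roles/network/tasks/main.yml:48) is queued next; on this host the yum action is redirected to dnf, and the task is skipped because the distribution major version is not below 8. A hedged sketch, with the module arguments assumed rather than read from the role:

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:               # redirected to ansible.builtin.dnf on this host, as the trace notes
    list: "{{ network_packages }}"   # assumption
  when:
    - ansible_distribution_major_version | int < 8
    - __network_wireless_connections_defined or __network_team_connections_defined   # presumably present, but never reached in this run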
44071 1727204649.82290: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204649.82572: in run() - task 127b8e07-fff9-c964-7471-0000000010fb 44071 1727204649.82577: variable 'ansible_search_path' from source: unknown 44071 1727204649.82580: variable 'ansible_search_path' from source: unknown 44071 1727204649.82583: calling self._execute() 44071 1727204649.82675: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204649.82689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204649.82707: variable 'omit' from source: magic vars 44071 1727204649.83160: variable 'ansible_distribution_major_version' from source: facts 44071 1727204649.83207: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204649.83876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204649.88955: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204649.89099: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204649.89262: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204649.89310: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204649.89388: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204649.89563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204649.89873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204649.89876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204649.89926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204649.90030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204649.90276: variable 'ansible_distribution_major_version' from source: facts 44071 1727204649.90300: Evaluated conditional (ansible_distribution_major_version | int < 8): False 44071 1727204649.90307: when evaluation is False, skipping this task 44071 1727204649.90313: _execute() done 44071 1727204649.90320: dumping result to json 44071 1727204649.90327: done dumping result, returning 44071 1727204649.90559: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-0000000010fb] 44071 
1727204649.90563: sending task result for task 127b8e07-fff9-c964-7471-0000000010fb 44071 1727204649.90648: done sending task result for task 127b8e07-fff9-c964-7471-0000000010fb 44071 1727204649.90651: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 44071 1727204649.90718: no more pending results, returning what we have 44071 1727204649.90723: results queue empty 44071 1727204649.90724: checking for any_errors_fatal 44071 1727204649.90735: done checking for any_errors_fatal 44071 1727204649.90736: checking for max_fail_percentage 44071 1727204649.90738: done checking for max_fail_percentage 44071 1727204649.90739: checking to see if all hosts have failed and the running result is not ok 44071 1727204649.90740: done checking to see if all hosts have failed 44071 1727204649.90741: getting the remaining hosts for this loop 44071 1727204649.90743: done getting the remaining hosts for this loop 44071 1727204649.90748: getting the next task for host managed-node2 44071 1727204649.90757: done getting next task for host managed-node2 44071 1727204649.90762: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204649.90771: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204649.90796: getting variables 44071 1727204649.90798: in VariableManager get_vars() 44071 1727204649.90846: Calling all_inventory to load vars for managed-node2 44071 1727204649.90849: Calling groups_inventory to load vars for managed-node2 44071 1727204649.90852: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204649.90864: Calling all_plugins_play to load vars for managed-node2 44071 1727204649.91073: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204649.91078: Calling groups_plugins_play to load vars for managed-node2 44071 1727204649.93917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204649.95244: done with get_vars() 44071 1727204649.95277: done getting variables 44071 1727204649.95328: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:04:09 -0400 (0:00:00.138) 0:01:02.270 ***** 44071 1727204649.95361: entering _queue_task() for managed-node2/fail 44071 1727204649.95664: worker is 1 (out of 1 available) 44071 1727204649.95683: exiting _queue_task() for managed-node2/fail 44071 1727204649.95699: done queuing things up, now waiting for results queue to drain 44071 1727204649.95700: waiting for pending results... 
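The fail task queued here (roles/network/tasks/main.yml:60) guards against restarting NetworkManager without consent; it is skipped below on the same wireless/team condition. A minimal sketch, with the message text invented purely for illustration:

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: Restarting NetworkManager is required for wireless or team interfaces; set the role's consent flag to proceed.   # wording is an assumption
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined
    # the real task presumably also checks a user-consent variable that this trace never evaluates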
44071 1727204649.95926: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204649.96311: in run() - task 127b8e07-fff9-c964-7471-0000000010fc 44071 1727204649.96316: variable 'ansible_search_path' from source: unknown 44071 1727204649.96319: variable 'ansible_search_path' from source: unknown 44071 1727204649.96322: calling self._execute() 44071 1727204649.96325: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204649.96328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204649.96331: variable 'omit' from source: magic vars 44071 1727204649.96795: variable 'ansible_distribution_major_version' from source: facts 44071 1727204649.96818: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204649.97018: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204649.97195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204649.99373: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204649.99378: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204649.99380: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204649.99418: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204649.99454: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204649.99552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204649.99596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204649.99634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204649.99686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204649.99707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204649.99780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204649.99815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204649.99843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204649.99893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204649.99912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204649.99968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204649.99999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204650.00033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204650.00085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204650.00106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204650.00371: variable 'network_connections' from source: include params 44071 1727204650.00374: variable 'interface' from source: play vars 44071 1727204650.00407: variable 'interface' from source: play vars 44071 1727204650.00496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204650.00726: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204650.00781: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204650.00864: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204650.00876: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204650.00900: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204650.00917: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204650.00939: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204650.00958: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204650.01018: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204650.01219: variable 'network_connections' 
from source: include params 44071 1727204650.01225: variable 'interface' from source: play vars 44071 1727204650.01281: variable 'interface' from source: play vars 44071 1727204650.01310: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204650.01314: when evaluation is False, skipping this task 44071 1727204650.01317: _execute() done 44071 1727204650.01319: dumping result to json 44071 1727204650.01322: done dumping result, returning 44071 1727204650.01329: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-0000000010fc] 44071 1727204650.01335: sending task result for task 127b8e07-fff9-c964-7471-0000000010fc 44071 1727204650.01443: done sending task result for task 127b8e07-fff9-c964-7471-0000000010fc 44071 1727204650.01446: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204650.01505: no more pending results, returning what we have 44071 1727204650.01509: results queue empty 44071 1727204650.01510: checking for any_errors_fatal 44071 1727204650.01517: done checking for any_errors_fatal 44071 1727204650.01518: checking for max_fail_percentage 44071 1727204650.01520: done checking for max_fail_percentage 44071 1727204650.01521: checking to see if all hosts have failed and the running result is not ok 44071 1727204650.01521: done checking to see if all hosts have failed 44071 1727204650.01522: getting the remaining hosts for this loop 44071 1727204650.01524: done getting the remaining hosts for this loop 44071 1727204650.01528: getting the next task for host managed-node2 44071 1727204650.01540: done getting next task for host managed-node2 44071 1727204650.01544: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 44071 1727204650.01550: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204650.01582: getting variables 44071 1727204650.01583: in VariableManager get_vars() 44071 1727204650.01624: Calling all_inventory to load vars for managed-node2 44071 1727204650.01627: Calling groups_inventory to load vars for managed-node2 44071 1727204650.01629: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204650.01642: Calling all_plugins_play to load vars for managed-node2 44071 1727204650.01644: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204650.01647: Calling groups_plugins_play to load vars for managed-node2 44071 1727204650.03009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204650.04986: done with get_vars() 44071 1727204650.05016: done getting variables 44071 1727204650.05073: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:04:10 -0400 (0:00:00.097) 0:01:02.367 ***** 44071 1727204650.05103: entering _queue_task() for managed-node2/package 44071 1727204650.05404: worker is 1 (out of 1 available) 44071 1727204650.05421: exiting _queue_task() for managed-node2/package 44071 1727204650.05439: done queuing things up, now waiting for results queue to drain 44071 1727204650.05441: waiting for pending results... 44071 1727204650.05652: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 44071 1727204650.05771: in run() - task 127b8e07-fff9-c964-7471-0000000010fd 44071 1727204650.05786: variable 'ansible_search_path' from source: unknown 44071 1727204650.05791: variable 'ansible_search_path' from source: unknown 44071 1727204650.05825: calling self._execute() 44071 1727204650.05912: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204650.05918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204650.05927: variable 'omit' from source: magic vars 44071 1727204650.06253: variable 'ansible_distribution_major_version' from source: facts 44071 1727204650.06265: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204650.06419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204650.06642: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204650.06770: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204650.06774: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204650.06875: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204650.07011: variable 'network_packages' from source: role '' defaults 44071 1727204650.07144: variable '__network_provider_setup' from source: role '' defaults 44071 1727204650.07163: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204650.07237: variable 
'__network_service_name_default_nm' from source: role '' defaults 44071 1727204650.07251: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204650.07321: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204650.07573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204650.09260: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204650.09315: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204650.09346: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204650.09374: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204650.09399: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204650.09469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204650.09497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204650.09513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204650.09543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204650.09554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204650.09593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204650.09615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204650.09636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204650.09662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204650.09675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204650.09841: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204650.09934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204650.09949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204650.09969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204650.09997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204650.10008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204650.10083: variable 'ansible_python' from source: facts 44071 1727204650.10098: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204650.10162: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204650.10223: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204650.10321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204650.10340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204650.10359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204650.10393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204650.10403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204650.10443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204650.10463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204650.10485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204650.10513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204650.10524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204650.10635: variable 'network_connections' from source: include params 44071 1727204650.10639: variable 'interface' from source: play vars 44071 1727204650.10719: variable 'interface' from source: play vars 44071 1727204650.10959: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204650.10982: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204650.11004: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204650.11034: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204650.11072: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204650.11284: variable 'network_connections' from source: include params 44071 1727204650.11287: variable 'interface' from source: play vars 44071 1727204650.11369: variable 'interface' from source: play vars 44071 1727204650.11410: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204650.11475: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204650.11697: variable 'network_connections' from source: include params 44071 1727204650.11700: variable 'interface' from source: play vars 44071 1727204650.11748: variable 'interface' from source: play vars 44071 1727204650.11771: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204650.11834: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204650.12047: variable 'network_connections' from source: include params 44071 1727204650.12051: variable 'interface' from source: play vars 44071 1727204650.12100: variable 'interface' from source: play vars 44071 1727204650.12153: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204650.12198: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204650.12204: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204650.12251: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204650.12410: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204650.12747: variable 'network_connections' from source: include params 44071 1727204650.12751: variable 'interface' from source: play vars 44071 1727204650.12801: variable 'interface' from source: play vars 44071 1727204650.12809: variable 'ansible_distribution' from source: facts 44071 1727204650.12812: variable '__network_rh_distros' from source: role '' defaults 44071 1727204650.12819: variable 'ansible_distribution_major_version' from source: facts 44071 1727204650.12841: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204650.12959: variable 'ansible_distribution' from source: 
facts 44071 1727204650.12963: variable '__network_rh_distros' from source: role '' defaults 44071 1727204650.12968: variable 'ansible_distribution_major_version' from source: facts 44071 1727204650.12976: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204650.13092: variable 'ansible_distribution' from source: facts 44071 1727204650.13096: variable '__network_rh_distros' from source: role '' defaults 44071 1727204650.13099: variable 'ansible_distribution_major_version' from source: facts 44071 1727204650.13130: variable 'network_provider' from source: set_fact 44071 1727204650.13143: variable 'ansible_facts' from source: unknown 44071 1727204650.13721: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 44071 1727204650.13725: when evaluation is False, skipping this task 44071 1727204650.13729: _execute() done 44071 1727204650.13734: dumping result to json 44071 1727204650.13736: done dumping result, returning 44071 1727204650.13739: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-c964-7471-0000000010fd] 44071 1727204650.13745: sending task result for task 127b8e07-fff9-c964-7471-0000000010fd 44071 1727204650.13852: done sending task result for task 127b8e07-fff9-c964-7471-0000000010fd 44071 1727204650.13855: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 44071 1727204650.13909: no more pending results, returning what we have 44071 1727204650.13913: results queue empty 44071 1727204650.13914: checking for any_errors_fatal 44071 1727204650.13923: done checking for any_errors_fatal 44071 1727204650.13924: checking for max_fail_percentage 44071 1727204650.13926: done checking for max_fail_percentage 44071 1727204650.13927: checking to see if all hosts have failed and the running result is not ok 44071 1727204650.13928: done checking to see if all hosts have failed 44071 1727204650.13929: getting the remaining hosts for this loop 44071 1727204650.13930: done getting the remaining hosts for this loop 44071 1727204650.13938: getting the next task for host managed-node2 44071 1727204650.13946: done getting next task for host managed-node2 44071 1727204650.13951: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204650.13956: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204650.13987: getting variables 44071 1727204650.13989: in VariableManager get_vars() 44071 1727204650.14038: Calling all_inventory to load vars for managed-node2 44071 1727204650.14040: Calling groups_inventory to load vars for managed-node2 44071 1727204650.14043: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204650.14054: Calling all_plugins_play to load vars for managed-node2 44071 1727204650.14056: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204650.14059: Calling groups_plugins_play to load vars for managed-node2 44071 1727204650.15252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204650.16486: done with get_vars() 44071 1727204650.16517: done getting variables 44071 1727204650.16572: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:04:10 -0400 (0:00:00.114) 0:01:02.482 ***** 44071 1727204650.16603: entering _queue_task() for managed-node2/package 44071 1727204650.16907: worker is 1 (out of 1 available) 44071 1727204650.16923: exiting _queue_task() for managed-node2/package 44071 1727204650.16940: done queuing things up, now waiting for results queue to drain 44071 1727204650.16942: waiting for pending results... 
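Two package tasks flow through here: Install packages (roles/network/tasks/main.yml:73), skipped above because network_packages is already a subset of the gathered package facts, and the nmstate installer queued just now (main.yml:85), skipped below because network_state is empty. A rough sketch of both, with package names and the state argument inferred from the task titles rather than read from the role:

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present                     # assumption
  when:
    - not network_packages is subset(ansible_facts.packages.keys())

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager                 # package names inferred from the task title
      - nmstate
    state: present                     # assumption
  when:
    - network_state != {}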
44071 1727204650.17168: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204650.17317: in run() - task 127b8e07-fff9-c964-7471-0000000010fe 44071 1727204650.17322: variable 'ansible_search_path' from source: unknown 44071 1727204650.17326: variable 'ansible_search_path' from source: unknown 44071 1727204650.17374: calling self._execute() 44071 1727204650.17483: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204650.17508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204650.17512: variable 'omit' from source: magic vars 44071 1727204650.17926: variable 'ansible_distribution_major_version' from source: facts 44071 1727204650.17948: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204650.18067: variable 'network_state' from source: role '' defaults 44071 1727204650.18090: Evaluated conditional (network_state != {}): False 44071 1727204650.18105: when evaluation is False, skipping this task 44071 1727204650.18109: _execute() done 44071 1727204650.18112: dumping result to json 44071 1727204650.18115: done dumping result, returning 44071 1727204650.18118: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-c964-7471-0000000010fe] 44071 1727204650.18122: sending task result for task 127b8e07-fff9-c964-7471-0000000010fe 44071 1727204650.18244: done sending task result for task 127b8e07-fff9-c964-7471-0000000010fe 44071 1727204650.18247: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204650.18310: no more pending results, returning what we have 44071 1727204650.18315: results queue empty 44071 1727204650.18316: checking for any_errors_fatal 44071 1727204650.18329: done checking for any_errors_fatal 44071 1727204650.18330: checking for max_fail_percentage 44071 1727204650.18332: done checking for max_fail_percentage 44071 1727204650.18333: checking to see if all hosts have failed and the running result is not ok 44071 1727204650.18333: done checking to see if all hosts have failed 44071 1727204650.18334: getting the remaining hosts for this loop 44071 1727204650.18336: done getting the remaining hosts for this loop 44071 1727204650.18341: getting the next task for host managed-node2 44071 1727204650.18349: done getting next task for host managed-node2 44071 1727204650.18353: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204650.18360: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204650.18387: getting variables 44071 1727204650.18388: in VariableManager get_vars() 44071 1727204650.18460: Calling all_inventory to load vars for managed-node2 44071 1727204650.18463: Calling groups_inventory to load vars for managed-node2 44071 1727204650.18467: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204650.18477: Calling all_plugins_play to load vars for managed-node2 44071 1727204650.18480: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204650.18483: Calling groups_plugins_play to load vars for managed-node2 44071 1727204650.19506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204650.20714: done with get_vars() 44071 1727204650.20746: done getting variables 44071 1727204650.20797: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:04:10 -0400 (0:00:00.042) 0:01:02.524 ***** 44071 1727204650.20826: entering _queue_task() for managed-node2/package 44071 1727204650.21129: worker is 1 (out of 1 available) 44071 1727204650.21144: exiting _queue_task() for managed-node2/package 44071 1727204650.21159: done queuing things up, now waiting for results queue to drain 44071 1727204650.21161: waiting for pending results... 
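The next task (roles/network/tasks/main.yml:96) installs the nmstate Python bindings and is gated on the same network_state check, so it is skipped here as well. A minimal sketch under that assumption:

- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate
    state: present                     # assumption
  when:
    - network_state != {}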
44071 1727204650.21376: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204650.21496: in run() - task 127b8e07-fff9-c964-7471-0000000010ff 44071 1727204650.21510: variable 'ansible_search_path' from source: unknown 44071 1727204650.21514: variable 'ansible_search_path' from source: unknown 44071 1727204650.21551: calling self._execute() 44071 1727204650.21640: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204650.21646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204650.21655: variable 'omit' from source: magic vars 44071 1727204650.21979: variable 'ansible_distribution_major_version' from source: facts 44071 1727204650.21990: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204650.22092: variable 'network_state' from source: role '' defaults 44071 1727204650.22103: Evaluated conditional (network_state != {}): False 44071 1727204650.22106: when evaluation is False, skipping this task 44071 1727204650.22109: _execute() done 44071 1727204650.22112: dumping result to json 44071 1727204650.22115: done dumping result, returning 44071 1727204650.22125: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-c964-7471-0000000010ff] 44071 1727204650.22130: sending task result for task 127b8e07-fff9-c964-7471-0000000010ff 44071 1727204650.22239: done sending task result for task 127b8e07-fff9-c964-7471-0000000010ff 44071 1727204650.22242: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204650.22315: no more pending results, returning what we have 44071 1727204650.22321: results queue empty 44071 1727204650.22322: checking for any_errors_fatal 44071 1727204650.22329: done checking for any_errors_fatal 44071 1727204650.22330: checking for max_fail_percentage 44071 1727204650.22331: done checking for max_fail_percentage 44071 1727204650.22332: checking to see if all hosts have failed and the running result is not ok 44071 1727204650.22333: done checking to see if all hosts have failed 44071 1727204650.22334: getting the remaining hosts for this loop 44071 1727204650.22336: done getting the remaining hosts for this loop 44071 1727204650.22340: getting the next task for host managed-node2 44071 1727204650.22349: done getting next task for host managed-node2 44071 1727204650.22353: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204650.22359: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204650.22382: getting variables 44071 1727204650.22383: in VariableManager get_vars() 44071 1727204650.22422: Calling all_inventory to load vars for managed-node2 44071 1727204650.22424: Calling groups_inventory to load vars for managed-node2 44071 1727204650.22426: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204650.22436: Calling all_plugins_play to load vars for managed-node2 44071 1727204650.22439: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204650.22442: Calling groups_plugins_play to load vars for managed-node2 44071 1727204650.23583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204650.24803: done with get_vars() 44071 1727204650.24833: done getting variables 44071 1727204650.24887: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:04:10 -0400 (0:00:00.040) 0:01:02.565 ***** 44071 1727204650.24917: entering _queue_task() for managed-node2/service 44071 1727204650.25212: worker is 1 (out of 1 available) 44071 1727204650.25228: exiting _queue_task() for managed-node2/service 44071 1727204650.25243: done queuing things up, now waiting for results queue to drain 44071 1727204650.25245: waiting for pending results... 
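[editor's note] The libnmstate skip above is the product of the task's when: conditions being templated against the host's variables: ansible_distribution_major_version != '6' passed, but network_state != {} did not, because network_state still carries its empty role default. Below is a minimal illustration of that second check using plain Jinja2; it is a simplification of the idea, not Ansible's actual Conditional plumbing, and the result dict is only modeled on the "skipping:" output shown above.

    from jinja2 import Environment

    env = Environment()
    # the role default seen in this trace: network_state resolves to an empty dict
    host_vars = {"network_state": {}}

    check = env.compile_expression("network_state != {}")   # the task's failing condition
    if not check(**host_vars):
        # roughly the shape of the skip result the worker sends back
        result = {
            "changed": False,
            "false_condition": "network_state != {}",
            "skip_reason": "Conditional result was False",
        }
        print(result)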
44071 1727204650.25456: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204650.25592: in run() - task 127b8e07-fff9-c964-7471-000000001100 44071 1727204650.25605: variable 'ansible_search_path' from source: unknown 44071 1727204650.25610: variable 'ansible_search_path' from source: unknown 44071 1727204650.25648: calling self._execute() 44071 1727204650.25728: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204650.25736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204650.25745: variable 'omit' from source: magic vars 44071 1727204650.26058: variable 'ansible_distribution_major_version' from source: facts 44071 1727204650.26071: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204650.26164: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204650.26329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204650.28373: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204650.28377: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204650.28380: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204650.28383: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204650.28386: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204650.28418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204650.28479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204650.28512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204650.28568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204650.28592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204650.28657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204650.28689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204650.28721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 44071 1727204650.28786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204650.28808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204650.28863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204650.28894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204650.28927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204650.28979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204650.29000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204650.29208: variable 'network_connections' from source: include params 44071 1727204650.29229: variable 'interface' from source: play vars 44071 1727204650.29315: variable 'interface' from source: play vars 44071 1727204650.29407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204650.29545: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204650.29586: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204650.29613: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204650.29639: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204650.29683: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204650.29699: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204650.29718: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204650.29739: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204650.29799: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204650.29981: variable 'network_connections' from source: include params 44071 1727204650.29985: variable 'interface' 
from source: play vars 44071 1727204650.30042: variable 'interface' from source: play vars 44071 1727204650.30070: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204650.30074: when evaluation is False, skipping this task 44071 1727204650.30077: _execute() done 44071 1727204650.30079: dumping result to json 44071 1727204650.30082: done dumping result, returning 44071 1727204650.30090: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000001100] 44071 1727204650.30095: sending task result for task 127b8e07-fff9-c964-7471-000000001100 44071 1727204650.30199: done sending task result for task 127b8e07-fff9-c964-7471-000000001100 44071 1727204650.30209: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204650.30262: no more pending results, returning what we have 44071 1727204650.30267: results queue empty 44071 1727204650.30269: checking for any_errors_fatal 44071 1727204650.30276: done checking for any_errors_fatal 44071 1727204650.30277: checking for max_fail_percentage 44071 1727204650.30279: done checking for max_fail_percentage 44071 1727204650.30280: checking to see if all hosts have failed and the running result is not ok 44071 1727204650.30281: done checking to see if all hosts have failed 44071 1727204650.30281: getting the remaining hosts for this loop 44071 1727204650.30283: done getting the remaining hosts for this loop 44071 1727204650.30288: getting the next task for host managed-node2 44071 1727204650.30296: done getting next task for host managed-node2 44071 1727204650.30301: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204650.30306: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204650.30331: getting variables 44071 1727204650.30332: in VariableManager get_vars() 44071 1727204650.30381: Calling all_inventory to load vars for managed-node2 44071 1727204650.30384: Calling groups_inventory to load vars for managed-node2 44071 1727204650.30386: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204650.30397: Calling all_plugins_play to load vars for managed-node2 44071 1727204650.30400: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204650.30402: Calling groups_plugins_play to load vars for managed-node2 44071 1727204650.31783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204650.34314: done with get_vars() 44071 1727204650.34362: done getting variables 44071 1727204650.34435: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:04:10 -0400 (0:00:00.095) 0:01:02.661 ***** 44071 1727204650.34479: entering _queue_task() for managed-node2/service 44071 1727204650.35000: worker is 1 (out of 1 available) 44071 1727204650.35013: exiting _queue_task() for managed-node2/service 44071 1727204650.35025: done queuing things up, now waiting for results queue to drain 44071 1727204650.35027: waiting for pending results... 44071 1727204650.35448: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204650.35540: in run() - task 127b8e07-fff9-c964-7471-000000001101 44071 1727204650.35544: variable 'ansible_search_path' from source: unknown 44071 1727204650.35547: variable 'ansible_search_path' from source: unknown 44071 1727204650.35575: calling self._execute() 44071 1727204650.35699: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204650.35714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204650.35728: variable 'omit' from source: magic vars 44071 1727204650.36200: variable 'ansible_distribution_major_version' from source: facts 44071 1727204650.36222: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204650.36473: variable 'network_provider' from source: set_fact 44071 1727204650.36476: variable 'network_state' from source: role '' defaults 44071 1727204650.36479: Evaluated conditional (network_provider == "nm" or network_state != {}): True 44071 1727204650.36481: variable 'omit' from source: magic vars 44071 1727204650.36539: variable 'omit' from source: magic vars 44071 1727204650.36572: variable 'network_service_name' from source: role '' defaults 44071 1727204650.36648: variable 'network_service_name' from source: role '' defaults 44071 1727204650.36768: variable '__network_provider_setup' from source: role '' defaults 44071 1727204650.36782: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204650.36958: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204650.36964: variable '__network_packages_default_nm' from source: role '' defaults 
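[editor's note] Once the provider defaults above are resolved, the rest of this task's trace shows the service action connecting over the existing SSH ControlMaster session, creating a remote temp directory, pushing the AnsiballZ-wrapped systemd module over SFTP, and running it with the remote Python interpreter. The sketch below condenses that staging-and-execute sequence; it is not Ansible's _low_level_execute_command() code, the host and ControlPath are placeholders rather than this run's values, and AnsiballZ_systemd.py is assumed to exist locally.

    # Condensed, illustrative version of the remote staging/execute steps in the trace below.
    import shlex
    import subprocess

    HOST = "root@managed-node"                                  # placeholder target
    CP = ["-o", "ControlMaster=auto",
          "-o", "ControlPath=/tmp/ssh-mux-%C"]                  # reuse a multiplexed master, as the trace does

    def ssh_sh(cmd: str) -> subprocess.CompletedProcess:
        """Run cmd on the target through /bin/sh, like the low-level exec calls in the log."""
        return subprocess.run(["ssh", *CP, HOST, "/bin/sh -c " + shlex.quote(cmd)],
                              capture_output=True, text=True)

    home = ssh_sh("echo ~ && sleep 0").stdout.strip()           # step 1: find the remote home
    tmp = f"{home}/.ansible/tmp/ansible-tmp-demo"               # step 2: illustrative temp dir name
    ssh_sh(f"umask 77 && mkdir -p {tmp} && sleep 0")

    # step 3: push the wrapped module with sftp (batch commands on stdin), then mark it executable
    subprocess.run(["sftp", *CP, "-b", "-", HOST],
                   input=f"put AnsiballZ_systemd.py {tmp}/AnsiballZ_systemd.py\n",
                   capture_output=True, text=True)
    ssh_sh(f"chmod u+x {tmp} {tmp}/AnsiballZ_systemd.py && sleep 0")

    # step 4: run it with the remote Python; the module prints its JSON result on stdout
    result = ssh_sh(f"/usr/bin/python3.12 {tmp}/AnsiballZ_systemd.py && sleep 0")
    print(result.stdout)

Reusing the ControlMaster socket is why the ssh debug output below keeps reporting "auto-mux: Trying existing master": every low-level command rides the already-established connection instead of renegotiating a new SSH session.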
44071 1727204650.36968: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204650.37245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204650.39921: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204650.40167: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204650.40175: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204650.40177: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204650.40179: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204650.40252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204650.40299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204650.40336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204650.40390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204650.40419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204650.40481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204650.40519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204650.40554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204650.40604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204650.40636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204650.40917: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204650.41076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204650.41106: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204650.41168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204650.41197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204650.41218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204650.41330: variable 'ansible_python' from source: facts 44071 1727204650.41384: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204650.41463: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204650.41561: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204650.41726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204650.41818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204650.41822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204650.41849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204650.41871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204650.41940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204650.41984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204650.42014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204650.42076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204650.42144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204650.42283: variable 'network_connections' from 
source: include params 44071 1727204650.42297: variable 'interface' from source: play vars 44071 1727204650.42398: variable 'interface' from source: play vars 44071 1727204650.42538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204650.42776: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204650.42851: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204650.42906: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204650.42972: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204650.43052: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204650.43095: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204650.43143: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204650.43187: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204650.43258: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204650.43609: variable 'network_connections' from source: include params 44071 1727204650.43633: variable 'interface' from source: play vars 44071 1727204650.43740: variable 'interface' from source: play vars 44071 1727204650.43795: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204650.43957: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204650.44261: variable 'network_connections' from source: include params 44071 1727204650.44274: variable 'interface' from source: play vars 44071 1727204650.44363: variable 'interface' from source: play vars 44071 1727204650.44403: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204650.44503: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204650.44873: variable 'network_connections' from source: include params 44071 1727204650.44884: variable 'interface' from source: play vars 44071 1727204650.44968: variable 'interface' from source: play vars 44071 1727204650.45152: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204650.45155: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204650.45157: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204650.45205: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204650.45471: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204650.46069: variable 'network_connections' from source: include params 44071 1727204650.46082: variable 'interface' from source: play vars 44071 1727204650.46159: variable 'interface' from 
source: play vars 44071 1727204650.46177: variable 'ansible_distribution' from source: facts 44071 1727204650.46185: variable '__network_rh_distros' from source: role '' defaults 44071 1727204650.46195: variable 'ansible_distribution_major_version' from source: facts 44071 1727204650.46223: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204650.46471: variable 'ansible_distribution' from source: facts 44071 1727204650.46476: variable '__network_rh_distros' from source: role '' defaults 44071 1727204650.46478: variable 'ansible_distribution_major_version' from source: facts 44071 1727204650.46480: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204650.46660: variable 'ansible_distribution' from source: facts 44071 1727204650.46673: variable '__network_rh_distros' from source: role '' defaults 44071 1727204650.46682: variable 'ansible_distribution_major_version' from source: facts 44071 1727204650.46728: variable 'network_provider' from source: set_fact 44071 1727204650.46762: variable 'omit' from source: magic vars 44071 1727204650.46971: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204650.46975: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204650.46977: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204650.46979: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204650.46981: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204650.46983: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204650.46986: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204650.46988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204650.47064: Set connection var ansible_connection to ssh 44071 1727204650.47079: Set connection var ansible_timeout to 10 44071 1727204650.47091: Set connection var ansible_pipelining to False 44071 1727204650.47109: Set connection var ansible_shell_type to sh 44071 1727204650.47120: Set connection var ansible_shell_executable to /bin/sh 44071 1727204650.47134: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204650.47169: variable 'ansible_shell_executable' from source: unknown 44071 1727204650.47178: variable 'ansible_connection' from source: unknown 44071 1727204650.47185: variable 'ansible_module_compression' from source: unknown 44071 1727204650.47192: variable 'ansible_shell_type' from source: unknown 44071 1727204650.47198: variable 'ansible_shell_executable' from source: unknown 44071 1727204650.47210: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204650.47224: variable 'ansible_pipelining' from source: unknown 44071 1727204650.47234: variable 'ansible_timeout' from source: unknown 44071 1727204650.47244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204650.47434: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204650.47448: variable 'omit' from source: magic vars 44071 1727204650.47450: starting attempt loop 44071 1727204650.47453: running the handler 44071 1727204650.47524: variable 'ansible_facts' from source: unknown 44071 1727204650.48656: _low_level_execute_command(): starting 44071 1727204650.48673: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204650.49424: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204650.49506: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204650.49580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204650.51360: stdout chunk (state=3): >>>/root <<< 44071 1727204650.51574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204650.51579: stdout chunk (state=3): >>><<< 44071 1727204650.51581: stderr chunk (state=3): >>><<< 44071 1727204650.51604: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204650.51624: _low_level_execute_command(): starting 44071 1727204650.51637: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204650.5161037-47409-51832950328465 `" && echo ansible-tmp-1727204650.5161037-47409-51832950328465="` echo /root/.ansible/tmp/ansible-tmp-1727204650.5161037-47409-51832950328465 `" ) && sleep 0' 44071 1727204650.52359: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204650.52386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204650.52402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204650.52427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204650.52447: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204650.52551: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204650.52570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204650.52585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204650.52606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204650.52713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204650.54759: stdout chunk (state=3): >>>ansible-tmp-1727204650.5161037-47409-51832950328465=/root/.ansible/tmp/ansible-tmp-1727204650.5161037-47409-51832950328465 <<< 44071 1727204650.54972: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204650.54976: stdout chunk (state=3): >>><<< 44071 1727204650.54978: stderr chunk (state=3): >>><<< 44071 1727204650.54994: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204650.5161037-47409-51832950328465=/root/.ansible/tmp/ansible-tmp-1727204650.5161037-47409-51832950328465 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204650.55043: variable 'ansible_module_compression' from source: unknown 44071 1727204650.55112: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 44071 1727204650.55371: variable 'ansible_facts' from source: unknown 44071 1727204650.55412: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204650.5161037-47409-51832950328465/AnsiballZ_systemd.py 44071 1727204650.55617: Sending initial data 44071 1727204650.55627: Sent initial data (155 bytes) 44071 1727204650.56340: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204650.56378: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204650.56490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204650.56525: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204650.56547: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204650.56582: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204650.56704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204650.58402: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204650.58496: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204650.58605: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpi0434hkw /root/.ansible/tmp/ansible-tmp-1727204650.5161037-47409-51832950328465/AnsiballZ_systemd.py <<< 44071 1727204650.58608: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204650.5161037-47409-51832950328465/AnsiballZ_systemd.py" <<< 44071 1727204650.58681: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpi0434hkw" to remote "/root/.ansible/tmp/ansible-tmp-1727204650.5161037-47409-51832950328465/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204650.5161037-47409-51832950328465/AnsiballZ_systemd.py" <<< 44071 1727204650.60597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204650.60771: stderr chunk (state=3): >>><<< 44071 1727204650.60776: stdout chunk (state=3): >>><<< 44071 1727204650.60778: done transferring module to remote 44071 1727204650.60780: _low_level_execute_command(): starting 44071 1727204650.60783: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204650.5161037-47409-51832950328465/ /root/.ansible/tmp/ansible-tmp-1727204650.5161037-47409-51832950328465/AnsiballZ_systemd.py && sleep 0' 44071 1727204650.61437: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204650.61463: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204650.61482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204650.61500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204650.61518: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204650.61569: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204650.61648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204650.61677: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204650.61703: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204650.61808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204650.63790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204650.63804: stdout chunk (state=3): >>><<< 44071 1727204650.63817: stderr chunk (state=3): >>><<< 44071 1727204650.63846: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204650.63855: _low_level_execute_command(): starting 44071 1727204650.63869: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204650.5161037-47409-51832950328465/AnsiballZ_systemd.py && sleep 0' 44071 1727204650.64572: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204650.64592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204650.64608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204650.64636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204650.64654: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204650.64668: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204650.64754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204650.64786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204650.64812: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204650.64833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204650.64962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204650.96855: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", 
"WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4513792", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3515187200", "CPUUsageNSec": "1536850000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": 
"infinity", "LimitSTACKSoft": "8388608", "LimitC<<< 44071 1727204650.96898: stdout chunk (state=3): >>>ORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": 
"NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 44071 1727204650.99073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204650.99078: stdout chunk (state=3): >>><<< 44071 1727204650.99080: stderr chunk (state=3): >>><<< 44071 1727204650.99085: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4513792", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3515187200", "CPUUsageNSec": "1536850000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": 
"infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204650.99205: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204650.5161037-47409-51832950328465/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204650.99225: _low_level_execute_command(): starting 44071 1727204650.99231: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204650.5161037-47409-51832950328465/ > /dev/null 2>&1 && sleep 0' 44071 1727204650.99950: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204650.99960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204650.99976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204650.99992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204651.00004: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204651.00011: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204651.00049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204651.00057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204651.00141: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204651.00164: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204651.00180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204651.00289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204651.02294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204651.02405: stderr chunk (state=3): >>><<< 44071 1727204651.02409: stdout chunk (state=3): >>><<< 44071 1727204651.02572: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204651.02575: handler run complete 44071 1727204651.02578: attempt loop complete, returning result 44071 1727204651.02580: _execute() done 44071 1727204651.02582: dumping result to json 44071 1727204651.02584: done dumping result, returning 44071 1727204651.02586: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-c964-7471-000000001101] 44071 1727204651.02589: sending task result for task 127b8e07-fff9-c964-7471-000000001101 44071 1727204651.02937: done sending task result for task 127b8e07-fff9-c964-7471-000000001101 44071 1727204651.02942: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204651.03014: no more pending results, returning what we have 44071 1727204651.03018: results queue empty 44071 1727204651.03019: checking for any_errors_fatal 44071 1727204651.03026: done checking for any_errors_fatal 44071 1727204651.03027: checking for max_fail_percentage 44071 1727204651.03028: done checking for max_fail_percentage 44071 1727204651.03030: checking to see if all hosts have failed and the running result is not ok 44071 1727204651.03030: done checking to see if all hosts have failed 44071 1727204651.03034: getting the remaining hosts for this loop 44071 1727204651.03036: done getting the remaining hosts for this loop 44071 1727204651.03041: getting the next task for host managed-node2 44071 1727204651.03050: done getting next task for host managed-node2 44071 1727204651.03055: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 
1727204651.03060: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204651.03081: getting variables 44071 1727204651.03083: in VariableManager get_vars() 44071 1727204651.03127: Calling all_inventory to load vars for managed-node2 44071 1727204651.03130: Calling groups_inventory to load vars for managed-node2 44071 1727204651.03385: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204651.03398: Calling all_plugins_play to load vars for managed-node2 44071 1727204651.03401: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204651.03404: Calling groups_plugins_play to load vars for managed-node2 44071 1727204651.05296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204651.07847: done with get_vars() 44071 1727204651.07880: done getting variables 44071 1727204651.07958: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:04:11 -0400 (0:00:00.735) 0:01:03.396 ***** 44071 1727204651.08005: entering _queue_task() for managed-node2/service 44071 1727204651.08535: worker is 1 (out of 1 available) 44071 1727204651.08548: exiting _queue_task() for managed-node2/service 44071 1727204651.08561: done queuing things up, now waiting for results queue to drain 44071 1727204651.08563: waiting for pending results... 
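Note on the task that just completed: "fedora.linux_system_roles.network : Enable and start NetworkManager" ran ansible.legacy.systemd on managed-node2 with module args name=NetworkManager, state=started, enabled=true, scope=system and came back changed=false, since the unit was already enabled and active (UnitFileState=enabled, ActiveState=active); the play output is censored because no_log: true was in effect, and the per-task remote temp directory was removed afterwards. A minimal sketch of a roughly equivalent standalone task (the task name and the use of ansible.builtin.systemd_service instead of the role's internal service call are assumptions, not taken from this log):

- name: Enable and start NetworkManager (illustrative equivalent of the logged invocation)
  ansible.builtin.systemd_service:
    name: NetworkManager
    state: started
    enabled: true
    scope: system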
44071 1727204651.08825: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204651.09015: in run() - task 127b8e07-fff9-c964-7471-000000001102 44071 1727204651.09155: variable 'ansible_search_path' from source: unknown 44071 1727204651.09159: variable 'ansible_search_path' from source: unknown 44071 1727204651.09162: calling self._execute() 44071 1727204651.09208: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204651.09221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204651.09237: variable 'omit' from source: magic vars 44071 1727204651.09686: variable 'ansible_distribution_major_version' from source: facts 44071 1727204651.09712: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204651.09856: variable 'network_provider' from source: set_fact 44071 1727204651.09869: Evaluated conditional (network_provider == "nm"): True 44071 1727204651.09979: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204651.10087: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204651.10306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204651.12985: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204651.13076: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204651.13271: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204651.13275: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204651.13278: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204651.13323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204651.13364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204651.13408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204651.13460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204651.13483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204651.13553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204651.13588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 44071 1727204651.13628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204651.13682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204651.13704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204651.13770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204651.13801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204651.13842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204651.13942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204651.13946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204651.14101: variable 'network_connections' from source: include params 44071 1727204651.14121: variable 'interface' from source: play vars 44071 1727204651.14213: variable 'interface' from source: play vars 44071 1727204651.14312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204651.14522: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204651.14574: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204651.14617: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204651.14703: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204651.14719: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204651.14749: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204651.14781: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204651.14819: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
44071 1727204651.14881: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204651.15191: variable 'network_connections' from source: include params 44071 1727204651.15203: variable 'interface' from source: play vars 44071 1727204651.15357: variable 'interface' from source: play vars 44071 1727204651.15360: Evaluated conditional (__network_wpa_supplicant_required): False 44071 1727204651.15362: when evaluation is False, skipping this task 44071 1727204651.15365: _execute() done 44071 1727204651.15370: dumping result to json 44071 1727204651.15372: done dumping result, returning 44071 1727204651.15377: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-c964-7471-000000001102] 44071 1727204651.15396: sending task result for task 127b8e07-fff9-c964-7471-000000001102 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 44071 1727204651.15567: no more pending results, returning what we have 44071 1727204651.15572: results queue empty 44071 1727204651.15573: checking for any_errors_fatal 44071 1727204651.15606: done checking for any_errors_fatal 44071 1727204651.15607: checking for max_fail_percentage 44071 1727204651.15609: done checking for max_fail_percentage 44071 1727204651.15610: checking to see if all hosts have failed and the running result is not ok 44071 1727204651.15611: done checking to see if all hosts have failed 44071 1727204651.15612: getting the remaining hosts for this loop 44071 1727204651.15615: done getting the remaining hosts for this loop 44071 1727204651.15620: getting the next task for host managed-node2 44071 1727204651.15630: done getting next task for host managed-node2 44071 1727204651.15637: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204651.15643: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204651.15889: getting variables 44071 1727204651.15891: in VariableManager get_vars() 44071 1727204651.15939: Calling all_inventory to load vars for managed-node2 44071 1727204651.15943: Calling groups_inventory to load vars for managed-node2 44071 1727204651.15945: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204651.16003: Calling all_plugins_play to load vars for managed-node2 44071 1727204651.16008: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204651.16012: Calling groups_plugins_play to load vars for managed-node2 44071 1727204651.16685: done sending task result for task 127b8e07-fff9-c964-7471-000000001102 44071 1727204651.16690: WORKER PROCESS EXITING 44071 1727204651.18100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204651.20440: done with get_vars() 44071 1727204651.20493: done getting variables 44071 1727204651.20571: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:04:11 -0400 (0:00:00.126) 0:01:03.522 ***** 44071 1727204651.20610: entering _queue_task() for managed-node2/service 44071 1727204651.21188: worker is 1 (out of 1 available) 44071 1727204651.21199: exiting _queue_task() for managed-node2/service 44071 1727204651.21214: done queuing things up, now waiting for results queue to drain 44071 1727204651.21216: waiting for pending results... 
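"Enable and start wpa_supplicant" above was skipped: network_provider evaluated to "nm", but __network_wpa_supplicant_required evaluated to False after the role's defaults for __network_ieee802_1x_connections_defined and __network_wireless_connections_defined were checked against network_connections. A hedged sketch of a task guarded the same way (the exact when expression used inside the role is not visible in this log, so this mirrors only the conditionals the log reports):

- name: Enable and start wpa_supplicant (sketch of the skipped pattern)
  ansible.builtin.systemd_service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool  # evaluated False in this run, so the task is skipped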
44071 1727204651.21406: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204651.21602: in run() - task 127b8e07-fff9-c964-7471-000000001103 44071 1727204651.21630: variable 'ansible_search_path' from source: unknown 44071 1727204651.21641: variable 'ansible_search_path' from source: unknown 44071 1727204651.21696: calling self._execute() 44071 1727204651.21820: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204651.21841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204651.21858: variable 'omit' from source: magic vars 44071 1727204651.22335: variable 'ansible_distribution_major_version' from source: facts 44071 1727204651.22358: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204651.22508: variable 'network_provider' from source: set_fact 44071 1727204651.22520: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204651.22539: when evaluation is False, skipping this task 44071 1727204651.22548: _execute() done 44071 1727204651.22556: dumping result to json 44071 1727204651.22568: done dumping result, returning 44071 1727204651.22582: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-c964-7471-000000001103] 44071 1727204651.22592: sending task result for task 127b8e07-fff9-c964-7471-000000001103 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204651.22876: no more pending results, returning what we have 44071 1727204651.22881: results queue empty 44071 1727204651.22882: checking for any_errors_fatal 44071 1727204651.22895: done checking for any_errors_fatal 44071 1727204651.22896: checking for max_fail_percentage 44071 1727204651.22898: done checking for max_fail_percentage 44071 1727204651.22899: checking to see if all hosts have failed and the running result is not ok 44071 1727204651.22900: done checking to see if all hosts have failed 44071 1727204651.22900: getting the remaining hosts for this loop 44071 1727204651.22902: done getting the remaining hosts for this loop 44071 1727204651.22908: getting the next task for host managed-node2 44071 1727204651.22919: done getting next task for host managed-node2 44071 1727204651.22923: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204651.22934: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204651.22962: getting variables 44071 1727204651.22964: in VariableManager get_vars() 44071 1727204651.23127: Calling all_inventory to load vars for managed-node2 44071 1727204651.23134: Calling groups_inventory to load vars for managed-node2 44071 1727204651.23137: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204651.23203: Calling all_plugins_play to load vars for managed-node2 44071 1727204651.23207: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204651.23211: Calling groups_plugins_play to load vars for managed-node2 44071 1727204651.23813: done sending task result for task 127b8e07-fff9-c964-7471-000000001103 44071 1727204651.23818: WORKER PROCESS EXITING 44071 1727204651.25398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204651.27772: done with get_vars() 44071 1727204651.27815: done getting variables 44071 1727204651.27892: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:04:11 -0400 (0:00:00.073) 0:01:03.595 ***** 44071 1727204651.27937: entering _queue_task() for managed-node2/copy 44071 1727204651.28571: worker is 1 (out of 1 available) 44071 1727204651.28585: exiting _queue_task() for managed-node2/copy 44071 1727204651.28598: done queuing things up, now waiting for results queue to drain 44071 1727204651.28600: waiting for pending results... 
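"Enable network service" above was skipped for the complementary reason: the false condition was network_provider == "initscripts", and this run uses the nm provider. Only under the initscripts provider would the role enable the legacy network service; a minimal sketch of that branch, assuming the classic "network" init service is the unit involved:

- name: Enable network service (sketch of the initscripts-provider branch)
  ansible.builtin.service:
    name: network
    state: started
    enabled: true
  when: network_provider == "initscripts"  # False here, so this is skipped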
44071 1727204651.28763: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204651.28955: in run() - task 127b8e07-fff9-c964-7471-000000001104 44071 1727204651.28983: variable 'ansible_search_path' from source: unknown 44071 1727204651.28992: variable 'ansible_search_path' from source: unknown 44071 1727204651.29046: calling self._execute() 44071 1727204651.29175: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204651.29190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204651.29265: variable 'omit' from source: magic vars 44071 1727204651.29682: variable 'ansible_distribution_major_version' from source: facts 44071 1727204651.29713: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204651.29860: variable 'network_provider' from source: set_fact 44071 1727204651.29875: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204651.29883: when evaluation is False, skipping this task 44071 1727204651.29891: _execute() done 44071 1727204651.29900: dumping result to json 44071 1727204651.29909: done dumping result, returning 44071 1727204651.29935: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-c964-7471-000000001104] 44071 1727204651.29972: sending task result for task 127b8e07-fff9-c964-7471-000000001104 44071 1727204651.30217: done sending task result for task 127b8e07-fff9-c964-7471-000000001104 44071 1727204651.30221: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 44071 1727204651.30287: no more pending results, returning what we have 44071 1727204651.30292: results queue empty 44071 1727204651.30293: checking for any_errors_fatal 44071 1727204651.30302: done checking for any_errors_fatal 44071 1727204651.30303: checking for max_fail_percentage 44071 1727204651.30304: done checking for max_fail_percentage 44071 1727204651.30305: checking to see if all hosts have failed and the running result is not ok 44071 1727204651.30306: done checking to see if all hosts have failed 44071 1727204651.30307: getting the remaining hosts for this loop 44071 1727204651.30309: done getting the remaining hosts for this loop 44071 1727204651.30314: getting the next task for host managed-node2 44071 1727204651.30325: done getting next task for host managed-node2 44071 1727204651.30330: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204651.30340: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204651.30499: getting variables 44071 1727204651.30503: in VariableManager get_vars() 44071 1727204651.30548: Calling all_inventory to load vars for managed-node2 44071 1727204651.30552: Calling groups_inventory to load vars for managed-node2 44071 1727204651.30555: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204651.30611: Calling all_plugins_play to load vars for managed-node2 44071 1727204651.30615: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204651.30620: Calling groups_plugins_play to load vars for managed-node2 44071 1727204651.32579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204651.35120: done with get_vars() 44071 1727204651.35153: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:04:11 -0400 (0:00:00.073) 0:01:03.669 ***** 44071 1727204651.35259: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204651.35686: worker is 1 (out of 1 available) 44071 1727204651.35701: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204651.35719: done queuing things up, now waiting for results queue to drain 44071 1727204651.35721: waiting for pending results... 
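The copy-based "Ensure initscripts network file dependency is present" task above was skipped on the same network_provider == "initscripts" condition, and the play now queues "Configure networking connection profiles" (tasks/main.yml:159), which passes the network_connections list built from the play's interface variable to the role's network_connections module. The real connection definition is hidden by no_log, so the sketch below only illustrates the general shape of the variable the role consumes; the type and settings are placeholders, not values from this log:

network_connections:
  - name: "{{ interface }}"            # interface comes from play vars; its value is not shown in this log
    interface_name: "{{ interface }}"
    type: ethernet                     # placeholder profile settings
    state: up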
44071 1727204651.36101: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204651.36278: in run() - task 127b8e07-fff9-c964-7471-000000001105 44071 1727204651.36372: variable 'ansible_search_path' from source: unknown 44071 1727204651.36377: variable 'ansible_search_path' from source: unknown 44071 1727204651.36380: calling self._execute() 44071 1727204651.36491: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204651.36505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204651.36529: variable 'omit' from source: magic vars 44071 1727204651.36991: variable 'ansible_distribution_major_version' from source: facts 44071 1727204651.37015: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204651.37028: variable 'omit' from source: magic vars 44071 1727204651.37120: variable 'omit' from source: magic vars 44071 1727204651.37374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204651.48778: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204651.48874: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204651.48918: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204651.48967: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204651.49171: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204651.49174: variable 'network_provider' from source: set_fact 44071 1727204651.49235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204651.49272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204651.49310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204651.49358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204651.49380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204651.49478: variable 'omit' from source: magic vars 44071 1727204651.49623: variable 'omit' from source: magic vars 44071 1727204651.49753: variable 'network_connections' from source: include params 44071 1727204651.49772: variable 'interface' from source: play vars 44071 1727204651.49850: variable 'interface' from source: play vars 44071 1727204651.50019: variable 'omit' from source: magic vars 44071 1727204651.50036: variable '__lsr_ansible_managed' from source: task vars 44071 1727204651.50111: variable '__lsr_ansible_managed' from source: 
task vars 44071 1727204651.50323: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 44071 1727204651.50581: Loaded config def from plugin (lookup/template) 44071 1727204651.50671: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 44071 1727204651.50675: File lookup term: get_ansible_managed.j2 44071 1727204651.50680: variable 'ansible_search_path' from source: unknown 44071 1727204651.50684: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 44071 1727204651.50688: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 44071 1727204651.50692: variable 'ansible_search_path' from source: unknown 44071 1727204651.59070: variable 'ansible_managed' from source: unknown 44071 1727204651.59301: variable 'omit' from source: magic vars 44071 1727204651.59342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204651.59383: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204651.59404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204651.59473: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204651.59478: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204651.59481: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204651.59483: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204651.59486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204651.59676: Set connection var ansible_connection to ssh 44071 1727204651.59679: Set connection var ansible_timeout to 10 44071 1727204651.59681: Set connection var ansible_pipelining to False 44071 1727204651.59683: Set connection var ansible_shell_type to sh 44071 1727204651.59687: Set connection var ansible_shell_executable to /bin/sh 44071 1727204651.59689: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204651.59692: variable 'ansible_shell_executable' from source: unknown 44071 1727204651.59694: variable 'ansible_connection' from source: unknown 44071 1727204651.59711: variable 'ansible_module_compression' 
from source: unknown 44071 1727204651.59720: variable 'ansible_shell_type' from source: unknown 44071 1727204651.59728: variable 'ansible_shell_executable' from source: unknown 44071 1727204651.59738: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204651.59809: variable 'ansible_pipelining' from source: unknown 44071 1727204651.59813: variable 'ansible_timeout' from source: unknown 44071 1727204651.59817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204651.59921: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204651.60027: variable 'omit' from source: magic vars 44071 1727204651.60033: starting attempt loop 44071 1727204651.60038: running the handler 44071 1727204651.60040: _low_level_execute_command(): starting 44071 1727204651.60043: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204651.60823: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204651.60925: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204651.60950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204651.60974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204651.60997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204651.61126: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204651.62911: stdout chunk (state=3): >>>/root <<< 44071 1727204651.63139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204651.63143: stdout chunk (state=3): >>><<< 44071 1727204651.63146: stderr chunk (state=3): >>><<< 44071 1727204651.63172: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204651.63285: _low_level_execute_command(): starting 44071 1727204651.63290: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204651.6317987-47487-160158489818780 `" && echo ansible-tmp-1727204651.6317987-47487-160158489818780="` echo /root/.ansible/tmp/ansible-tmp-1727204651.6317987-47487-160158489818780 `" ) && sleep 0' 44071 1727204651.63936: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204651.63958: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204651.64086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204651.64111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204651.64221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204651.66205: stdout chunk (state=3): >>>ansible-tmp-1727204651.6317987-47487-160158489818780=/root/.ansible/tmp/ansible-tmp-1727204651.6317987-47487-160158489818780 <<< 44071 1727204651.66430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204651.66437: stdout chunk (state=3): >>><<< 44071 1727204651.66439: stderr chunk (state=3): >>><<< 44071 1727204651.66572: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204651.6317987-47487-160158489818780=/root/.ansible/tmp/ansible-tmp-1727204651.6317987-47487-160158489818780 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204651.66575: variable 'ansible_module_compression' from source: unknown 44071 1727204651.66578: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 44071 1727204651.66625: variable 'ansible_facts' from source: unknown 44071 1727204651.66751: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204651.6317987-47487-160158489818780/AnsiballZ_network_connections.py 44071 1727204651.66929: Sending initial data 44071 1727204651.66942: Sent initial data (168 bytes) 44071 1727204651.67607: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204651.67686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204651.67753: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204651.67775: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204651.67797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204651.67902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204651.69515: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 44071 1727204651.69563: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 
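The transfer being traced here follows Ansible's standard non-pipelined execution path: a per-task temp directory is created under /root/.ansible/tmp, the AnsiballZ_network_connections.py wrapper is uploaded over the multiplexed SSH connection via SFTP, made executable, run with the remote Python, and finally removed. The connection setup for these tasks sets ansible_pipelining to False (visible in the "Set connection var ansible_pipelining to False" entries elsewhere in this log), which is why the temp-dir and SFTP round trips appear at all. As an illustrative aside, not taken from this run, pipelining can be enabled per host or group with the standard connection variables, in which case supported modules are streamed over the existing SSH channel instead:

    # hypothetical host/group vars sketch, not part of the inventory used in this run
    ansible_pipelining: true        # skip the mkdir/sftp/chmod/rm cycle for modules that allow it
    ansible_ssh_pipelining: true    # plugin-specific spelling also accepted by the ssh connection plugin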
44071 1727204651.69625: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204651.69722: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpeb4mwf01 /root/.ansible/tmp/ansible-tmp-1727204651.6317987-47487-160158489818780/AnsiballZ_network_connections.py <<< 44071 1727204651.69726: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204651.6317987-47487-160158489818780/AnsiballZ_network_connections.py" <<< 44071 1727204651.69814: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpeb4mwf01" to remote "/root/.ansible/tmp/ansible-tmp-1727204651.6317987-47487-160158489818780/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204651.6317987-47487-160158489818780/AnsiballZ_network_connections.py" <<< 44071 1727204651.71087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204651.71091: stderr chunk (state=3): >>><<< 44071 1727204651.71096: stdout chunk (state=3): >>><<< 44071 1727204651.71118: done transferring module to remote 44071 1727204651.71137: _low_level_execute_command(): starting 44071 1727204651.71141: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204651.6317987-47487-160158489818780/ /root/.ansible/tmp/ansible-tmp-1727204651.6317987-47487-160158489818780/AnsiballZ_network_connections.py && sleep 0' 44071 1727204651.71644: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204651.71649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204651.71652: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204651.71708: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204651.71712: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204651.71716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204651.71792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204651.73638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204651.73705: stderr chunk (state=3): >>><<< 44071 1727204651.73709: stdout chunk (state=3): >>><<< 44071 1727204651.73723: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204651.73726: _low_level_execute_command(): starting 44071 1727204651.73732: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204651.6317987-47487-160158489818780/AnsiballZ_network_connections.py && sleep 0' 44071 1727204651.74233: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204651.74237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204651.74240: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204651.74242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204651.74308: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204651.74312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204651.74314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204651.74392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204652.06220: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, beccd2e1-72f3-4d73-aac6-77978c2859f8\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# 
Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 44071 1727204652.08714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204652.08779: stderr chunk (state=3): >>><<< 44071 1727204652.08783: stdout chunk (state=3): >>><<< 44071 1727204652.08801: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, beccd2e1-72f3-4d73-aac6-77978c2859f8\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
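The module_args echoed in this result show what the fedora.linux_system_roles.network role fed into its network_connections module for this task: provider nm and a single bridge profile named statebr with dhcp4 and auto6 disabled. A minimal sketch of a play that would drive the same invocation, assuming the role's documented network_provider / network_connections interface (play name and host pattern are illustrative; this is not the actual test playbook used here):

    - name: Create the statebr bridge profile          # illustrative play name
      hosts: managed-node2
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_provider: nm
            network_connections:
              - name: statebr
                type: bridge
                persistent_state: present
                ip:
                  dhcp4: false
                  auto6: false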
44071 1727204652.08834: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204651.6317987-47487-160158489818780/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204652.08844: _low_level_execute_command(): starting 44071 1727204652.08849: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204651.6317987-47487-160158489818780/ > /dev/null 2>&1 && sleep 0' 44071 1727204652.09506: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204652.09576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204652.11653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204652.11780: stderr chunk (state=3): >>><<< 44071 1727204652.11784: stdout chunk (state=3): >>><<< 44071 1727204652.11809: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204652.11816: handler run complete 44071 1727204652.11891: attempt loop complete, returning result 44071 1727204652.11895: _execute() done 44071 1727204652.11900: dumping result to json 44071 1727204652.11906: done dumping result, returning 44071 1727204652.11921: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-c964-7471-000000001105] 44071 1727204652.11947: sending task result for task 127b8e07-fff9-c964-7471-000000001105 44071 1727204652.12061: done sending task result for task 127b8e07-fff9-c964-7471-000000001105 44071 1727204652.12067: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, beccd2e1-72f3-4d73-aac6-77978c2859f8 44071 1727204652.12176: no more pending results, returning what we have 44071 1727204652.12179: results queue empty 44071 1727204652.12180: checking for any_errors_fatal 44071 1727204652.12187: done checking for any_errors_fatal 44071 1727204652.12188: checking for max_fail_percentage 44071 1727204652.12189: done checking for max_fail_percentage 44071 1727204652.12195: checking to see if all hosts have failed and the running result is not ok 44071 1727204652.12196: done checking to see if all hosts have failed 44071 1727204652.12197: getting the remaining hosts for this loop 44071 1727204652.12198: done getting the remaining hosts for this loop 44071 1727204652.12202: getting the next task for host managed-node2 44071 1727204652.12210: done getting next task for host managed-node2 44071 1727204652.12214: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204652.12218: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204652.12230: getting variables 44071 1727204652.12232: in VariableManager get_vars() 44071 1727204652.12277: Calling all_inventory to load vars for managed-node2 44071 1727204652.12280: Calling groups_inventory to load vars for managed-node2 44071 1727204652.12282: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204652.12293: Calling all_plugins_play to load vars for managed-node2 44071 1727204652.12296: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204652.12299: Calling groups_plugins_play to load vars for managed-node2 44071 1727204652.21512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204652.23129: done with get_vars() 44071 1727204652.23170: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:04:12 -0400 (0:00:00.880) 0:01:04.549 ***** 44071 1727204652.23278: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204652.23606: worker is 1 (out of 1 available) 44071 1727204652.23623: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204652.23639: done queuing things up, now waiting for results queue to drain 44071 1727204652.23642: waiting for pending results... 44071 1727204652.23865: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204652.23990: in run() - task 127b8e07-fff9-c964-7471-000000001106 44071 1727204652.24005: variable 'ansible_search_path' from source: unknown 44071 1727204652.24009: variable 'ansible_search_path' from source: unknown 44071 1727204652.24049: calling self._execute() 44071 1727204652.24141: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204652.24146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204652.24156: variable 'omit' from source: magic vars 44071 1727204652.24493: variable 'ansible_distribution_major_version' from source: facts 44071 1727204652.24503: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204652.24608: variable 'network_state' from source: role '' defaults 44071 1727204652.24619: Evaluated conditional (network_state != {}): False 44071 1727204652.24622: when evaluation is False, skipping this task 44071 1727204652.24626: _execute() done 44071 1727204652.24632: dumping result to json 44071 1727204652.24635: done dumping result, returning 44071 1727204652.24647: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-c964-7471-000000001106] 44071 1727204652.24650: sending task result for task 127b8e07-fff9-c964-7471-000000001106 44071 1727204652.24761: done sending task result for task 127b8e07-fff9-c964-7471-000000001106 44071 1727204652.24767: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204652.24824: no more pending results, returning what we have 44071 1727204652.24829: results queue empty 44071 1727204652.24830: checking for any_errors_fatal 44071 1727204652.24846: 
done checking for any_errors_fatal 44071 1727204652.24847: checking for max_fail_percentage 44071 1727204652.24849: done checking for max_fail_percentage 44071 1727204652.24850: checking to see if all hosts have failed and the running result is not ok 44071 1727204652.24851: done checking to see if all hosts have failed 44071 1727204652.24852: getting the remaining hosts for this loop 44071 1727204652.24854: done getting the remaining hosts for this loop 44071 1727204652.24858: getting the next task for host managed-node2 44071 1727204652.24875: done getting next task for host managed-node2 44071 1727204652.24880: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204652.24886: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204652.24909: getting variables 44071 1727204652.24910: in VariableManager get_vars() 44071 1727204652.24953: Calling all_inventory to load vars for managed-node2 44071 1727204652.24956: Calling groups_inventory to load vars for managed-node2 44071 1727204652.24958: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204652.24971: Calling all_plugins_play to load vars for managed-node2 44071 1727204652.24974: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204652.24984: Calling groups_plugins_play to load vars for managed-node2 44071 1727204652.26122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204652.28062: done with get_vars() 44071 1727204652.28095: done getting variables 44071 1727204652.28162: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:04:12 -0400 (0:00:00.049) 0:01:04.598 ***** 44071 1727204652.28207: entering _queue_task() for managed-node2/debug 44071 1727204652.28560: worker is 1 (out of 1 available) 44071 1727204652.28580: exiting _queue_task() for managed-node2/debug 44071 1727204652.28596: done queuing things up, now waiting for results queue to drain 44071 1727204652.28598: waiting for pending results... 44071 1727204652.28908: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204652.29102: in run() - task 127b8e07-fff9-c964-7471-000000001107 44071 1727204652.29109: variable 'ansible_search_path' from source: unknown 44071 1727204652.29112: variable 'ansible_search_path' from source: unknown 44071 1727204652.29144: calling self._execute() 44071 1727204652.29253: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204652.29258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204652.29262: variable 'omit' from source: magic vars 44071 1727204652.29621: variable 'ansible_distribution_major_version' from source: facts 44071 1727204652.29636: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204652.29640: variable 'omit' from source: magic vars 44071 1727204652.29707: variable 'omit' from source: magic vars 44071 1727204652.29737: variable 'omit' from source: magic vars 44071 1727204652.29774: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204652.29809: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204652.29826: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204652.29842: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204652.29853: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204652.29882: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204652.29886: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204652.29888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204652.29968: Set connection var ansible_connection to ssh 44071 1727204652.29975: Set connection var ansible_timeout to 10 44071 1727204652.29996: Set connection var ansible_pipelining to False 44071 1727204652.30001: Set connection var ansible_shell_type to sh 44071 1727204652.30007: Set connection var ansible_shell_executable to /bin/sh 44071 1727204652.30017: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204652.30038: variable 'ansible_shell_executable' from source: unknown 44071 1727204652.30041: variable 'ansible_connection' from source: unknown 44071 1727204652.30044: variable 'ansible_module_compression' from source: unknown 44071 1727204652.30047: variable 'ansible_shell_type' from source: unknown 44071 1727204652.30050: variable 'ansible_shell_executable' from source: unknown 44071 1727204652.30052: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204652.30056: variable 'ansible_pipelining' from source: unknown 44071 1727204652.30059: variable 'ansible_timeout' from source: unknown 44071 1727204652.30063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204652.30182: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204652.30194: variable 'omit' from source: magic vars 44071 1727204652.30197: starting attempt loop 44071 1727204652.30200: running the handler 44071 1727204652.30314: variable '__network_connections_result' from source: set_fact 44071 1727204652.30380: handler run complete 44071 1727204652.30391: attempt loop complete, returning result 44071 1727204652.30394: _execute() done 44071 1727204652.30397: dumping result to json 44071 1727204652.30399: done dumping result, returning 44071 1727204652.30409: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-c964-7471-000000001107] 44071 1727204652.30415: sending task result for task 127b8e07-fff9-c964-7471-000000001107 44071 1727204652.30519: done sending task result for task 127b8e07-fff9-c964-7471-000000001107 44071 1727204652.30522: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, beccd2e1-72f3-4d73-aac6-77978c2859f8" ] } 44071 1727204652.30606: no more pending results, returning what we have 44071 1727204652.30610: results queue empty 44071 1727204652.30611: checking for any_errors_fatal 44071 1727204652.30618: done checking for any_errors_fatal 44071 1727204652.30619: checking for max_fail_percentage 44071 1727204652.30620: done checking for max_fail_percentage 44071 1727204652.30621: checking to see if all hosts have failed and the running result is not ok 44071 1727204652.30622: done checking to see if all hosts have failed 44071 1727204652.30622: getting the remaining hosts for this loop 44071 1727204652.30624: done getting the remaining hosts for this loop 44071 1727204652.30629: 
getting the next task for host managed-node2 44071 1727204652.30642: done getting next task for host managed-node2 44071 1727204652.30647: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204652.30652: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204652.30670: getting variables 44071 1727204652.30672: in VariableManager get_vars() 44071 1727204652.30714: Calling all_inventory to load vars for managed-node2 44071 1727204652.30717: Calling groups_inventory to load vars for managed-node2 44071 1727204652.30719: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204652.30729: Calling all_plugins_play to load vars for managed-node2 44071 1727204652.30734: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204652.30737: Calling groups_plugins_play to load vars for managed-node2 44071 1727204652.32037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204652.33576: done with get_vars() 44071 1727204652.33598: done getting variables 44071 1727204652.33649: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:04:12 -0400 (0:00:00.054) 0:01:04.653 ***** 44071 1727204652.33687: entering _queue_task() for managed-node2/debug 44071 1727204652.33997: worker is 1 (out of 1 available) 44071 1727204652.34013: exiting _queue_task() for managed-node2/debug 44071 1727204652.34027: done queuing things up, now waiting for results queue to drain 44071 1727204652.34029: waiting for pending results... 
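The task just reported ("Show stderr messages for the network_connections", main.yml:177) and the one being queued here (main.yml:181) simply print parts of the __network_connections_result fact registered by the module run above. Judging from the output they produce, they behave like plain debug tasks of roughly this shape (a sketch inferred from the log, not copied from the role source):

    - name: Show stderr messages for the network_connections
      debug:
        var: __network_connections_result.stderr_lines

    - name: Show debug messages for the network_connections
      debug:
        var: __network_connections_result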
44071 1727204652.34304: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204652.34417: in run() - task 127b8e07-fff9-c964-7471-000000001108 44071 1727204652.34448: variable 'ansible_search_path' from source: unknown 44071 1727204652.34456: variable 'ansible_search_path' from source: unknown 44071 1727204652.34525: calling self._execute() 44071 1727204652.34588: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204652.34594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204652.34605: variable 'omit' from source: magic vars 44071 1727204652.35015: variable 'ansible_distribution_major_version' from source: facts 44071 1727204652.35020: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204652.35042: variable 'omit' from source: magic vars 44071 1727204652.35095: variable 'omit' from source: magic vars 44071 1727204652.35127: variable 'omit' from source: magic vars 44071 1727204652.35180: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204652.35217: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204652.35235: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204652.35251: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204652.35263: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204652.35290: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204652.35293: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204652.35298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204652.35430: Set connection var ansible_connection to ssh 44071 1727204652.35436: Set connection var ansible_timeout to 10 44071 1727204652.35439: Set connection var ansible_pipelining to False 44071 1727204652.35441: Set connection var ansible_shell_type to sh 44071 1727204652.35443: Set connection var ansible_shell_executable to /bin/sh 44071 1727204652.35446: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204652.35472: variable 'ansible_shell_executable' from source: unknown 44071 1727204652.35475: variable 'ansible_connection' from source: unknown 44071 1727204652.35483: variable 'ansible_module_compression' from source: unknown 44071 1727204652.35486: variable 'ansible_shell_type' from source: unknown 44071 1727204652.35489: variable 'ansible_shell_executable' from source: unknown 44071 1727204652.35491: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204652.35493: variable 'ansible_pipelining' from source: unknown 44071 1727204652.35495: variable 'ansible_timeout' from source: unknown 44071 1727204652.35498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204652.35668: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 
1727204652.35672: variable 'omit' from source: magic vars 44071 1727204652.35675: starting attempt loop 44071 1727204652.35678: running the handler 44071 1727204652.35739: variable '__network_connections_result' from source: set_fact 44071 1727204652.35826: variable '__network_connections_result' from source: set_fact 44071 1727204652.35958: handler run complete 44071 1727204652.35983: attempt loop complete, returning result 44071 1727204652.35987: _execute() done 44071 1727204652.35990: dumping result to json 44071 1727204652.35993: done dumping result, returning 44071 1727204652.36001: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-c964-7471-000000001108] 44071 1727204652.36006: sending task result for task 127b8e07-fff9-c964-7471-000000001108 44071 1727204652.36194: done sending task result for task 127b8e07-fff9-c964-7471-000000001108 44071 1727204652.36198: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, beccd2e1-72f3-4d73-aac6-77978c2859f8\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, beccd2e1-72f3-4d73-aac6-77978c2859f8" ] } } 44071 1727204652.36316: no more pending results, returning what we have 44071 1727204652.36320: results queue empty 44071 1727204652.36320: checking for any_errors_fatal 44071 1727204652.36327: done checking for any_errors_fatal 44071 1727204652.36328: checking for max_fail_percentage 44071 1727204652.36329: done checking for max_fail_percentage 44071 1727204652.36330: checking to see if all hosts have failed and the running result is not ok 44071 1727204652.36333: done checking to see if all hosts have failed 44071 1727204652.36334: getting the remaining hosts for this loop 44071 1727204652.36335: done getting the remaining hosts for this loop 44071 1727204652.36338: getting the next task for host managed-node2 44071 1727204652.36346: done getting next task for host managed-node2 44071 1727204652.36350: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204652.36354: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204652.36371: getting variables 44071 1727204652.36373: in VariableManager get_vars() 44071 1727204652.36444: Calling all_inventory to load vars for managed-node2 44071 1727204652.36446: Calling groups_inventory to load vars for managed-node2 44071 1727204652.36448: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204652.36460: Calling all_plugins_play to load vars for managed-node2 44071 1727204652.36464: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204652.36470: Calling groups_plugins_play to load vars for managed-node2 44071 1727204652.37790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204652.39592: done with get_vars() 44071 1727204652.39624: done getting variables 44071 1727204652.39676: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:04:12 -0400 (0:00:00.060) 0:01:04.713 ***** 44071 1727204652.39708: entering _queue_task() for managed-node2/debug 44071 1727204652.40012: worker is 1 (out of 1 available) 44071 1727204652.40027: exiting _queue_task() for managed-node2/debug 44071 1727204652.40042: done queuing things up, now waiting for results queue to drain 44071 1727204652.40044: waiting for pending results... 
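The "Configure networking state" task earlier in this block (main.yml:171) was skipped because its conditional network_state != {} evaluated to False: network_state comes from the role defaults and is left empty in this run. For illustration only, a caller that wanted the state-based path would pass a non-empty network_state when applying the role; the keys under it below are nmstate-style placeholders, not values taken from this run:

    - hosts: managed-node2
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_state:
              interfaces: []          # placeholder: any non-empty mapping makes the guard evaluate True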
44071 1727204652.40274: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204652.40401: in run() - task 127b8e07-fff9-c964-7471-000000001109 44071 1727204652.40416: variable 'ansible_search_path' from source: unknown 44071 1727204652.40419: variable 'ansible_search_path' from source: unknown 44071 1727204652.40454: calling self._execute() 44071 1727204652.40576: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204652.40597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204652.40601: variable 'omit' from source: magic vars 44071 1727204652.41074: variable 'ansible_distribution_major_version' from source: facts 44071 1727204652.41101: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204652.41218: variable 'network_state' from source: role '' defaults 44071 1727204652.41228: Evaluated conditional (network_state != {}): False 44071 1727204652.41231: when evaluation is False, skipping this task 44071 1727204652.41237: _execute() done 44071 1727204652.41240: dumping result to json 44071 1727204652.41244: done dumping result, returning 44071 1727204652.41253: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-c964-7471-000000001109] 44071 1727204652.41258: sending task result for task 127b8e07-fff9-c964-7471-000000001109 44071 1727204652.41394: done sending task result for task 127b8e07-fff9-c964-7471-000000001109 44071 1727204652.41398: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 44071 1727204652.41455: no more pending results, returning what we have 44071 1727204652.41459: results queue empty 44071 1727204652.41460: checking for any_errors_fatal 44071 1727204652.41476: done checking for any_errors_fatal 44071 1727204652.41477: checking for max_fail_percentage 44071 1727204652.41479: done checking for max_fail_percentage 44071 1727204652.41480: checking to see if all hosts have failed and the running result is not ok 44071 1727204652.41481: done checking to see if all hosts have failed 44071 1727204652.41482: getting the remaining hosts for this loop 44071 1727204652.41483: done getting the remaining hosts for this loop 44071 1727204652.41488: getting the next task for host managed-node2 44071 1727204652.41497: done getting next task for host managed-node2 44071 1727204652.41501: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204652.41510: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204652.41541: getting variables 44071 1727204652.41543: in VariableManager get_vars() 44071 1727204652.41629: Calling all_inventory to load vars for managed-node2 44071 1727204652.41633: Calling groups_inventory to load vars for managed-node2 44071 1727204652.41635: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204652.41648: Calling all_plugins_play to load vars for managed-node2 44071 1727204652.41651: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204652.41653: Calling groups_plugins_play to load vars for managed-node2 44071 1727204652.43170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204652.44695: done with get_vars() 44071 1727204652.44722: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:04:12 -0400 (0:00:00.051) 0:01:04.764 ***** 44071 1727204652.44834: entering _queue_task() for managed-node2/ping 44071 1727204652.45163: worker is 1 (out of 1 available) 44071 1727204652.45179: exiting _queue_task() for managed-node2/ping 44071 1727204652.45195: done queuing things up, now waiting for results queue to drain 44071 1727204652.45197: waiting for pending results... 44071 1727204652.45449: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204652.45565: in run() - task 127b8e07-fff9-c964-7471-00000000110a 44071 1727204652.45581: variable 'ansible_search_path' from source: unknown 44071 1727204652.45586: variable 'ansible_search_path' from source: unknown 44071 1727204652.45621: calling self._execute() 44071 1727204652.45706: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204652.45712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204652.45720: variable 'omit' from source: magic vars 44071 1727204652.46058: variable 'ansible_distribution_major_version' from source: facts 44071 1727204652.46071: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204652.46078: variable 'omit' from source: magic vars 44071 1727204652.46124: variable 'omit' from source: magic vars 44071 1727204652.46153: variable 'omit' from source: magic vars 44071 1727204652.46192: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204652.46224: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204652.46245: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204652.46259: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204652.46275: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204652.46300: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204652.46303: variable 'ansible_host' from source: host vars for 
'managed-node2' 44071 1727204652.46306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204652.46388: Set connection var ansible_connection to ssh 44071 1727204652.46394: Set connection var ansible_timeout to 10 44071 1727204652.46399: Set connection var ansible_pipelining to False 44071 1727204652.46422: Set connection var ansible_shell_type to sh 44071 1727204652.46427: Set connection var ansible_shell_executable to /bin/sh 44071 1727204652.46455: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204652.46468: variable 'ansible_shell_executable' from source: unknown 44071 1727204652.46471: variable 'ansible_connection' from source: unknown 44071 1727204652.46474: variable 'ansible_module_compression' from source: unknown 44071 1727204652.46477: variable 'ansible_shell_type' from source: unknown 44071 1727204652.46479: variable 'ansible_shell_executable' from source: unknown 44071 1727204652.46481: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204652.46483: variable 'ansible_pipelining' from source: unknown 44071 1727204652.46487: variable 'ansible_timeout' from source: unknown 44071 1727204652.46515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204652.46712: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204652.46717: variable 'omit' from source: magic vars 44071 1727204652.46728: starting attempt loop 44071 1727204652.46731: running the handler 44071 1727204652.46743: _low_level_execute_command(): starting 44071 1727204652.46750: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204652.47436: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204652.47443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204652.47460: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204652.47486: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204652.47504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204652.47585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204652.49367: stdout chunk (state=3): >>>/root <<< 44071 1727204652.49486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204652.49587: stderr chunk (state=3): 
>>><<< 44071 1727204652.49591: stdout chunk (state=3): >>><<< 44071 1727204652.49647: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204652.49650: _low_level_execute_command(): starting 44071 1727204652.49672: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204652.496245-47537-186839842665731 `" && echo ansible-tmp-1727204652.496245-47537-186839842665731="` echo /root/.ansible/tmp/ansible-tmp-1727204652.496245-47537-186839842665731 `" ) && sleep 0' 44071 1727204652.50354: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204652.50360: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204652.50388: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204652.50474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204652.52522: stdout chunk (state=3): >>>ansible-tmp-1727204652.496245-47537-186839842665731=/root/.ansible/tmp/ansible-tmp-1727204652.496245-47537-186839842665731 <<< 44071 1727204652.52643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204652.52707: stderr chunk (state=3): >>><<< 44071 1727204652.52711: stdout chunk (state=3): >>><<< 44071 1727204652.52727: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204652.496245-47537-186839842665731=/root/.ansible/tmp/ansible-tmp-1727204652.496245-47537-186839842665731 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204652.52775: variable 'ansible_module_compression' from source: unknown 44071 1727204652.52811: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 44071 1727204652.52847: variable 'ansible_facts' from source: unknown 44071 1727204652.52902: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204652.496245-47537-186839842665731/AnsiballZ_ping.py 44071 1727204652.53016: Sending initial data 44071 1727204652.53020: Sent initial data (152 bytes) 44071 1727204652.53527: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204652.53531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204652.53534: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204652.53536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204652.53539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204652.53592: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204652.53596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204652.53600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204652.53676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204652.55302: stderr chunk (state=3): >>>debug2: Remote version: 3 
debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204652.55364: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204652.55436: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpbsfvpfno /root/.ansible/tmp/ansible-tmp-1727204652.496245-47537-186839842665731/AnsiballZ_ping.py <<< 44071 1727204652.55443: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204652.496245-47537-186839842665731/AnsiballZ_ping.py" <<< 44071 1727204652.55506: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpbsfvpfno" to remote "/root/.ansible/tmp/ansible-tmp-1727204652.496245-47537-186839842665731/AnsiballZ_ping.py" <<< 44071 1727204652.55509: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204652.496245-47537-186839842665731/AnsiballZ_ping.py" <<< 44071 1727204652.56171: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204652.56248: stderr chunk (state=3): >>><<< 44071 1727204652.56252: stdout chunk (state=3): >>><<< 44071 1727204652.56278: done transferring module to remote 44071 1727204652.56289: _low_level_execute_command(): starting 44071 1727204652.56298: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204652.496245-47537-186839842665731/ /root/.ansible/tmp/ansible-tmp-1727204652.496245-47537-186839842665731/AnsiballZ_ping.py && sleep 0' 44071 1727204652.56763: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204652.56794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204652.56798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204652.56801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204652.56859: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 
setting O_NONBLOCK <<< 44071 1727204652.56862: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204652.56935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204652.58817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204652.58873: stderr chunk (state=3): >>><<< 44071 1727204652.58877: stdout chunk (state=3): >>><<< 44071 1727204652.58891: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204652.58896: _low_level_execute_command(): starting 44071 1727204652.58903: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204652.496245-47537-186839842665731/AnsiballZ_ping.py && sleep 0' 44071 1727204652.59407: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204652.59411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204652.59414: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204652.59416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204652.59472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204652.59481: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204652.59556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204652.76072: stdout chunk (state=3): >>> {"ping": "pong", "invocation": 
{"module_args": {"data": "pong"}}} <<< 44071 1727204652.77603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204652.77609: stdout chunk (state=3): >>><<< 44071 1727204652.77611: stderr chunk (state=3): >>><<< 44071 1727204652.77769: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204652.77775: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204652.496245-47537-186839842665731/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204652.77778: _low_level_execute_command(): starting 44071 1727204652.77781: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204652.496245-47537-186839842665731/ > /dev/null 2>&1 && sleep 0' 44071 1727204652.78444: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204652.78562: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204652.78589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204652.78613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204652.78634: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204652.78660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204652.78781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204652.80864: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204652.80890: stdout chunk (state=3): >>><<< 44071 1727204652.80910: stderr chunk (state=3): >>><<< 44071 1727204652.81072: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204652.81082: handler run complete 44071 1727204652.81085: attempt loop complete, returning result 44071 1727204652.81088: _execute() done 44071 1727204652.81090: dumping result to json 44071 1727204652.81093: done dumping result, returning 44071 1727204652.81095: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-c964-7471-00000000110a] 44071 1727204652.81098: sending task result for task 127b8e07-fff9-c964-7471-00000000110a 44071 1727204652.81177: done sending task result for task 127b8e07-fff9-c964-7471-00000000110a 44071 1727204652.81181: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 44071 1727204652.81267: no more pending results, returning what we have 44071 1727204652.81272: results queue empty 44071 1727204652.81273: checking for any_errors_fatal 44071 1727204652.81283: done checking for any_errors_fatal 44071 1727204652.81284: checking for max_fail_percentage 44071 1727204652.81285: done checking for max_fail_percentage 44071 1727204652.81287: checking to see if all hosts have failed and the running result is not ok 44071 1727204652.81287: done checking to see if all hosts have failed 44071 1727204652.81288: getting the remaining hosts for this loop 44071 1727204652.81297: done getting the remaining hosts for this loop 44071 1727204652.81303: getting the next task for host managed-node2 44071 1727204652.81315: done getting next task for host managed-node2 44071 
1727204652.81318: ^ task is: TASK: meta (role_complete) 44071 1727204652.81326: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204652.81344: getting variables 44071 1727204652.81346: in VariableManager get_vars() 44071 1727204652.81548: Calling all_inventory to load vars for managed-node2 44071 1727204652.81551: Calling groups_inventory to load vars for managed-node2 44071 1727204652.81554: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204652.81569: Calling all_plugins_play to load vars for managed-node2 44071 1727204652.81573: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204652.81577: Calling groups_plugins_play to load vars for managed-node2 44071 1727204652.83716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204652.86356: done with get_vars() 44071 1727204652.86400: done getting variables 44071 1727204652.86513: done queuing things up, now waiting for results queue to drain 44071 1727204652.86516: results queue empty 44071 1727204652.86517: checking for any_errors_fatal 44071 1727204652.86520: done checking for any_errors_fatal 44071 1727204652.86521: checking for max_fail_percentage 44071 1727204652.86522: done checking for max_fail_percentage 44071 1727204652.86523: checking to see if all hosts have failed and the running result is not ok 44071 1727204652.86524: done checking to see if all hosts have failed 44071 1727204652.86525: getting the remaining hosts for this loop 44071 1727204652.86526: done getting the remaining hosts for this loop 44071 1727204652.86529: getting the next task for host managed-node2 44071 1727204652.86536: done getting next task for host managed-node2 44071 1727204652.86539: ^ task is: TASK: Show result 44071 1727204652.86542: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204652.86545: getting variables 44071 1727204652.86546: in VariableManager get_vars() 44071 1727204652.86558: Calling all_inventory to load vars for managed-node2 44071 1727204652.86561: Calling groups_inventory to load vars for managed-node2 44071 1727204652.86564: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204652.86573: Calling all_plugins_play to load vars for managed-node2 44071 1727204652.86575: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204652.86579: Calling groups_plugins_play to load vars for managed-node2 44071 1727204652.88290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204652.90723: done with get_vars() 44071 1727204652.90762: done getting variables 44071 1727204652.90821: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Tuesday 24 September 2024 15:04:12 -0400 (0:00:00.460) 0:01:05.225 ***** 44071 1727204652.90855: entering _queue_task() for managed-node2/debug 44071 1727204652.91352: worker is 1 (out of 1 available) 44071 1727204652.91367: exiting _queue_task() for managed-node2/debug 44071 1727204652.91381: done queuing things up, now waiting for results queue to drain 44071 1727204652.91383: waiting for pending results... 
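Editor's note: the executor trace that follows runs the "Show result" task queued above (create_bridge_profile.yml:14). For orientation, a minimal hedged sketch of that task; only the debug action module and the variable name are confirmed by the result printed below, and the rest of the file's contents are not shown in this log:

    # Hedged sketch; action module and variable name taken from the trace,
    # surrounding file contents are an assumption.
    - name: Show result
      debug:
        var: __network_connections_result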
44071 1727204652.91885: running TaskExecutor() for managed-node2/TASK: Show result 44071 1727204652.91892: in run() - task 127b8e07-fff9-c964-7471-000000001090 44071 1727204652.91897: variable 'ansible_search_path' from source: unknown 44071 1727204652.91902: variable 'ansible_search_path' from source: unknown 44071 1727204652.91905: calling self._execute() 44071 1727204652.92016: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204652.92029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204652.92042: variable 'omit' from source: magic vars 44071 1727204652.92524: variable 'ansible_distribution_major_version' from source: facts 44071 1727204652.92549: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204652.92556: variable 'omit' from source: magic vars 44071 1727204652.92621: variable 'omit' from source: magic vars 44071 1727204652.92675: variable 'omit' from source: magic vars 44071 1727204652.92771: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204652.92774: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204652.92796: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204652.92815: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204652.92830: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204652.92871: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204652.92877: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204652.92880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204652.93194: Set connection var ansible_connection to ssh 44071 1727204652.93198: Set connection var ansible_timeout to 10 44071 1727204652.93201: Set connection var ansible_pipelining to False 44071 1727204652.93203: Set connection var ansible_shell_type to sh 44071 1727204652.93206: Set connection var ansible_shell_executable to /bin/sh 44071 1727204652.93208: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204652.93210: variable 'ansible_shell_executable' from source: unknown 44071 1727204652.93212: variable 'ansible_connection' from source: unknown 44071 1727204652.93216: variable 'ansible_module_compression' from source: unknown 44071 1727204652.93218: variable 'ansible_shell_type' from source: unknown 44071 1727204652.93221: variable 'ansible_shell_executable' from source: unknown 44071 1727204652.93224: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204652.93227: variable 'ansible_pipelining' from source: unknown 44071 1727204652.93229: variable 'ansible_timeout' from source: unknown 44071 1727204652.93232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204652.93363: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204652.93377: variable 'omit' from source: magic vars 44071 1727204652.93382: 
starting attempt loop 44071 1727204652.93385: running the handler 44071 1727204652.93457: variable '__network_connections_result' from source: set_fact 44071 1727204652.93561: variable '__network_connections_result' from source: set_fact 44071 1727204652.94096: handler run complete 44071 1727204652.94100: attempt loop complete, returning result 44071 1727204652.94102: _execute() done 44071 1727204652.94105: dumping result to json 44071 1727204652.94107: done dumping result, returning 44071 1727204652.94110: done running TaskExecutor() for managed-node2/TASK: Show result [127b8e07-fff9-c964-7471-000000001090] 44071 1727204652.94112: sending task result for task 127b8e07-fff9-c964-7471-000000001090 44071 1727204652.94215: done sending task result for task 127b8e07-fff9-c964-7471-000000001090 44071 1727204652.94217: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, beccd2e1-72f3-4d73-aac6-77978c2859f8\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, beccd2e1-72f3-4d73-aac6-77978c2859f8" ] } } 44071 1727204652.94516: no more pending results, returning what we have 44071 1727204652.94520: results queue empty 44071 1727204652.94521: checking for any_errors_fatal 44071 1727204652.94523: done checking for any_errors_fatal 44071 1727204652.94524: checking for max_fail_percentage 44071 1727204652.94525: done checking for max_fail_percentage 44071 1727204652.94526: checking to see if all hosts have failed and the running result is not ok 44071 1727204652.94527: done checking to see if all hosts have failed 44071 1727204652.94528: getting the remaining hosts for this loop 44071 1727204652.94530: done getting the remaining hosts for this loop 44071 1727204652.94541: getting the next task for host managed-node2 44071 1727204652.94559: done getting next task for host managed-node2 44071 1727204652.94568: ^ task is: TASK: Include network role 44071 1727204652.94572: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204652.94579: getting variables 44071 1727204652.94584: in VariableManager get_vars() 44071 1727204652.94623: Calling all_inventory to load vars for managed-node2 44071 1727204652.94630: Calling groups_inventory to load vars for managed-node2 44071 1727204652.94640: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204652.94661: Calling all_plugins_play to load vars for managed-node2 44071 1727204652.94820: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204652.94827: Calling groups_plugins_play to load vars for managed-node2 44071 1727204652.98075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204653.02392: done with get_vars() 44071 1727204653.02459: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Tuesday 24 September 2024 15:04:13 -0400 (0:00:00.117) 0:01:05.342 ***** 44071 1727204653.02587: entering _queue_task() for managed-node2/include_role 44071 1727204653.03199: worker is 1 (out of 1 available) 44071 1727204653.03212: exiting _queue_task() for managed-node2/include_role 44071 1727204653.03227: done queuing things up, now waiting for results queue to drain 44071 1727204653.03229: waiting for pending results... 44071 1727204653.03428: running TaskExecutor() for managed-node2/TASK: Include network role 44071 1727204653.03799: in run() - task 127b8e07-fff9-c964-7471-000000001094 44071 1727204653.03876: variable 'ansible_search_path' from source: unknown 44071 1727204653.03879: variable 'ansible_search_path' from source: unknown 44071 1727204653.03883: calling self._execute() 44071 1727204653.04270: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204653.04278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204653.04289: variable 'omit' from source: magic vars 44071 1727204653.05676: variable 'ansible_distribution_major_version' from source: facts 44071 1727204653.05681: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204653.05683: _execute() done 44071 1727204653.05686: dumping result to json 44071 1727204653.05688: done dumping result, returning 44071 1727204653.05692: done running TaskExecutor() for managed-node2/TASK: Include network role [127b8e07-fff9-c964-7471-000000001094] 44071 1727204653.05694: sending task result for task 127b8e07-fff9-c964-7471-000000001094 44071 1727204653.06312: no more pending results, returning what we have 44071 1727204653.06319: in VariableManager get_vars() 44071 1727204653.06372: Calling all_inventory to load vars for managed-node2 44071 1727204653.06376: Calling groups_inventory to load vars for managed-node2 44071 1727204653.06380: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204653.06397: Calling all_plugins_play to load vars for managed-node2 44071 1727204653.06401: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204653.06403: Calling groups_plugins_play to load vars for managed-node2 44071 1727204653.07477: done sending task result for task 127b8e07-fff9-c964-7471-000000001094 44071 1727204653.07483: WORKER PROCESS EXITING 44071 1727204653.11371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204653.15944: 
done with get_vars() 44071 1727204653.16186: variable 'ansible_search_path' from source: unknown 44071 1727204653.16188: variable 'ansible_search_path' from source: unknown 44071 1727204653.16345: variable 'omit' from source: magic vars 44071 1727204653.16597: variable 'omit' from source: magic vars 44071 1727204653.16616: variable 'omit' from source: magic vars 44071 1727204653.16620: we have included files to process 44071 1727204653.16621: generating all_blocks data 44071 1727204653.16623: done generating all_blocks data 44071 1727204653.16629: processing included file: fedora.linux_system_roles.network 44071 1727204653.16655: in VariableManager get_vars() 44071 1727204653.16880: done with get_vars() 44071 1727204653.16915: in VariableManager get_vars() 44071 1727204653.16936: done with get_vars() 44071 1727204653.16981: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 44071 1727204653.17098: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 44071 1727204653.17380: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 44071 1727204653.18342: in VariableManager get_vars() 44071 1727204653.18577: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204653.24012: iterating over new_blocks loaded from include file 44071 1727204653.24017: in VariableManager get_vars() 44071 1727204653.24041: done with get_vars() 44071 1727204653.24043: filtering new block on tags 44071 1727204653.25197: done filtering new block on tags 44071 1727204653.25203: in VariableManager get_vars() 44071 1727204653.25224: done with get_vars() 44071 1727204653.25226: filtering new block on tags 44071 1727204653.25247: done filtering new block on tags 44071 1727204653.25249: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 44071 1727204653.25256: extending task lists for all hosts with included blocks 44071 1727204653.25799: done extending task lists 44071 1727204653.25801: done processing included files 44071 1727204653.25802: results queue empty 44071 1727204653.25803: checking for any_errors_fatal 44071 1727204653.25809: done checking for any_errors_fatal 44071 1727204653.25810: checking for max_fail_percentage 44071 1727204653.25811: done checking for max_fail_percentage 44071 1727204653.25812: checking to see if all hosts have failed and the running result is not ok 44071 1727204653.25813: done checking to see if all hosts have failed 44071 1727204653.25814: getting the remaining hosts for this loop 44071 1727204653.25815: done getting the remaining hosts for this loop 44071 1727204653.25818: getting the next task for host managed-node2 44071 1727204653.25823: done getting next task for host managed-node2 44071 1727204653.25827: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204653.25831: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204653.25845: getting variables 44071 1727204653.25846: in VariableManager get_vars() 44071 1727204653.25864: Calling all_inventory to load vars for managed-node2 44071 1727204653.25868: Calling groups_inventory to load vars for managed-node2 44071 1727204653.25870: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204653.25876: Calling all_plugins_play to load vars for managed-node2 44071 1727204653.25878: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204653.25882: Calling groups_plugins_play to load vars for managed-node2 44071 1727204653.30624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204653.37860: done with get_vars() 44071 1727204653.38108: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:04:13 -0400 (0:00:00.356) 0:01:05.698 ***** 44071 1727204653.38205: entering _queue_task() for managed-node2/include_tasks 44071 1727204653.39179: worker is 1 (out of 1 available) 44071 1727204653.39195: exiting _queue_task() for managed-node2/include_tasks 44071 1727204653.39211: done queuing things up, now waiting for results queue to drain 44071 1727204653.39214: waiting for pending results... 
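Editor's note: the entries that follow resolve the include queued above (roles/network/tasks/main.yml:4) and load tasks/set_facts.yml. A minimal sketch, assuming the task is a plain include_tasks of that file; the task name and the included file name come from the trace, anything further is an assumption:

    # Hedged sketch of main.yml:4 as implied by the trace below.
    - name: Ensure ansible_facts used by role
      include_tasks: set_facts.yml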
44071 1727204653.39847: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204653.40194: in run() - task 127b8e07-fff9-c964-7471-00000000127a 44071 1727204653.40211: variable 'ansible_search_path' from source: unknown 44071 1727204653.40215: variable 'ansible_search_path' from source: unknown 44071 1727204653.40380: calling self._execute() 44071 1727204653.40623: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204653.40628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204653.40643: variable 'omit' from source: magic vars 44071 1727204653.42002: variable 'ansible_distribution_major_version' from source: facts 44071 1727204653.42007: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204653.42010: _execute() done 44071 1727204653.42014: dumping result to json 44071 1727204653.42017: done dumping result, returning 44071 1727204653.42020: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-c964-7471-00000000127a] 44071 1727204653.42023: sending task result for task 127b8e07-fff9-c964-7471-00000000127a 44071 1727204653.42116: done sending task result for task 127b8e07-fff9-c964-7471-00000000127a 44071 1727204653.42120: WORKER PROCESS EXITING 44071 1727204653.42183: no more pending results, returning what we have 44071 1727204653.42189: in VariableManager get_vars() 44071 1727204653.42242: Calling all_inventory to load vars for managed-node2 44071 1727204653.42246: Calling groups_inventory to load vars for managed-node2 44071 1727204653.42249: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204653.42265: Calling all_plugins_play to load vars for managed-node2 44071 1727204653.42269: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204653.42272: Calling groups_plugins_play to load vars for managed-node2 44071 1727204653.46495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204653.49353: done with get_vars() 44071 1727204653.49425: variable 'ansible_search_path' from source: unknown 44071 1727204653.49427: variable 'ansible_search_path' from source: unknown 44071 1727204653.49476: we have included files to process 44071 1727204653.49477: generating all_blocks data 44071 1727204653.49479: done generating all_blocks data 44071 1727204653.49483: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204653.49484: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204653.49487: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204653.50220: done processing included file 44071 1727204653.50223: iterating over new_blocks loaded from include file 44071 1727204653.50225: in VariableManager get_vars() 44071 1727204653.50256: done with get_vars() 44071 1727204653.50258: filtering new block on tags 44071 1727204653.50305: done filtering new block on tags 44071 1727204653.50309: in VariableManager get_vars() 44071 1727204653.50392: done with get_vars() 44071 1727204653.50395: filtering new block on tags 44071 1727204653.50446: done filtering new block on tags 44071 1727204653.50450: in 
VariableManager get_vars() 44071 1727204653.50519: done with get_vars() 44071 1727204653.50521: filtering new block on tags 44071 1727204653.50584: done filtering new block on tags 44071 1727204653.50587: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 44071 1727204653.50628: extending task lists for all hosts with included blocks 44071 1727204653.55642: done extending task lists 44071 1727204653.55644: done processing included files 44071 1727204653.55645: results queue empty 44071 1727204653.55646: checking for any_errors_fatal 44071 1727204653.55707: done checking for any_errors_fatal 44071 1727204653.55708: checking for max_fail_percentage 44071 1727204653.55710: done checking for max_fail_percentage 44071 1727204653.55711: checking to see if all hosts have failed and the running result is not ok 44071 1727204653.55712: done checking to see if all hosts have failed 44071 1727204653.55713: getting the remaining hosts for this loop 44071 1727204653.55714: done getting the remaining hosts for this loop 44071 1727204653.55717: getting the next task for host managed-node2 44071 1727204653.55724: done getting next task for host managed-node2 44071 1727204653.55727: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204653.55735: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204653.55750: getting variables 44071 1727204653.55752: in VariableManager get_vars() 44071 1727204653.55836: Calling all_inventory to load vars for managed-node2 44071 1727204653.55839: Calling groups_inventory to load vars for managed-node2 44071 1727204653.55842: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204653.55849: Calling all_plugins_play to load vars for managed-node2 44071 1727204653.55851: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204653.55854: Calling groups_plugins_play to load vars for managed-node2 44071 1727204653.59746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204653.65322: done with get_vars() 44071 1727204653.65431: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:04:13 -0400 (0:00:00.274) 0:01:05.972 ***** 44071 1727204653.65626: entering _queue_task() for managed-node2/setup 44071 1727204653.66224: worker is 1 (out of 1 available) 44071 1727204653.66238: exiting _queue_task() for managed-node2/setup 44071 1727204653.66251: done queuing things up, now waiting for results queue to drain 44071 1727204653.66253: waiting for pending results... 44071 1727204653.66898: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204653.66905: in run() - task 127b8e07-fff9-c964-7471-0000000012d1 44071 1727204653.66909: variable 'ansible_search_path' from source: unknown 44071 1727204653.66915: variable 'ansible_search_path' from source: unknown 44071 1727204653.66983: calling self._execute() 44071 1727204653.67201: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204653.67221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204653.67237: variable 'omit' from source: magic vars 44071 1727204653.67808: variable 'ansible_distribution_major_version' from source: facts 44071 1727204653.67885: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204653.68153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204653.70760: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204653.70856: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204653.70972: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204653.70976: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204653.70987: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204653.71084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204653.71127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 44071 1727204653.71242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204653.71324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204653.71461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204653.71467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204653.71524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204653.71597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204653.71706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204653.71748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204653.72186: variable '__network_required_facts' from source: role '' defaults 44071 1727204653.72189: variable 'ansible_facts' from source: unknown 44071 1727204653.73605: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 44071 1727204653.73793: when evaluation is False, skipping this task 44071 1727204653.73798: _execute() done 44071 1727204653.73801: dumping result to json 44071 1727204653.73803: done dumping result, returning 44071 1727204653.73806: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-c964-7471-0000000012d1] 44071 1727204653.73808: sending task result for task 127b8e07-fff9-c964-7471-0000000012d1 44071 1727204653.74170: done sending task result for task 127b8e07-fff9-c964-7471-0000000012d1 44071 1727204653.74176: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204653.74230: no more pending results, returning what we have 44071 1727204653.74234: results queue empty 44071 1727204653.74236: checking for any_errors_fatal 44071 1727204653.74238: done checking for any_errors_fatal 44071 1727204653.74239: checking for max_fail_percentage 44071 1727204653.74241: done checking for max_fail_percentage 44071 1727204653.74242: checking to see if all hosts have failed and the running result is not ok 44071 1727204653.74243: done checking to see if all hosts have failed 44071 1727204653.74243: getting the remaining hosts for this loop 44071 1727204653.74245: done getting the remaining hosts for 
this loop 44071 1727204653.74251: getting the next task for host managed-node2 44071 1727204653.74264: done getting next task for host managed-node2 44071 1727204653.74271: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204653.74278: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204653.74307: getting variables 44071 1727204653.74309: in VariableManager get_vars() 44071 1727204653.74358: Calling all_inventory to load vars for managed-node2 44071 1727204653.74361: Calling groups_inventory to load vars for managed-node2 44071 1727204653.74364: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204653.74387: Calling all_plugins_play to load vars for managed-node2 44071 1727204653.74392: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204653.74401: Calling groups_plugins_play to load vars for managed-node2 44071 1727204653.78986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204653.83714: done with get_vars() 44071 1727204653.83761: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:04:13 -0400 (0:00:00.184) 0:01:06.157 ***** 44071 1727204653.84088: entering _queue_task() for managed-node2/stat 44071 1727204653.84904: worker is 1 (out of 1 available) 44071 1727204653.84918: exiting _queue_task() for managed-node2/stat 44071 1727204653.84932: done queuing things up, now waiting for results queue to drain 44071 1727204653.84934: waiting for pending results... 
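Editor's note: the next executor run handles the ostree check queued above (set_facts.yml:12); the trace shows it being skipped because '__network_is_ostree' is already defined. A hedged sketch of such a check follows; the when condition matches the false_condition reported in the skip result below, while the stat path and register name are assumptions:

    # Hedged sketch; path and register name are assumed, the when condition
    # mirrors the false_condition reported in the skip result below.
    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted      # assumed marker file
      register: __ostree_booted_stat  # assumed register name
      when: not __network_is_ostree is defined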
44071 1727204653.85784: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204653.86509: in run() - task 127b8e07-fff9-c964-7471-0000000012d3 44071 1727204653.86529: variable 'ansible_search_path' from source: unknown 44071 1727204653.86533: variable 'ansible_search_path' from source: unknown 44071 1727204653.86588: calling self._execute() 44071 1727204653.87067: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204653.87073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204653.87084: variable 'omit' from source: magic vars 44071 1727204653.88632: variable 'ansible_distribution_major_version' from source: facts 44071 1727204653.88651: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204653.89176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204653.89877: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204653.90156: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204653.90160: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204653.90163: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204653.90342: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204653.90511: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204653.90548: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204653.90584: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204653.90823: variable '__network_is_ostree' from source: set_fact 44071 1727204653.91063: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204653.91068: when evaluation is False, skipping this task 44071 1727204653.91070: _execute() done 44071 1727204653.91073: dumping result to json 44071 1727204653.91076: done dumping result, returning 44071 1727204653.91078: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-c964-7471-0000000012d3] 44071 1727204653.91081: sending task result for task 127b8e07-fff9-c964-7471-0000000012d3 44071 1727204653.91168: done sending task result for task 127b8e07-fff9-c964-7471-0000000012d3 skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204653.91230: no more pending results, returning what we have 44071 1727204653.91234: results queue empty 44071 1727204653.91236: checking for any_errors_fatal 44071 1727204653.91247: done checking for any_errors_fatal 44071 1727204653.91248: checking for max_fail_percentage 44071 1727204653.91250: done 
checking for max_fail_percentage 44071 1727204653.91251: checking to see if all hosts have failed and the running result is not ok 44071 1727204653.91252: done checking to see if all hosts have failed 44071 1727204653.91253: getting the remaining hosts for this loop 44071 1727204653.91255: done getting the remaining hosts for this loop 44071 1727204653.91260: getting the next task for host managed-node2 44071 1727204653.91273: done getting next task for host managed-node2 44071 1727204653.91277: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204653.91283: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204653.91314: getting variables 44071 1727204653.91316: in VariableManager get_vars() 44071 1727204653.91780: Calling all_inventory to load vars for managed-node2 44071 1727204653.91784: Calling groups_inventory to load vars for managed-node2 44071 1727204653.91787: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204653.91800: Calling all_plugins_play to load vars for managed-node2 44071 1727204653.91803: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204653.91807: Calling groups_plugins_play to load vars for managed-node2 44071 1727204653.92385: WORKER PROCESS EXITING 44071 1727204653.95638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204654.00382: done with get_vars() 44071 1727204654.00428: done getting variables 44071 1727204654.00705: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:04:14 -0400 (0:00:00.166) 0:01:06.323 ***** 44071 1727204654.00751: entering _queue_task() for managed-node2/set_fact 44071 1727204654.01445: worker is 1 (out of 1 available) 44071 1727204654.01458: exiting _queue_task() for managed-node2/set_fact 44071 1727204654.01875: done queuing things up, now waiting for results queue to drain 44071 1727204654.01878: waiting for pending results... 
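Both ostree-related tasks here are skipped on the same conditional, "not __network_is_ostree is defined", which evaluates to False once the role has already recorded that fact. The evaluation itself is ordinary Jinja2; a minimal sketch with the plain jinja2 library (Ansible runs this through its own templar, and the fact value below is a placeholder — only the fact that it is defined matters):

    from jinja2 import Environment

    env = Environment()
    cond = env.compile_expression("not __network_is_ostree is defined",
                                  undefined_to_none=False)

    # Earlier in the run the role already set __network_is_ostree via
    # set_fact, so the name is defined and the conditional is False -> skip.
    print(cond(**{"__network_is_ostree": False}))   # False, task is skipped

    # On a host where the fact has never been set, the same expression is
    # True and the stat / set_fact tasks would actually run.
    print(cond())                                   # True

This is why the skip result reports "false_condition": "not __network_is_ostree is defined" rather than the distribution check, which had already evaluated to True.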
44071 1727204654.02113: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204654.02674: in run() - task 127b8e07-fff9-c964-7471-0000000012d4 44071 1727204654.02680: variable 'ansible_search_path' from source: unknown 44071 1727204654.02683: variable 'ansible_search_path' from source: unknown 44071 1727204654.02686: calling self._execute() 44071 1727204654.03019: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204654.03023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204654.03026: variable 'omit' from source: magic vars 44071 1727204654.03974: variable 'ansible_distribution_major_version' from source: facts 44071 1727204654.03979: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204654.04249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204654.04924: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204654.05171: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204654.05176: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204654.05179: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204654.05372: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204654.05637: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204654.05641: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204654.05643: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204654.05858: variable '__network_is_ostree' from source: set_fact 44071 1727204654.05874: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204654.05883: when evaluation is False, skipping this task 44071 1727204654.05890: _execute() done 44071 1727204654.05898: dumping result to json 44071 1727204654.05906: done dumping result, returning 44071 1727204654.05918: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-c964-7471-0000000012d4] 44071 1727204654.05928: sending task result for task 127b8e07-fff9-c964-7471-0000000012d4 44071 1727204654.06246: done sending task result for task 127b8e07-fff9-c964-7471-0000000012d4 44071 1727204654.06249: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204654.06313: no more pending results, returning what we have 44071 1727204654.06318: results queue empty 44071 1727204654.06319: checking for any_errors_fatal 44071 1727204654.06326: done checking for any_errors_fatal 44071 
1727204654.06327: checking for max_fail_percentage 44071 1727204654.06330: done checking for max_fail_percentage 44071 1727204654.06331: checking to see if all hosts have failed and the running result is not ok 44071 1727204654.06332: done checking to see if all hosts have failed 44071 1727204654.06333: getting the remaining hosts for this loop 44071 1727204654.06334: done getting the remaining hosts for this loop 44071 1727204654.06340: getting the next task for host managed-node2 44071 1727204654.06355: done getting next task for host managed-node2 44071 1727204654.06360: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204654.06369: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204654.06399: getting variables 44071 1727204654.06402: in VariableManager get_vars() 44071 1727204654.06450: Calling all_inventory to load vars for managed-node2 44071 1727204654.06453: Calling groups_inventory to load vars for managed-node2 44071 1727204654.06455: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204654.06678: Calling all_plugins_play to load vars for managed-node2 44071 1727204654.06684: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204654.06688: Calling groups_plugins_play to load vars for managed-node2 44071 1727204654.12373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204654.19979: done with get_vars() 44071 1727204654.20030: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:04:14 -0400 (0:00:00.195) 0:01:06.519 ***** 44071 1727204654.20257: entering _queue_task() for managed-node2/service_facts 44071 1727204654.21745: worker is 1 (out of 1 available) 44071 1727204654.21761: exiting _queue_task() for managed-node2/service_facts 44071 1727204654.21873: done queuing things up, now waiting for results queue to drain 44071 1727204654.21876: waiting for pending results... 
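The service_facts task queued here is what produces the large ansible_facts.services dictionary further down; each entry carries "name", "state", "status" and "source". A role or a debugging one-liner can consume that structure directly — a minimal sketch using a trimmed sample of values taken from the output of this run (the full dictionary has far more entries):

    import json

    # Trimmed sample of the ansible_facts.services structure returned by
    # the service_facts module in the output below.
    services_json = """
    {
      "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"},
      "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"},
      "network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}
    }
    """
    services = json.loads(services_json)

    # The kind of check a role can make with this fact: which services are
    # currently running on the managed node?
    running = {name for name, svc in services.items() if svc["state"] == "running"}
    print("NetworkManager.service" in running)   # True for the node in this run

On this node NetworkManager.service is running and systemd-networkd.service is stopped, which is visible in the facts payload that follows.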
44071 1727204654.22677: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204654.23364: in run() - task 127b8e07-fff9-c964-7471-0000000012d6 44071 1727204654.23383: variable 'ansible_search_path' from source: unknown 44071 1727204654.23386: variable 'ansible_search_path' from source: unknown 44071 1727204654.23576: calling self._execute() 44071 1727204654.23834: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204654.23889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204654.23897: variable 'omit' from source: magic vars 44071 1727204654.24856: variable 'ansible_distribution_major_version' from source: facts 44071 1727204654.24869: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204654.24956: variable 'omit' from source: magic vars 44071 1727204654.25168: variable 'omit' from source: magic vars 44071 1727204654.25209: variable 'omit' from source: magic vars 44071 1727204654.25262: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204654.25409: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204654.25432: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204654.25456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204654.25470: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204654.25622: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204654.25625: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204654.25628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204654.26211: Set connection var ansible_connection to ssh 44071 1727204654.26243: Set connection var ansible_timeout to 10 44071 1727204654.26248: Set connection var ansible_pipelining to False 44071 1727204654.26251: Set connection var ansible_shell_type to sh 44071 1727204654.26253: Set connection var ansible_shell_executable to /bin/sh 44071 1727204654.26255: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204654.26518: variable 'ansible_shell_executable' from source: unknown 44071 1727204654.26523: variable 'ansible_connection' from source: unknown 44071 1727204654.26526: variable 'ansible_module_compression' from source: unknown 44071 1727204654.26528: variable 'ansible_shell_type' from source: unknown 44071 1727204654.26531: variable 'ansible_shell_executable' from source: unknown 44071 1727204654.26533: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204654.26535: variable 'ansible_pipelining' from source: unknown 44071 1727204654.26537: variable 'ansible_timeout' from source: unknown 44071 1727204654.26539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204654.27173: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204654.27336: variable 'omit' from source: magic vars 44071 
1727204654.27340: starting attempt loop 44071 1727204654.27343: running the handler 44071 1727204654.27392: _low_level_execute_command(): starting 44071 1727204654.27395: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204654.29141: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204654.29148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204654.29378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204654.29547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204654.29884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204654.31717: stdout chunk (state=3): >>>/root <<< 44071 1727204654.31865: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204654.32086: stderr chunk (state=3): >>><<< 44071 1727204654.32090: stdout chunk (state=3): >>><<< 44071 1727204654.32229: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204654.32248: _low_level_execute_command(): starting 44071 1727204654.32258: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204654.3222876-47645-73391530597186 `" && echo ansible-tmp-1727204654.3222876-47645-73391530597186="` echo 
/root/.ansible/tmp/ansible-tmp-1727204654.3222876-47645-73391530597186 `" ) && sleep 0' 44071 1727204654.33721: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204654.33857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204654.33863: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204654.33881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204654.33884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204654.33886: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204654.34328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204654.34405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204654.36571: stdout chunk (state=3): >>>ansible-tmp-1727204654.3222876-47645-73391530597186=/root/.ansible/tmp/ansible-tmp-1727204654.3222876-47645-73391530597186 <<< 44071 1727204654.36842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204654.37021: stderr chunk (state=3): >>><<< 44071 1727204654.37072: stdout chunk (state=3): >>><<< 44071 1727204654.37099: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204654.3222876-47645-73391530597186=/root/.ansible/tmp/ansible-tmp-1727204654.3222876-47645-73391530597186 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204654.37271: variable 'ansible_module_compression' from source: unknown 44071 1727204654.37374: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 44071 1727204654.37377: variable 'ansible_facts' from source: unknown 44071 1727204654.37562: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204654.3222876-47645-73391530597186/AnsiballZ_service_facts.py 44071 1727204654.37972: Sending initial data 44071 1727204654.37978: Sent initial data (161 bytes) 44071 1727204654.39556: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204654.39671: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204654.39778: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204654.39960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204654.41737: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204654.41990: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204654.41994: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204654.3222876-47645-73391530597186/AnsiballZ_service_facts.py" <<< 44071 1727204654.41997: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpd0if37rt /root/.ansible/tmp/ansible-tmp-1727204654.3222876-47645-73391530597186/AnsiballZ_service_facts.py <<< 44071 1727204654.42136: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpd0if37rt" to remote "/root/.ansible/tmp/ansible-tmp-1727204654.3222876-47645-73391530597186/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204654.3222876-47645-73391530597186/AnsiballZ_service_facts.py" <<< 44071 1727204654.43807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204654.43906: stderr chunk (state=3): >>><<< 44071 1727204654.44173: stdout chunk (state=3): >>><<< 44071 1727204654.44177: done transferring module to remote 44071 1727204654.44180: _low_level_execute_command(): starting 44071 1727204654.44182: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204654.3222876-47645-73391530597186/ /root/.ansible/tmp/ansible-tmp-1727204654.3222876-47645-73391530597186/AnsiballZ_service_facts.py && sleep 0' 44071 1727204654.45736: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204654.45779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204654.45833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204654.45976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204654.47785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204654.47871: stderr chunk (state=3): >>><<< 44071 1727204654.47896: stdout chunk (state=3): >>><<< 44071 1727204654.47942: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204654.47951: _low_level_execute_command(): starting 44071 1727204654.47962: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204654.3222876-47645-73391530597186/AnsiballZ_service_facts.py && sleep 0' 44071 1727204654.49604: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204654.49829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204654.50128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204656.72101: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": 
"display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": 
"modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped<<< 44071 1727204656.72126: stdout chunk (state=3): >>>", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": 
"rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, <<< 44071 1727204656.72185: stdout chunk (state=3): >>>"systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", 
"status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 44071 1727204656.73795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204656.73861: stderr chunk (state=3): >>><<< 44071 1727204656.73867: stdout chunk (state=3): >>><<< 44071 1727204656.73891: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", 
"status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": 
"syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": 
"alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": 
"nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
44071 1727204656.74785: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204654.3222876-47645-73391530597186/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204656.74791: _low_level_execute_command(): starting 44071 1727204656.74794: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204654.3222876-47645-73391530597186/ > /dev/null 2>&1 && sleep 0' 44071 1727204656.75274: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204656.75282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204656.75293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204656.75307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204656.75321: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204656.75335: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204656.75340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204656.75449: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204656.75453: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204656.75455: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44071 1727204656.75457: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204656.75459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204656.75461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204656.75463: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204656.75468: stderr chunk (state=3): >>>debug2: match found <<< 44071 1727204656.75470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204656.75506: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204656.75510: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204656.75522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204656.75642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204656.77608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204656.77672: stderr chunk (state=3): >>><<< 44071 1727204656.77676: stdout chunk (state=3): >>><<< 44071 1727204656.77693: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204656.77699: handler run complete 44071 1727204656.77851: variable 'ansible_facts' from source: unknown 44071 1727204656.77984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204656.78338: variable 'ansible_facts' from source: unknown 44071 1727204656.78444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204656.78601: attempt loop complete, returning result 44071 1727204656.78607: _execute() done 44071 1727204656.78609: dumping result to json 44071 1727204656.78652: done dumping result, returning 44071 1727204656.78661: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-c964-7471-0000000012d6] 44071 1727204656.78668: sending task result for task 127b8e07-fff9-c964-7471-0000000012d6 44071 1727204656.85658: done sending task result for task 127b8e07-fff9-c964-7471-0000000012d6 44071 1727204656.85662: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204656.85736: no more pending results, returning what we have 44071 1727204656.85739: results queue empty 44071 1727204656.85739: checking for any_errors_fatal 44071 1727204656.85743: done checking for any_errors_fatal 44071 1727204656.85743: checking for max_fail_percentage 44071 1727204656.85744: done checking for max_fail_percentage 44071 1727204656.85745: checking to see if all hosts have failed and the running result is not ok 44071 1727204656.85745: done checking to see if all hosts have failed 44071 1727204656.85746: getting the remaining hosts for this loop 44071 1727204656.85747: done getting the remaining hosts for this loop 44071 1727204656.85749: getting the next task for host managed-node2 44071 1727204656.85755: done getting next task for host managed-node2 44071 1727204656.85757: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204656.85762: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204656.85776: getting variables 44071 1727204656.85777: in VariableManager get_vars() 44071 1727204656.85803: Calling all_inventory to load vars for managed-node2 44071 1727204656.85805: Calling groups_inventory to load vars for managed-node2 44071 1727204656.85808: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204656.85817: Calling all_plugins_play to load vars for managed-node2 44071 1727204656.85819: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204656.85820: Calling groups_plugins_play to load vars for managed-node2 44071 1727204656.91745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204656.92972: done with get_vars() 44071 1727204656.93005: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:04:16 -0400 (0:00:02.728) 0:01:09.247 ***** 44071 1727204656.93077: entering _queue_task() for managed-node2/package_facts 44071 1727204656.93390: worker is 1 (out of 1 available) 44071 1727204656.93406: exiting _queue_task() for managed-node2/package_facts 44071 1727204656.93420: done queuing things up, now waiting for results queue to drain 44071 1727204656.93422: waiting for pending results... 
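The package_facts task queued here goes through the same per-task execution protocol that the lines below record for it: discover the remote home with 'echo ~', create a throwaway ansible-tmp directory, sftp the AnsiballZ_package_facts.py payload into it, 'chmod u+x' it, run it, and finally 'rm -f -r' the directory (as already seen for service_facts above). A rough sketch of that command sequence over plain ssh, assuming direct CLI access to the same managed node; the helper and paths are illustrative, not Ansible's own code:

    import subprocess
    import time

    HOST = "10.31.47.73"  # managed node address from the log

    def ssh(cmd: str) -> str:
        # Mirrors _low_level_execute_command(): one /bin/sh command per round trip.
        proc = subprocess.run(["ssh", HOST, cmd], capture_output=True, text=True, check=True)
        return proc.stdout.strip()

    # 1. discover the remote home directory ('echo ~ && sleep 0' below)
    home = ssh("echo ~ && sleep 0")

    # 2. create a per-task temp directory (the 'umask 77 && mkdir -p ...' step below)
    tmpdir = f"{home}/.ansible/tmp/ansible-tmp-{time.time()}"
    ssh(f"umask 77 && mkdir -p '{tmpdir}'")

    # 3. a real run would now sftp AnsiballZ_package_facts.py into tmpdir,
    #    'chmod u+x' it, execute it with the remote python, and remove tmpdir.
    print("remote working directory:", tmpdir)

Each of these round trips reuses the persistent ControlMaster socket ("auto-mux: Trying existing master" in the stderr chunks), which is why the individual commands in the log complete within tens of milliseconds.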
44071 1727204656.93643: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204656.93795: in run() - task 127b8e07-fff9-c964-7471-0000000012d7 44071 1727204656.93810: variable 'ansible_search_path' from source: unknown 44071 1727204656.93814: variable 'ansible_search_path' from source: unknown 44071 1727204656.93850: calling self._execute() 44071 1727204656.93942: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204656.93948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204656.93959: variable 'omit' from source: magic vars 44071 1727204656.94297: variable 'ansible_distribution_major_version' from source: facts 44071 1727204656.94311: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204656.94317: variable 'omit' from source: magic vars 44071 1727204656.94387: variable 'omit' from source: magic vars 44071 1727204656.94421: variable 'omit' from source: magic vars 44071 1727204656.94460: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204656.94493: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204656.94512: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204656.94527: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204656.94547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204656.94574: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204656.94578: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204656.94580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204656.94663: Set connection var ansible_connection to ssh 44071 1727204656.94674: Set connection var ansible_timeout to 10 44071 1727204656.94680: Set connection var ansible_pipelining to False 44071 1727204656.94686: Set connection var ansible_shell_type to sh 44071 1727204656.94691: Set connection var ansible_shell_executable to /bin/sh 44071 1727204656.94698: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204656.94718: variable 'ansible_shell_executable' from source: unknown 44071 1727204656.94721: variable 'ansible_connection' from source: unknown 44071 1727204656.94724: variable 'ansible_module_compression' from source: unknown 44071 1727204656.94727: variable 'ansible_shell_type' from source: unknown 44071 1727204656.94730: variable 'ansible_shell_executable' from source: unknown 44071 1727204656.94735: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204656.94738: variable 'ansible_pipelining' from source: unknown 44071 1727204656.94742: variable 'ansible_timeout' from source: unknown 44071 1727204656.94744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204656.94913: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204656.94921: variable 'omit' from source: magic vars 44071 
1727204656.94925: starting attempt loop 44071 1727204656.94929: running the handler 44071 1727204656.94944: _low_level_execute_command(): starting 44071 1727204656.94950: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204656.95516: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204656.95521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204656.95525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204656.95589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204656.95597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204656.95675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204656.97441: stdout chunk (state=3): >>>/root <<< 44071 1727204656.97548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204656.97617: stderr chunk (state=3): >>><<< 44071 1727204656.97620: stdout chunk (state=3): >>><<< 44071 1727204656.97646: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204656.97657: _low_level_execute_command(): starting 44071 1727204656.97664: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204656.976437-47735-95687902981337 `" && echo 
ansible-tmp-1727204656.976437-47735-95687902981337="` echo /root/.ansible/tmp/ansible-tmp-1727204656.976437-47735-95687902981337 `" ) && sleep 0' 44071 1727204656.98140: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204656.98180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204656.98184: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204656.98195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204656.98241: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204656.98244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204656.98249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204656.98319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204657.00332: stdout chunk (state=3): >>>ansible-tmp-1727204656.976437-47735-95687902981337=/root/.ansible/tmp/ansible-tmp-1727204656.976437-47735-95687902981337 <<< 44071 1727204657.00444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204657.00505: stderr chunk (state=3): >>><<< 44071 1727204657.00509: stdout chunk (state=3): >>><<< 44071 1727204657.00535: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204656.976437-47735-95687902981337=/root/.ansible/tmp/ansible-tmp-1727204656.976437-47735-95687902981337 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204657.00590: variable 'ansible_module_compression' from source: unknown 44071 
1727204657.00630: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 44071 1727204657.00692: variable 'ansible_facts' from source: unknown 44071 1727204657.00815: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204656.976437-47735-95687902981337/AnsiballZ_package_facts.py 44071 1727204657.00944: Sending initial data 44071 1727204657.00947: Sent initial data (160 bytes) 44071 1727204657.01457: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204657.01462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204657.01464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204657.01513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204657.01523: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204657.01596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204657.03227: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204657.03289: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204657.03366: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp1boni_cp /root/.ansible/tmp/ansible-tmp-1727204656.976437-47735-95687902981337/AnsiballZ_package_facts.py <<< 44071 1727204657.03370: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204656.976437-47735-95687902981337/AnsiballZ_package_facts.py" <<< 44071 1727204657.03435: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp1boni_cp" to remote "/root/.ansible/tmp/ansible-tmp-1727204656.976437-47735-95687902981337/AnsiballZ_package_facts.py" <<< 44071 1727204657.03439: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204656.976437-47735-95687902981337/AnsiballZ_package_facts.py" <<< 44071 1727204657.04667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204657.04742: stderr chunk (state=3): >>><<< 44071 1727204657.04748: stdout chunk (state=3): >>><<< 44071 1727204657.04770: done transferring module to remote 44071 1727204657.04784: _low_level_execute_command(): starting 44071 1727204657.04787: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204656.976437-47735-95687902981337/ /root/.ansible/tmp/ansible-tmp-1727204656.976437-47735-95687902981337/AnsiballZ_package_facts.py && sleep 0' 44071 1727204657.05298: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204657.05302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204657.05305: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204657.05308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204657.05372: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204657.05379: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204657.05381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204657.05442: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204657.07276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204657.07335: stderr chunk (state=3): >>><<< 44071 1727204657.07339: stdout chunk (state=3): >>><<< 44071 1727204657.07352: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204657.07357: _low_level_execute_command(): starting 44071 1727204657.07359: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204656.976437-47735-95687902981337/AnsiballZ_package_facts.py && sleep 0' 44071 1727204657.07842: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204657.07846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204657.07886: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204657.07891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204657.07894: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204657.07896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204657.07950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204657.07955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204657.07958: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204657.08039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204657.71067: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"na<<< 44071 1727204657.71093: stdout chunk (state=3): >>>me": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": 
"0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40",<<< 44071 1727204657.71159: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": 
"noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lib<<< 44071 1727204657.71179: stdout chunk (state=3): >>>xmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": 
"policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": <<< 44071 1727204657.71263: stdout chunk (state=3): >>>"x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": 
"1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": 
[{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": 
[{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50<<< 44071 1727204657.71274: stdout chunk (state=3): >>>, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", 
"version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": 
"python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "s<<< 44071 1727204657.71278: stdout chunk (state=3): >>>ource": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-t<<< 44071 1727204657.71279: stdout chunk (state=3): >>>ools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 44071 1727204657.73144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204657.73208: stderr chunk (state=3): >>><<< 44071 1727204657.73212: stdout chunk (state=3): >>><<< 44071 1727204657.73256: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": 
"libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", 
"release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": 
[{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": 
"3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", 
"version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", 
"version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": 
"1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": 
"wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": 
"python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204657.75117: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204656.976437-47735-95687902981337/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204657.75138: _low_level_execute_command(): starting 44071 1727204657.75142: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204656.976437-47735-95687902981337/ > /dev/null 2>&1 && sleep 0' 44071 1727204657.75652: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204657.75657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204657.75661: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204657.75664: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
44071 1727204657.75725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204657.75729: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204657.75733: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204657.75810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204657.77759: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204657.77829: stderr chunk (state=3): >>><<< 44071 1727204657.77835: stdout chunk (state=3): >>><<< 44071 1727204657.77875: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204657.77878: handler run complete 44071 1727204657.78527: variable 'ansible_facts' from source: unknown 44071 1727204657.78959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204657.81004: variable 'ansible_facts' from source: unknown 44071 1727204657.81551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204657.82459: attempt loop complete, returning result 44071 1727204657.82479: _execute() done 44071 1727204657.82482: dumping result to json 44071 1727204657.82646: done dumping result, returning 44071 1727204657.82655: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-c964-7471-0000000012d7] 44071 1727204657.82660: sending task result for task 127b8e07-fff9-c964-7471-0000000012d7 44071 1727204657.84662: done sending task result for task 127b8e07-fff9-c964-7471-0000000012d7 44071 1727204657.84668: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204657.84771: no more pending results, returning what we have 44071 1727204657.84774: results queue empty 44071 1727204657.84774: checking for any_errors_fatal 44071 1727204657.84779: done checking for any_errors_fatal 44071 1727204657.84780: checking for max_fail_percentage 44071 1727204657.84781: done checking for max_fail_percentage 44071 1727204657.84781: checking to see if all hosts have failed and the running result is 
not ok 44071 1727204657.84782: done checking to see if all hosts have failed 44071 1727204657.84783: getting the remaining hosts for this loop 44071 1727204657.84783: done getting the remaining hosts for this loop 44071 1727204657.84786: getting the next task for host managed-node2 44071 1727204657.84792: done getting next task for host managed-node2 44071 1727204657.84794: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204657.84799: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204657.84806: getting variables 44071 1727204657.84807: in VariableManager get_vars() 44071 1727204657.84835: Calling all_inventory to load vars for managed-node2 44071 1727204657.84837: Calling groups_inventory to load vars for managed-node2 44071 1727204657.84838: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204657.84848: Calling all_plugins_play to load vars for managed-node2 44071 1727204657.84850: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204657.84852: Calling groups_plugins_play to load vars for managed-node2 44071 1727204657.86164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204657.87673: done with get_vars() 44071 1727204657.87697: done getting variables 44071 1727204657.87758: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:04:17 -0400 (0:00:00.947) 0:01:10.194 ***** 44071 1727204657.87788: entering _queue_task() for managed-node2/debug 44071 1727204657.88089: worker is 1 (out of 1 available) 44071 1727204657.88104: exiting _queue_task() for managed-node2/debug 44071 1727204657.88118: done queuing things up, now waiting for results queue to drain 44071 1727204657.88120: waiting for pending results... 
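The task queued here is a plain debug action at roles/network/tasks/main.yml:7; judging from the variables resolved and the message emitted further down ("Using network provider: nm", with network_provider coming from set_fact), it is roughly equivalent to:

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"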
44071 1727204657.88327: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204657.88454: in run() - task 127b8e07-fff9-c964-7471-00000000127b 44071 1727204657.88472: variable 'ansible_search_path' from source: unknown 44071 1727204657.88478: variable 'ansible_search_path' from source: unknown 44071 1727204657.88508: calling self._execute() 44071 1727204657.88599: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204657.88604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204657.88613: variable 'omit' from source: magic vars 44071 1727204657.88956: variable 'ansible_distribution_major_version' from source: facts 44071 1727204657.88968: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204657.88974: variable 'omit' from source: magic vars 44071 1727204657.89029: variable 'omit' from source: magic vars 44071 1727204657.89108: variable 'network_provider' from source: set_fact 44071 1727204657.89134: variable 'omit' from source: magic vars 44071 1727204657.89166: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204657.89197: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204657.89215: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204657.89236: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204657.89247: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204657.89274: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204657.89278: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204657.89281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204657.89361: Set connection var ansible_connection to ssh 44071 1727204657.89369: Set connection var ansible_timeout to 10 44071 1727204657.89375: Set connection var ansible_pipelining to False 44071 1727204657.89380: Set connection var ansible_shell_type to sh 44071 1727204657.89386: Set connection var ansible_shell_executable to /bin/sh 44071 1727204657.89452: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204657.89458: variable 'ansible_shell_executable' from source: unknown 44071 1727204657.89462: variable 'ansible_connection' from source: unknown 44071 1727204657.89466: variable 'ansible_module_compression' from source: unknown 44071 1727204657.89477: variable 'ansible_shell_type' from source: unknown 44071 1727204657.89480: variable 'ansible_shell_executable' from source: unknown 44071 1727204657.89483: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204657.89485: variable 'ansible_pipelining' from source: unknown 44071 1727204657.89487: variable 'ansible_timeout' from source: unknown 44071 1727204657.89490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204657.89552: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 44071 1727204657.89574: variable 'omit' from source: magic vars 44071 1727204657.89585: starting attempt loop 44071 1727204657.89589: running the handler 44071 1727204657.89612: handler run complete 44071 1727204657.89626: attempt loop complete, returning result 44071 1727204657.89629: _execute() done 44071 1727204657.89634: dumping result to json 44071 1727204657.89637: done dumping result, returning 44071 1727204657.89644: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-c964-7471-00000000127b] 44071 1727204657.89649: sending task result for task 127b8e07-fff9-c964-7471-00000000127b 44071 1727204657.89754: done sending task result for task 127b8e07-fff9-c964-7471-00000000127b 44071 1727204657.89757: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 44071 1727204657.89847: no more pending results, returning what we have 44071 1727204657.89851: results queue empty 44071 1727204657.89851: checking for any_errors_fatal 44071 1727204657.89861: done checking for any_errors_fatal 44071 1727204657.89861: checking for max_fail_percentage 44071 1727204657.89863: done checking for max_fail_percentage 44071 1727204657.89864: checking to see if all hosts have failed and the running result is not ok 44071 1727204657.89864: done checking to see if all hosts have failed 44071 1727204657.89868: getting the remaining hosts for this loop 44071 1727204657.89869: done getting the remaining hosts for this loop 44071 1727204657.89874: getting the next task for host managed-node2 44071 1727204657.89883: done getting next task for host managed-node2 44071 1727204657.89887: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204657.89892: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204657.89905: getting variables 44071 1727204657.89907: in VariableManager get_vars() 44071 1727204657.89948: Calling all_inventory to load vars for managed-node2 44071 1727204657.89951: Calling groups_inventory to load vars for managed-node2 44071 1727204657.89953: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204657.89963: Calling all_plugins_play to load vars for managed-node2 44071 1727204657.89975: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204657.89980: Calling groups_plugins_play to load vars for managed-node2 44071 1727204657.91328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204657.93162: done with get_vars() 44071 1727204657.93207: done getting variables 44071 1727204657.93290: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:04:17 -0400 (0:00:00.055) 0:01:10.249 ***** 44071 1727204657.93342: entering _queue_task() for managed-node2/fail 44071 1727204657.93759: worker is 1 (out of 1 available) 44071 1727204657.93778: exiting _queue_task() for managed-node2/fail 44071 1727204657.93794: done queuing things up, now waiting for results queue to drain 44071 1727204657.93795: waiting for pending results... 
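This fail guard at main.yml:11 is skipped further down because network_state != {} evaluates to False (network_state comes from the role defaults). A hedged sketch consistent with that trace; the provider check and the message wording are assumptions, since short-circuiting means only the first condition appears in the log:

- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Cannot use the network_state variable with the initscripts provider  # assumed wording
  when:
    - network_state != {}                  # logged false_condition
    - network_provider == "initscripts"    # assumed second guard, not reached in this run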
44071 1727204657.94150: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204657.94350: in run() - task 127b8e07-fff9-c964-7471-00000000127c 44071 1727204657.94380: variable 'ansible_search_path' from source: unknown 44071 1727204657.94389: variable 'ansible_search_path' from source: unknown 44071 1727204657.94439: calling self._execute() 44071 1727204657.94592: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204657.94611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204657.94620: variable 'omit' from source: magic vars 44071 1727204657.95064: variable 'ansible_distribution_major_version' from source: facts 44071 1727204657.95080: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204657.95251: variable 'network_state' from source: role '' defaults 44071 1727204657.95255: Evaluated conditional (network_state != {}): False 44071 1727204657.95258: when evaluation is False, skipping this task 44071 1727204657.95260: _execute() done 44071 1727204657.95265: dumping result to json 44071 1727204657.95268: done dumping result, returning 44071 1727204657.95271: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-c964-7471-00000000127c] 44071 1727204657.95273: sending task result for task 127b8e07-fff9-c964-7471-00000000127c 44071 1727204657.95421: done sending task result for task 127b8e07-fff9-c964-7471-00000000127c 44071 1727204657.95424: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204657.95486: no more pending results, returning what we have 44071 1727204657.95490: results queue empty 44071 1727204657.95491: checking for any_errors_fatal 44071 1727204657.95506: done checking for any_errors_fatal 44071 1727204657.95507: checking for max_fail_percentage 44071 1727204657.95509: done checking for max_fail_percentage 44071 1727204657.95510: checking to see if all hosts have failed and the running result is not ok 44071 1727204657.95511: done checking to see if all hosts have failed 44071 1727204657.95511: getting the remaining hosts for this loop 44071 1727204657.95513: done getting the remaining hosts for this loop 44071 1727204657.95517: getting the next task for host managed-node2 44071 1727204657.95525: done getting next task for host managed-node2 44071 1727204657.95535: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204657.95541: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204657.95566: getting variables 44071 1727204657.95568: in VariableManager get_vars() 44071 1727204657.95608: Calling all_inventory to load vars for managed-node2 44071 1727204657.95611: Calling groups_inventory to load vars for managed-node2 44071 1727204657.95614: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204657.95629: Calling all_plugins_play to load vars for managed-node2 44071 1727204657.95633: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204657.95637: Calling groups_plugins_play to load vars for managed-node2 44071 1727204657.97249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204657.99418: done with get_vars() 44071 1727204657.99462: done getting variables 44071 1727204657.99531: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:04:17 -0400 (0:00:00.062) 0:01:10.312 ***** 44071 1727204657.99571: entering _queue_task() for managed-node2/fail 44071 1727204657.99980: worker is 1 (out of 1 available) 44071 1727204657.99995: exiting _queue_task() for managed-node2/fail 44071 1727204658.00011: done queuing things up, now waiting for results queue to drain 44071 1727204658.00012: waiting for pending results... 
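The guard at main.yml:18 follows the same pattern for hosts older than version 8; the trace again records only network_state != {} as the false condition, so the version comparison below is an assumption implied by the task name:

- name: Abort applying the network state configuration if the system version of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying network_state requires a managed host running version 8 or later  # assumed wording
  when:
    - network_state != {}                            # logged false_condition
    - ansible_distribution_major_version | int < 8   # assumed, implied by the task name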
44071 1727204658.00483: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204658.00552: in run() - task 127b8e07-fff9-c964-7471-00000000127d 44071 1727204658.00610: variable 'ansible_search_path' from source: unknown 44071 1727204658.00614: variable 'ansible_search_path' from source: unknown 44071 1727204658.00643: calling self._execute() 44071 1727204658.00769: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204658.00791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204658.00826: variable 'omit' from source: magic vars 44071 1727204658.01285: variable 'ansible_distribution_major_version' from source: facts 44071 1727204658.01304: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204658.01462: variable 'network_state' from source: role '' defaults 44071 1727204658.01552: Evaluated conditional (network_state != {}): False 44071 1727204658.01556: when evaluation is False, skipping this task 44071 1727204658.01558: _execute() done 44071 1727204658.01560: dumping result to json 44071 1727204658.01562: done dumping result, returning 44071 1727204658.01567: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-c964-7471-00000000127d] 44071 1727204658.01570: sending task result for task 127b8e07-fff9-c964-7471-00000000127d 44071 1727204658.01882: done sending task result for task 127b8e07-fff9-c964-7471-00000000127d 44071 1727204658.01887: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204658.01935: no more pending results, returning what we have 44071 1727204658.01939: results queue empty 44071 1727204658.01940: checking for any_errors_fatal 44071 1727204658.01947: done checking for any_errors_fatal 44071 1727204658.01948: checking for max_fail_percentage 44071 1727204658.01950: done checking for max_fail_percentage 44071 1727204658.01951: checking to see if all hosts have failed and the running result is not ok 44071 1727204658.01952: done checking to see if all hosts have failed 44071 1727204658.01952: getting the remaining hosts for this loop 44071 1727204658.01954: done getting the remaining hosts for this loop 44071 1727204658.01958: getting the next task for host managed-node2 44071 1727204658.01967: done getting next task for host managed-node2 44071 1727204658.01972: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204658.01977: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204658.02000: getting variables 44071 1727204658.02002: in VariableManager get_vars() 44071 1727204658.02042: Calling all_inventory to load vars for managed-node2 44071 1727204658.02044: Calling groups_inventory to load vars for managed-node2 44071 1727204658.02047: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204658.02058: Calling all_plugins_play to load vars for managed-node2 44071 1727204658.02061: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204658.02064: Calling groups_plugins_play to load vars for managed-node2 44071 1727204658.03838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204658.06249: done with get_vars() 44071 1727204658.06286: done getting variables 44071 1727204658.06368: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:04:18 -0400 (0:00:00.068) 0:01:10.380 ***** 44071 1727204658.06412: entering _queue_task() for managed-node2/fail 44071 1727204658.06848: worker is 1 (out of 1 available) 44071 1727204658.06862: exiting _queue_task() for managed-node2/fail 44071 1727204658.07078: done queuing things up, now waiting for results queue to drain 44071 1727204658.07080: waiting for pending results... 
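For the teaming guard at main.yml:25 the trace evaluates ansible_distribution_major_version | int > 9 (True on this node) and ansible_distribution in __network_rh_distros (False, hence the skip). A sketch built only from those logged conditionals; the real task presumably also checks that network_connections actually defines a team interface, and the message wording is invented:

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later  # assumed wording
  when:
    - ansible_distribution_major_version | int > 9    # logged, evaluated True
    - ansible_distribution in __network_rh_distros    # logged, evaluated False, so the task is skipped
    # a check for team interfaces in network_connections is presumably also present, omitted here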
44071 1727204658.07233: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204658.07472: in run() - task 127b8e07-fff9-c964-7471-00000000127e 44071 1727204658.07475: variable 'ansible_search_path' from source: unknown 44071 1727204658.07479: variable 'ansible_search_path' from source: unknown 44071 1727204658.07504: calling self._execute() 44071 1727204658.07621: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204658.07641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204658.07669: variable 'omit' from source: magic vars 44071 1727204658.08121: variable 'ansible_distribution_major_version' from source: facts 44071 1727204658.08141: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204658.08375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204658.10922: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204658.11015: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204658.11070: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204658.11124: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204658.11158: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204658.11261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204658.11323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204658.11470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204658.11474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204658.11477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204658.11542: variable 'ansible_distribution_major_version' from source: facts 44071 1727204658.11571: Evaluated conditional (ansible_distribution_major_version | int > 9): True 44071 1727204658.11721: variable 'ansible_distribution' from source: facts 44071 1727204658.11734: variable '__network_rh_distros' from source: role '' defaults 44071 1727204658.11752: Evaluated conditional (ansible_distribution in __network_rh_distros): False 44071 1727204658.11764: when evaluation is False, skipping this task 44071 1727204658.11774: _execute() done 44071 1727204658.11782: dumping result to json 44071 1727204658.11791: done dumping result, returning 44071 1727204658.11811: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-c964-7471-00000000127e] 44071 1727204658.11823: sending task result for task 127b8e07-fff9-c964-7471-00000000127e skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 44071 1727204658.11999: no more pending results, returning what we have 44071 1727204658.12004: results queue empty 44071 1727204658.12005: checking for any_errors_fatal 44071 1727204658.12014: done checking for any_errors_fatal 44071 1727204658.12015: checking for max_fail_percentage 44071 1727204658.12017: done checking for max_fail_percentage 44071 1727204658.12018: checking to see if all hosts have failed and the running result is not ok 44071 1727204658.12019: done checking to see if all hosts have failed 44071 1727204658.12020: getting the remaining hosts for this loop 44071 1727204658.12021: done getting the remaining hosts for this loop 44071 1727204658.12027: getting the next task for host managed-node2 44071 1727204658.12037: done getting next task for host managed-node2 44071 1727204658.12042: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204658.12049: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204658.12080: getting variables 44071 1727204658.12082: in VariableManager get_vars() 44071 1727204658.12129: Calling all_inventory to load vars for managed-node2 44071 1727204658.12132: Calling groups_inventory to load vars for managed-node2 44071 1727204658.12134: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204658.12147: Calling all_plugins_play to load vars for managed-node2 44071 1727204658.12151: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204658.12155: Calling groups_plugins_play to load vars for managed-node2 44071 1727204658.13184: done sending task result for task 127b8e07-fff9-c964-7471-00000000127e 44071 1727204658.13188: WORKER PROCESS EXITING 44071 1727204658.14326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204658.16514: done with get_vars() 44071 1727204658.16564: done getting variables 44071 1727204658.16648: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:04:18 -0400 (0:00:00.102) 0:01:10.483 ***** 44071 1727204658.16690: entering _queue_task() for managed-node2/dnf 44071 1727204658.17211: worker is 1 (out of 1 available) 44071 1727204658.17226: exiting _queue_task() for managed-node2/dnf 44071 1727204658.17239: done queuing things up, now waiting for results queue to drain 44071 1727204658.17241: waiting for pending results... 
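The dnf check at main.yml:36 is gated, per the trace, on ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7 (True) and on __network_wireless_connections_defined or __network_team_connections_defined (False here, which skips it). A rough, hypothetical sketch of such a check; the package list, the check_mode usage and the register name are illustrative assumptions, not taken from the collection:

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"   # hypothetical: whatever package set the role would install
    state: latest
  check_mode: true
  register: __network_dnf_check      # hypothetical name
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7   # logged, True
    - __network_wireless_connections_defined or __network_team_connections_defined       # logged, False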
44071 1727204658.17505: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204658.17709: in run() - task 127b8e07-fff9-c964-7471-00000000127f 44071 1727204658.17734: variable 'ansible_search_path' from source: unknown 44071 1727204658.17744: variable 'ansible_search_path' from source: unknown 44071 1727204658.17795: calling self._execute() 44071 1727204658.17921: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204658.17936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204658.17957: variable 'omit' from source: magic vars 44071 1727204658.18428: variable 'ansible_distribution_major_version' from source: facts 44071 1727204658.18449: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204658.18688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204658.21788: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204658.21871: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204658.21918: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204658.21970: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204658.22006: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204658.22112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204658.22151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204658.22193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204658.22246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204658.22269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204658.22420: variable 'ansible_distribution' from source: facts 44071 1727204658.22431: variable 'ansible_distribution_major_version' from source: facts 44071 1727204658.22446: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 44071 1727204658.22583: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204658.22742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204658.22776: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204658.22833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204658.22864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204658.22888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204658.23047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204658.23051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204658.23053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204658.23057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204658.23073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204658.23124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204658.23153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204658.23191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204658.23239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204658.23260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204658.23448: variable 'network_connections' from source: include params 44071 1727204658.23468: variable 'interface' from source: play vars 44071 1727204658.23549: variable 'interface' from source: play vars 44071 1727204658.23672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204658.23882: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204658.23932: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204658.23972: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204658.24015: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204658.24148: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204658.24152: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204658.24167: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204658.24192: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204658.24262: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204658.24571: variable 'network_connections' from source: include params 44071 1727204658.24590: variable 'interface' from source: play vars 44071 1727204658.24676: variable 'interface' from source: play vars 44071 1727204658.24714: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204658.24726: when evaluation is False, skipping this task 44071 1727204658.24733: _execute() done 44071 1727204658.24799: dumping result to json 44071 1727204658.24802: done dumping result, returning 44071 1727204658.24805: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-00000000127f] 44071 1727204658.24807: sending task result for task 127b8e07-fff9-c964-7471-00000000127f skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204658.24972: no more pending results, returning what we have 44071 1727204658.24977: results queue empty 44071 1727204658.24978: checking for any_errors_fatal 44071 1727204658.24985: done checking for any_errors_fatal 44071 1727204658.24986: checking for max_fail_percentage 44071 1727204658.24988: done checking for max_fail_percentage 44071 1727204658.24989: checking to see if all hosts have failed and the running result is not ok 44071 1727204658.24990: done checking to see if all hosts have failed 44071 1727204658.24991: getting the remaining hosts for this loop 44071 1727204658.24993: done getting the remaining hosts for this loop 44071 1727204658.24998: getting the next task for host managed-node2 44071 1727204658.25009: done getting next task for host managed-node2 44071 1727204658.25014: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204658.25020: ^ state is: 
HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204658.25050: getting variables 44071 1727204658.25052: in VariableManager get_vars() 44071 1727204658.25412: Calling all_inventory to load vars for managed-node2 44071 1727204658.25416: Calling groups_inventory to load vars for managed-node2 44071 1727204658.25418: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204658.25431: Calling all_plugins_play to load vars for managed-node2 44071 1727204658.25434: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204658.25438: Calling groups_plugins_play to load vars for managed-node2 44071 1727204658.26084: done sending task result for task 127b8e07-fff9-c964-7471-00000000127f 44071 1727204658.26090: WORKER PROCESS EXITING 44071 1727204658.27577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204658.29772: done with get_vars() 44071 1727204658.29817: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204658.29905: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:04:18 -0400 (0:00:00.132) 0:01:10.615 ***** 44071 1727204658.29939: entering _queue_task() for managed-node2/yum 44071 1727204658.30353: worker is 1 (out of 1 available) 44071 1727204658.30574: exiting _queue_task() for managed-node2/yum 44071 1727204658.30588: done queuing things up, now waiting for results queue to drain 44071 1727204658.30590: waiting for pending results... 
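The YUM counterpart at main.yml:48 is skipped because ansible_distribution_major_version | int < 8 is False, and core 2.17 transparently redirects ansible.builtin.yum to ansible.builtin.dnf, as the trace notes. A hypothetical sketch mirroring the DNF variant above under the same assumptions; only the version condition is confirmed by the log:

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ network_packages }}"   # hypothetical package set
    state: latest
  check_mode: true
  register: __network_yum_check      # hypothetical name
  when:
    - ansible_distribution_major_version | int < 8                                    # logged false_condition
    - __network_wireless_connections_defined or __network_team_connections_defined    # assumed, mirrors the DNF task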
44071 1727204658.30740: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204658.30955: in run() - task 127b8e07-fff9-c964-7471-000000001280 44071 1727204658.30981: variable 'ansible_search_path' from source: unknown 44071 1727204658.30990: variable 'ansible_search_path' from source: unknown 44071 1727204658.31040: calling self._execute() 44071 1727204658.31159: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204658.31175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204658.31188: variable 'omit' from source: magic vars 44071 1727204658.31644: variable 'ansible_distribution_major_version' from source: facts 44071 1727204658.31669: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204658.31888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204658.34624: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204658.34708: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204658.34757: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204658.34803: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204658.34836: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204658.34930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204658.34988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204658.35020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204658.35074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204658.35094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204658.35207: variable 'ansible_distribution_major_version' from source: facts 44071 1727204658.35233: Evaluated conditional (ansible_distribution_major_version | int < 8): False 44071 1727204658.35241: when evaluation is False, skipping this task 44071 1727204658.35249: _execute() done 44071 1727204658.35257: dumping result to json 44071 1727204658.35264: done dumping result, returning 44071 1727204658.35283: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000001280] 44071 
1727204658.35293: sending task result for task 127b8e07-fff9-c964-7471-000000001280 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 44071 1727204658.35476: no more pending results, returning what we have 44071 1727204658.35481: results queue empty 44071 1727204658.35482: checking for any_errors_fatal 44071 1727204658.35491: done checking for any_errors_fatal 44071 1727204658.35492: checking for max_fail_percentage 44071 1727204658.35494: done checking for max_fail_percentage 44071 1727204658.35495: checking to see if all hosts have failed and the running result is not ok 44071 1727204658.35496: done checking to see if all hosts have failed 44071 1727204658.35496: getting the remaining hosts for this loop 44071 1727204658.35498: done getting the remaining hosts for this loop 44071 1727204658.35503: getting the next task for host managed-node2 44071 1727204658.35512: done getting next task for host managed-node2 44071 1727204658.35517: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204658.35524: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204658.35554: getting variables 44071 1727204658.35556: in VariableManager get_vars() 44071 1727204658.35603: Calling all_inventory to load vars for managed-node2 44071 1727204658.35606: Calling groups_inventory to load vars for managed-node2 44071 1727204658.35608: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204658.35622: Calling all_plugins_play to load vars for managed-node2 44071 1727204658.35625: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204658.35628: Calling groups_plugins_play to load vars for managed-node2 44071 1727204658.36484: done sending task result for task 127b8e07-fff9-c964-7471-000000001280 44071 1727204658.36488: WORKER PROCESS EXITING 44071 1727204658.37852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204658.40306: done with get_vars() 44071 1727204658.40346: done getting variables 44071 1727204658.40415: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:04:18 -0400 (0:00:00.105) 0:01:10.721 ***** 44071 1727204658.40457: entering _queue_task() for managed-node2/fail 44071 1727204658.40991: worker is 1 (out of 1 available) 44071 1727204658.41005: exiting _queue_task() for managed-node2/fail 44071 1727204658.41019: done queuing things up, now waiting for results queue to drain 44071 1727204658.41021: waiting for pending results... 
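The consent task at main.yml:60 is a fail action whose logged false condition is __network_wireless_connections_defined or __network_team_connections_defined. A hedged sketch; the message wording is invented, and the real task presumably also checks whether a NetworkManager-restarting update was actually found and whether the user pre-approved the restart:

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    # illustrative wording only
    msg: Installing wireless or team support may update and restart NetworkManager; re-run after granting explicit consent.
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined   # logged false_condition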
44071 1727204658.41280: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204658.41470: in run() - task 127b8e07-fff9-c964-7471-000000001281 44071 1727204658.41496: variable 'ansible_search_path' from source: unknown 44071 1727204658.41506: variable 'ansible_search_path' from source: unknown 44071 1727204658.41550: calling self._execute() 44071 1727204658.41773: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204658.41778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204658.41781: variable 'omit' from source: magic vars 44071 1727204658.42155: variable 'ansible_distribution_major_version' from source: facts 44071 1727204658.42178: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204658.42320: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204658.42555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204658.45637: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204658.45855: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204658.45958: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204658.46063: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204658.46164: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204658.46385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204658.46448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204658.46673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204658.46676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204658.46679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204658.46819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204658.46855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204658.46924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204658.47221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204658.47225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204658.47227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204658.47371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204658.47384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204658.47439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204658.47672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204658.47891: variable 'network_connections' from source: include params 44071 1727204658.47992: variable 'interface' from source: play vars 44071 1727204658.48159: variable 'interface' from source: play vars 44071 1727204658.48321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204658.48976: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204658.49038: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204658.49090: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204658.49206: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204658.49281: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204658.49400: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204658.49473: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204658.49583: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204658.49653: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204658.50542: variable 'network_connections' 
from source: include params 44071 1727204658.50546: variable 'interface' from source: play vars 44071 1727204658.50576: variable 'interface' from source: play vars 44071 1727204658.50726: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204658.50737: when evaluation is False, skipping this task 44071 1727204658.50744: _execute() done 44071 1727204658.50757: dumping result to json 44071 1727204658.50768: done dumping result, returning 44071 1727204658.50973: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000001281] 44071 1727204658.50977: sending task result for task 127b8e07-fff9-c964-7471-000000001281 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204658.51333: no more pending results, returning what we have 44071 1727204658.51337: results queue empty 44071 1727204658.51338: checking for any_errors_fatal 44071 1727204658.51348: done checking for any_errors_fatal 44071 1727204658.51349: checking for max_fail_percentage 44071 1727204658.51351: done checking for max_fail_percentage 44071 1727204658.51352: checking to see if all hosts have failed and the running result is not ok 44071 1727204658.51353: done checking to see if all hosts have failed 44071 1727204658.51354: getting the remaining hosts for this loop 44071 1727204658.51356: done getting the remaining hosts for this loop 44071 1727204658.51361: getting the next task for host managed-node2 44071 1727204658.51373: done getting next task for host managed-node2 44071 1727204658.51379: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 44071 1727204658.51385: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204658.51417: getting variables 44071 1727204658.51419: in VariableManager get_vars() 44071 1727204658.51772: Calling all_inventory to load vars for managed-node2 44071 1727204658.51776: Calling groups_inventory to load vars for managed-node2 44071 1727204658.51780: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204658.51794: Calling all_plugins_play to load vars for managed-node2 44071 1727204658.51798: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204658.51802: Calling groups_plugins_play to load vars for managed-node2 44071 1727204658.52491: done sending task result for task 127b8e07-fff9-c964-7471-000000001281 44071 1727204658.52496: WORKER PROCESS EXITING 44071 1727204658.54170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204658.56374: done with get_vars() 44071 1727204658.56417: done getting variables 44071 1727204658.56488: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:04:18 -0400 (0:00:00.160) 0:01:10.881 ***** 44071 1727204658.56530: entering _queue_task() for managed-node2/package 44071 1727204658.56939: worker is 1 (out of 1 available) 44071 1727204658.56953: exiting _queue_task() for managed-node2/package 44071 1727204658.57072: done queuing things up, now waiting for results queue to drain 44071 1727204658.57074: waiting for pending results... 
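The Install packages task at main.yml:73 uses the generic package action, and the variable trace that follows shows it resolving network_packages from the role defaults (via __network_provider_setup and the __network_packages_default_nm list for the NetworkManager provider). A minimal sketch of such a task; any additional guards the role applies, for example skipping when everything is already installed, are not shown:

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"   # resolved from the role defaults, per the trace
    state: present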
44071 1727204658.57303: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 44071 1727204658.57489: in run() - task 127b8e07-fff9-c964-7471-000000001282 44071 1727204658.57515: variable 'ansible_search_path' from source: unknown 44071 1727204658.57527: variable 'ansible_search_path' from source: unknown 44071 1727204658.57580: calling self._execute() 44071 1727204658.57700: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204658.57714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204658.57730: variable 'omit' from source: magic vars 44071 1727204658.58176: variable 'ansible_distribution_major_version' from source: facts 44071 1727204658.58198: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204658.58430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204658.58743: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204658.58804: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204658.58853: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204658.58956: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204658.59104: variable 'network_packages' from source: role '' defaults 44071 1727204658.59235: variable '__network_provider_setup' from source: role '' defaults 44071 1727204658.59365: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204658.59369: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204658.59375: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204658.59415: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204658.59635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204658.62406: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204658.62487: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204658.62537: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204658.62578: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204658.62631: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204658.62704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204658.62743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204658.62777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204658.62847: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204658.62851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204658.62900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204658.62928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204658.63170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204658.63173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204658.63176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204658.63279: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204658.63414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204658.63443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204658.63474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204658.63524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204658.63546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204658.63656: variable 'ansible_python' from source: facts 44071 1727204658.63683: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204658.63781: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204658.63876: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204658.64020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204658.64055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 44071 1727204658.64089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204658.64135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204658.64159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204658.64218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204658.64252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204658.64285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204658.64324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204658.64339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204658.64494: variable 'network_connections' from source: include params 44071 1727204658.64506: variable 'interface' from source: play vars 44071 1727204658.64631: variable 'interface' from source: play vars 44071 1727204658.64728: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204658.64763: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204658.64809: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204658.64872: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204658.64914: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204658.65259: variable 'network_connections' from source: include params 44071 1727204658.65357: variable 'interface' from source: play vars 44071 1727204658.65393: variable 'interface' from source: play vars 44071 1727204658.65433: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204658.65530: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204658.65913: variable 'network_connections' from source: include params 44071 
1727204658.65924: variable 'interface' from source: play vars 44071 1727204658.65995: variable 'interface' from source: play vars 44071 1727204658.66030: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204658.66120: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204658.66479: variable 'network_connections' from source: include params 44071 1727204658.66489: variable 'interface' from source: play vars 44071 1727204658.66563: variable 'interface' from source: play vars 44071 1727204658.66625: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204658.66696: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204658.66709: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204658.66875: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204658.67036: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204658.67599: variable 'network_connections' from source: include params 44071 1727204658.67610: variable 'interface' from source: play vars 44071 1727204658.67693: variable 'interface' from source: play vars 44071 1727204658.67755: variable 'ansible_distribution' from source: facts 44071 1727204658.67758: variable '__network_rh_distros' from source: role '' defaults 44071 1727204658.67761: variable 'ansible_distribution_major_version' from source: facts 44071 1727204658.67763: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204658.67913: variable 'ansible_distribution' from source: facts 44071 1727204658.67922: variable '__network_rh_distros' from source: role '' defaults 44071 1727204658.67930: variable 'ansible_distribution_major_version' from source: facts 44071 1727204658.67940: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204658.68122: variable 'ansible_distribution' from source: facts 44071 1727204658.68130: variable '__network_rh_distros' from source: role '' defaults 44071 1727204658.68139: variable 'ansible_distribution_major_version' from source: facts 44071 1727204658.68178: variable 'network_provider' from source: set_fact 44071 1727204658.68299: variable 'ansible_facts' from source: unknown 44071 1727204658.69098: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 44071 1727204658.69107: when evaluation is False, skipping this task 44071 1727204658.69114: _execute() done 44071 1727204658.69122: dumping result to json 44071 1727204658.69129: done dumping result, returning 44071 1727204658.69142: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-c964-7471-000000001282] 44071 1727204658.69151: sending task result for task 127b8e07-fff9-c964-7471-000000001282 44071 1727204658.69386: done sending task result for task 127b8e07-fff9-c964-7471-000000001282 44071 1727204658.69390: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 44071 1727204658.69449: no more pending results, returning what we have 44071 1727204658.69454: results queue empty 44071 1727204658.69455: checking for any_errors_fatal 44071 1727204658.69468: done checking for 
any_errors_fatal 44071 1727204658.69469: checking for max_fail_percentage 44071 1727204658.69471: done checking for max_fail_percentage 44071 1727204658.69472: checking to see if all hosts have failed and the running result is not ok 44071 1727204658.69473: done checking to see if all hosts have failed 44071 1727204658.69474: getting the remaining hosts for this loop 44071 1727204658.69476: done getting the remaining hosts for this loop 44071 1727204658.69481: getting the next task for host managed-node2 44071 1727204658.69492: done getting next task for host managed-node2 44071 1727204658.69497: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204658.69503: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204658.69534: getting variables 44071 1727204658.69536: in VariableManager get_vars() 44071 1727204658.69796: Calling all_inventory to load vars for managed-node2 44071 1727204658.69800: Calling groups_inventory to load vars for managed-node2 44071 1727204658.69802: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204658.69816: Calling all_plugins_play to load vars for managed-node2 44071 1727204658.69819: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204658.69822: Calling groups_plugins_play to load vars for managed-node2 44071 1727204658.71940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204658.74197: done with get_vars() 44071 1727204658.74246: done getting variables 44071 1727204658.74319: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:04:18 -0400 (0:00:00.178) 0:01:11.060 ***** 44071 1727204658.74359: entering _queue_task() for managed-node2/package 44071 1727204658.74891: worker is 1 (out of 1 available) 44071 1727204658.74906: exiting _queue_task() for managed-node2/package 44071 1727204658.74918: done queuing things up, now waiting for results queue to drain 44071 1727204658.74920: waiting for pending results... 
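For orientation: the Install packages task above loads the generic 'package' action plugin and is guarded by the subset test reported in its false_condition, so it only runs when some member of network_packages is missing from the gathered package facts. A hedged sketch of a task with that shape (the when-expression is copied from the log; the module arguments are assumptions):

# Hypothetical tasks-file fragment; only the when-expression is verbatim from the log.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())

Because every required package already appears in ansible_facts.packages on managed-node2, the conditional is False and the task is skipped without any package-manager call on the node.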
44071 1727204658.75189: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204658.75475: in run() - task 127b8e07-fff9-c964-7471-000000001283 44071 1727204658.75479: variable 'ansible_search_path' from source: unknown 44071 1727204658.75483: variable 'ansible_search_path' from source: unknown 44071 1727204658.75485: calling self._execute() 44071 1727204658.75606: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204658.75624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204658.75642: variable 'omit' from source: magic vars 44071 1727204658.76087: variable 'ansible_distribution_major_version' from source: facts 44071 1727204658.76108: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204658.76253: variable 'network_state' from source: role '' defaults 44071 1727204658.76272: Evaluated conditional (network_state != {}): False 44071 1727204658.76280: when evaluation is False, skipping this task 44071 1727204658.76287: _execute() done 44071 1727204658.76296: dumping result to json 44071 1727204658.76303: done dumping result, returning 44071 1727204658.76315: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-c964-7471-000000001283] 44071 1727204658.76325: sending task result for task 127b8e07-fff9-c964-7471-000000001283 44071 1727204658.76540: done sending task result for task 127b8e07-fff9-c964-7471-000000001283 44071 1727204658.76543: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204658.76615: no more pending results, returning what we have 44071 1727204658.76620: results queue empty 44071 1727204658.76621: checking for any_errors_fatal 44071 1727204658.76629: done checking for any_errors_fatal 44071 1727204658.76630: checking for max_fail_percentage 44071 1727204658.76633: done checking for max_fail_percentage 44071 1727204658.76634: checking to see if all hosts have failed and the running result is not ok 44071 1727204658.76635: done checking to see if all hosts have failed 44071 1727204658.76635: getting the remaining hosts for this loop 44071 1727204658.76637: done getting the remaining hosts for this loop 44071 1727204658.76643: getting the next task for host managed-node2 44071 1727204658.76653: done getting next task for host managed-node2 44071 1727204658.76659: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204658.76668: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204658.76701: getting variables 44071 1727204658.76703: in VariableManager get_vars() 44071 1727204658.76750: Calling all_inventory to load vars for managed-node2 44071 1727204658.76754: Calling groups_inventory to load vars for managed-node2 44071 1727204658.76756: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204658.76972: Calling all_plugins_play to load vars for managed-node2 44071 1727204658.76977: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204658.76981: Calling groups_plugins_play to load vars for managed-node2 44071 1727204658.78759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204658.81111: done with get_vars() 44071 1727204658.81148: done getting variables 44071 1727204658.81220: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:04:18 -0400 (0:00:00.068) 0:01:11.129 ***** 44071 1727204658.81261: entering _queue_task() for managed-node2/package 44071 1727204658.81776: worker is 1 (out of 1 available) 44071 1727204658.81791: exiting _queue_task() for managed-node2/package 44071 1727204658.81804: done queuing things up, now waiting for results queue to drain 44071 1727204658.81806: waiting for pending results... 
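For orientation: both the nmstate install above (main.yml:85) and the python3-libnmstate install just queued (main.yml:96) are gated on network_state != {}; since network_state keeps its role default of {} in this play, both end up skipped. A hypothetical sketch of the first of them (task name and condition from the log; the package list is only inferred from the task title, not from the role source):

# Hypothetical tasks-file fragment; package names inferred from the task title.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}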
44071 1727204658.82189: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204658.82249: in run() - task 127b8e07-fff9-c964-7471-000000001284 44071 1727204658.82278: variable 'ansible_search_path' from source: unknown 44071 1727204658.82290: variable 'ansible_search_path' from source: unknown 44071 1727204658.82341: calling self._execute() 44071 1727204658.82470: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204658.82484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204658.82505: variable 'omit' from source: magic vars 44071 1727204658.82954: variable 'ansible_distribution_major_version' from source: facts 44071 1727204658.83045: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204658.83123: variable 'network_state' from source: role '' defaults 44071 1727204658.83142: Evaluated conditional (network_state != {}): False 44071 1727204658.83155: when evaluation is False, skipping this task 44071 1727204658.83163: _execute() done 44071 1727204658.83174: dumping result to json 44071 1727204658.83181: done dumping result, returning 44071 1727204658.83196: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-c964-7471-000000001284] 44071 1727204658.83206: sending task result for task 127b8e07-fff9-c964-7471-000000001284 44071 1727204658.83405: done sending task result for task 127b8e07-fff9-c964-7471-000000001284 44071 1727204658.83409: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204658.83462: no more pending results, returning what we have 44071 1727204658.83469: results queue empty 44071 1727204658.83470: checking for any_errors_fatal 44071 1727204658.83479: done checking for any_errors_fatal 44071 1727204658.83479: checking for max_fail_percentage 44071 1727204658.83481: done checking for max_fail_percentage 44071 1727204658.83482: checking to see if all hosts have failed and the running result is not ok 44071 1727204658.83483: done checking to see if all hosts have failed 44071 1727204658.83483: getting the remaining hosts for this loop 44071 1727204658.83485: done getting the remaining hosts for this loop 44071 1727204658.83490: getting the next task for host managed-node2 44071 1727204658.83500: done getting next task for host managed-node2 44071 1727204658.83504: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204658.83510: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204658.83534: getting variables 44071 1727204658.83535: in VariableManager get_vars() 44071 1727204658.83577: Calling all_inventory to load vars for managed-node2 44071 1727204658.83581: Calling groups_inventory to load vars for managed-node2 44071 1727204658.83583: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204658.83597: Calling all_plugins_play to load vars for managed-node2 44071 1727204658.83600: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204658.83604: Calling groups_plugins_play to load vars for managed-node2 44071 1727204658.84659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204658.86400: done with get_vars() 44071 1727204658.86440: done getting variables 44071 1727204658.86512: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:04:18 -0400 (0:00:00.052) 0:01:11.181 ***** 44071 1727204658.86542: entering _queue_task() for managed-node2/service 44071 1727204658.86852: worker is 1 (out of 1 available) 44071 1727204658.86870: exiting _queue_task() for managed-node2/service 44071 1727204658.86884: done queuing things up, now waiting for results queue to drain 44071 1727204658.86886: waiting for pending results... 
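For orientation: at main.yml:109 the role switches from the 'package' to the 'service' action plugin, and the restart is tied to the same wireless/team flags seen earlier. A hedged sketch of a task with that shape (name and conditional from the log; the service name and target state are assumptions):

# Hypothetical tasks-file fragment; service arguments are assumed, not the role source.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined

As with the earlier consent prompt, both flags are false on this host, so the evaluation below ends in another skip.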
44071 1727204658.87103: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204658.87221: in run() - task 127b8e07-fff9-c964-7471-000000001285 44071 1727204658.87238: variable 'ansible_search_path' from source: unknown 44071 1727204658.87242: variable 'ansible_search_path' from source: unknown 44071 1727204658.87276: calling self._execute() 44071 1727204658.87369: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204658.87376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204658.87385: variable 'omit' from source: magic vars 44071 1727204658.87717: variable 'ansible_distribution_major_version' from source: facts 44071 1727204658.87729: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204658.87822: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204658.87976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204658.90731: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204658.90793: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204658.90823: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204658.90857: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204658.90879: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204658.90951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204658.90977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204658.90996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204658.91024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204658.91040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204658.91084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204658.91103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204658.91120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 44071 1727204658.91153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204658.91168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204658.91201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204658.91218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204658.91238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204658.91272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204658.91282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204658.91414: variable 'network_connections' from source: include params 44071 1727204658.91426: variable 'interface' from source: play vars 44071 1727204658.91487: variable 'interface' from source: play vars 44071 1727204658.91547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204658.91704: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204658.91733: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204658.91758: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204658.91783: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204658.91826: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204658.91846: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204658.91864: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204658.91885: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204658.91929: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204658.92209: variable 'network_connections' from source: include params 44071 1727204658.92215: variable 'interface' 
from source: play vars 44071 1727204658.92289: variable 'interface' from source: play vars 44071 1727204658.92359: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204658.92365: when evaluation is False, skipping this task 44071 1727204658.92368: _execute() done 44071 1727204658.92370: dumping result to json 44071 1727204658.92377: done dumping result, returning 44071 1727204658.92380: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000001285] 44071 1727204658.92383: sending task result for task 127b8e07-fff9-c964-7471-000000001285 44071 1727204658.92462: done sending task result for task 127b8e07-fff9-c964-7471-000000001285 44071 1727204658.92474: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204658.92553: no more pending results, returning what we have 44071 1727204658.92556: results queue empty 44071 1727204658.92557: checking for any_errors_fatal 44071 1727204658.92564: done checking for any_errors_fatal 44071 1727204658.92564: checking for max_fail_percentage 44071 1727204658.92567: done checking for max_fail_percentage 44071 1727204658.92569: checking to see if all hosts have failed and the running result is not ok 44071 1727204658.92569: done checking to see if all hosts have failed 44071 1727204658.92570: getting the remaining hosts for this loop 44071 1727204658.92572: done getting the remaining hosts for this loop 44071 1727204658.92577: getting the next task for host managed-node2 44071 1727204658.92587: done getting next task for host managed-node2 44071 1727204658.92591: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204658.92597: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204658.92621: getting variables 44071 1727204658.92623: in VariableManager get_vars() 44071 1727204658.92791: Calling all_inventory to load vars for managed-node2 44071 1727204658.92795: Calling groups_inventory to load vars for managed-node2 44071 1727204658.92799: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204658.92810: Calling all_plugins_play to load vars for managed-node2 44071 1727204658.92812: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204658.92816: Calling groups_plugins_play to load vars for managed-node2 44071 1727204658.94712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204658.95991: done with get_vars() 44071 1727204658.96023: done getting variables 44071 1727204658.96080: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:04:18 -0400 (0:00:00.095) 0:01:11.277 ***** 44071 1727204658.96107: entering _queue_task() for managed-node2/service 44071 1727204658.96403: worker is 1 (out of 1 available) 44071 1727204658.96418: exiting _queue_task() for managed-node2/service 44071 1727204658.96432: done queuing things up, now waiting for results queue to drain 44071 1727204658.96434: waiting for pending results... 44071 1727204658.96648: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204658.96773: in run() - task 127b8e07-fff9-c964-7471-000000001286 44071 1727204658.96817: variable 'ansible_search_path' from source: unknown 44071 1727204658.96872: variable 'ansible_search_path' from source: unknown 44071 1727204658.96875: calling self._execute() 44071 1727204658.97001: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204658.97014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204658.97028: variable 'omit' from source: magic vars 44071 1727204658.97497: variable 'ansible_distribution_major_version' from source: facts 44071 1727204658.97530: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204658.97746: variable 'network_provider' from source: set_fact 44071 1727204658.97751: variable 'network_state' from source: role '' defaults 44071 1727204658.97858: Evaluated conditional (network_provider == "nm" or network_state != {}): True 44071 1727204658.97862: variable 'omit' from source: magic vars 44071 1727204658.97867: variable 'omit' from source: magic vars 44071 1727204658.97907: variable 'network_service_name' from source: role '' defaults 44071 1727204658.97981: variable 'network_service_name' from source: role '' defaults 44071 1727204658.98085: variable '__network_provider_setup' from source: role '' defaults 44071 1727204658.98099: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204658.98144: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204658.98151: variable '__network_packages_default_nm' from source: role '' defaults 
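For orientation: unlike the previous tasks, the conditional for Enable and start NetworkManager (network_provider == "nm" or network_state != {}) evaluates True in the entries above, so the service action is actually prepared and an SSH session is opened further down. A hedged sketch of a task consistent with the variables the log resolves here (network_service_name, network_provider, network_state); the started/enabled arguments are assumptions:

# Hypothetical tasks-file fragment; module arguments assumed, names and when-expression from the log.
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}

The _low_level_execute_command() calls that follow ('echo ~' and the mkdir of an ansible-tmp directory) are the standard remote staging steps Ansible performs before copying and running the service module payload.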
44071 1727204658.98207: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204658.98382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204659.00130: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204659.00192: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204659.00221: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204659.00264: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204659.00289: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204659.00356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204659.00385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204659.00404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204659.00433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204659.00447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204659.00492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204659.00510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204659.00528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204659.00558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204659.00570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204659.00750: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204659.00849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204659.00869: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204659.00887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204659.00922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204659.00930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204659.01004: variable 'ansible_python' from source: facts 44071 1727204659.01020: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204659.01089: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204659.01153: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204659.01254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204659.01275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204659.01294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204659.01322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204659.01332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204659.01380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204659.01400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204659.01418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204659.01449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204659.01471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204659.01589: variable 'network_connections' from 
source: include params 44071 1727204659.01593: variable 'interface' from source: play vars 44071 1727204659.01649: variable 'interface' from source: play vars 44071 1727204659.01738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204659.01882: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204659.01925: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204659.01959: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204659.01996: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204659.02061: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204659.02086: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204659.02114: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204659.02140: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204659.02183: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204659.02401: variable 'network_connections' from source: include params 44071 1727204659.02407: variable 'interface' from source: play vars 44071 1727204659.02474: variable 'interface' from source: play vars 44071 1727204659.02502: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204659.02569: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204659.02778: variable 'network_connections' from source: include params 44071 1727204659.02783: variable 'interface' from source: play vars 44071 1727204659.02838: variable 'interface' from source: play vars 44071 1727204659.02854: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204659.02917: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204659.03127: variable 'network_connections' from source: include params 44071 1727204659.03130: variable 'interface' from source: play vars 44071 1727204659.03185: variable 'interface' from source: play vars 44071 1727204659.03234: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204659.03277: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204659.03284: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204659.03333: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204659.03483: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204659.03838: variable 'network_connections' from source: include params 44071 1727204659.03842: variable 'interface' from source: play vars 44071 1727204659.03893: variable 'interface' from 
source: play vars 44071 1727204659.03900: variable 'ansible_distribution' from source: facts 44071 1727204659.03903: variable '__network_rh_distros' from source: role '' defaults 44071 1727204659.03909: variable 'ansible_distribution_major_version' from source: facts 44071 1727204659.03921: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204659.04049: variable 'ansible_distribution' from source: facts 44071 1727204659.04053: variable '__network_rh_distros' from source: role '' defaults 44071 1727204659.04058: variable 'ansible_distribution_major_version' from source: facts 44071 1727204659.04065: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204659.04194: variable 'ansible_distribution' from source: facts 44071 1727204659.04198: variable '__network_rh_distros' from source: role '' defaults 44071 1727204659.04200: variable 'ansible_distribution_major_version' from source: facts 44071 1727204659.04234: variable 'network_provider' from source: set_fact 44071 1727204659.04251: variable 'omit' from source: magic vars 44071 1727204659.04278: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204659.04305: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204659.04322: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204659.04337: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204659.04347: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204659.04374: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204659.04378: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204659.04381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204659.04459: Set connection var ansible_connection to ssh 44071 1727204659.04465: Set connection var ansible_timeout to 10 44071 1727204659.04472: Set connection var ansible_pipelining to False 44071 1727204659.04478: Set connection var ansible_shell_type to sh 44071 1727204659.04483: Set connection var ansible_shell_executable to /bin/sh 44071 1727204659.04490: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204659.04513: variable 'ansible_shell_executable' from source: unknown 44071 1727204659.04516: variable 'ansible_connection' from source: unknown 44071 1727204659.04519: variable 'ansible_module_compression' from source: unknown 44071 1727204659.04521: variable 'ansible_shell_type' from source: unknown 44071 1727204659.04524: variable 'ansible_shell_executable' from source: unknown 44071 1727204659.04528: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204659.04530: variable 'ansible_pipelining' from source: unknown 44071 1727204659.04536: variable 'ansible_timeout' from source: unknown 44071 1727204659.04538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204659.04625: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204659.04636: variable 'omit' from source: magic vars 44071 1727204659.04639: starting attempt loop 44071 1727204659.04641: running the handler 44071 1727204659.04711: variable 'ansible_facts' from source: unknown 44071 1727204659.05521: _low_level_execute_command(): starting 44071 1727204659.05525: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204659.06084: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204659.06090: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204659.06094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204659.06145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204659.06151: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204659.06153: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204659.06227: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204659.08018: stdout chunk (state=3): >>>/root <<< 44071 1727204659.08116: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204659.08181: stderr chunk (state=3): >>><<< 44071 1727204659.08185: stdout chunk (state=3): >>><<< 44071 1727204659.08210: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 44071 1727204659.08218: _low_level_execute_command(): starting 44071 1727204659.08225: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204659.0820556-47798-148227689220671 `" && echo ansible-tmp-1727204659.0820556-47798-148227689220671="` echo /root/.ansible/tmp/ansible-tmp-1727204659.0820556-47798-148227689220671 `" ) && sleep 0' 44071 1727204659.08738: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204659.08742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204659.08745: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204659.08749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204659.08751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204659.08807: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204659.08811: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204659.08818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204659.08894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204659.10896: stdout chunk (state=3): >>>ansible-tmp-1727204659.0820556-47798-148227689220671=/root/.ansible/tmp/ansible-tmp-1727204659.0820556-47798-148227689220671 <<< 44071 1727204659.10998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204659.11067: stderr chunk (state=3): >>><<< 44071 1727204659.11070: stdout chunk (state=3): >>><<< 44071 1727204659.11086: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204659.0820556-47798-148227689220671=/root/.ansible/tmp/ansible-tmp-1727204659.0820556-47798-148227689220671 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204659.11116: variable 'ansible_module_compression' from source: unknown 44071 1727204659.11161: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 44071 1727204659.11215: variable 'ansible_facts' from source: unknown 44071 1727204659.11357: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204659.0820556-47798-148227689220671/AnsiballZ_systemd.py 44071 1727204659.11480: Sending initial data 44071 1727204659.11484: Sent initial data (156 bytes) 44071 1727204659.11984: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204659.11988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204659.11997: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204659.12000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204659.12056: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204659.12060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204659.12064: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204659.12139: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204659.13755: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204659.13818: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204659.13890: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp88t6obtr /root/.ansible/tmp/ansible-tmp-1727204659.0820556-47798-148227689220671/AnsiballZ_systemd.py <<< 44071 1727204659.13897: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204659.0820556-47798-148227689220671/AnsiballZ_systemd.py" <<< 44071 1727204659.13961: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp88t6obtr" to remote "/root/.ansible/tmp/ansible-tmp-1727204659.0820556-47798-148227689220671/AnsiballZ_systemd.py" <<< 44071 1727204659.13964: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204659.0820556-47798-148227689220671/AnsiballZ_systemd.py" <<< 44071 1727204659.15239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204659.15314: stderr chunk (state=3): >>><<< 44071 1727204659.15318: stdout chunk (state=3): >>><<< 44071 1727204659.15338: done transferring module to remote 44071 1727204659.15349: _low_level_execute_command(): starting 44071 1727204659.15356: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204659.0820556-47798-148227689220671/ /root/.ansible/tmp/ansible-tmp-1727204659.0820556-47798-148227689220671/AnsiballZ_systemd.py && sleep 0' 44071 1727204659.15871: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204659.15875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204659.15878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204659.15880: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204659.15885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204659.15926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204659.15953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204659.16020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204659.17870: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204659.17915: stderr chunk (state=3): >>><<< 44071 1727204659.17918: stdout chunk (state=3): >>><<< 44071 1727204659.17931: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204659.17937: _low_level_execute_command(): starting 44071 1727204659.17943: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204659.0820556-47798-148227689220671/AnsiballZ_systemd.py && sleep 0' 44071 1727204659.18455: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204659.18459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204659.18461: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204659.18464: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204659.18521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204659.18526: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204659.18545: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204659.18622: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204659.50737: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", 
"FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4591616", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3518767104", "CPUUsageNSec": "1560431000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitC<<< 44071 1727204659.50745: stdout chunk (state=3): >>>ORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": 
"65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": 
"multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext":<<< 44071 1727204659.50756: stdout chunk (state=3): >>> "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 44071 1727204659.52747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204659.52814: stderr chunk (state=3): >>><<< 44071 1727204659.52818: stdout chunk (state=3): >>><<< 44071 1727204659.52840: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4591616", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3518767104", "CPUUsageNSec": "1560431000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": 
"infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204659.53145: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204659.0820556-47798-148227689220671/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204659.53148: _low_level_execute_command(): starting 44071 1727204659.53151: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204659.0820556-47798-148227689220671/ > /dev/null 2>&1 && sleep 0' 44071 1727204659.53738: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204659.53758: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204659.53777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204659.53800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204659.53891: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204659.53937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204659.53980: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 44071 1727204659.54052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204659.56080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204659.56107: stdout chunk (state=3): >>><<< 44071 1727204659.56125: stderr chunk (state=3): >>><<< 44071 1727204659.56148: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204659.56163: handler run complete 44071 1727204659.56256: attempt loop complete, returning result 44071 1727204659.56267: _execute() done 44071 1727204659.56471: dumping result to json 44071 1727204659.56475: done dumping result, returning 44071 1727204659.56478: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-c964-7471-000000001286] 44071 1727204659.56480: sending task result for task 127b8e07-fff9-c964-7471-000000001286 44071 1727204659.56712: done sending task result for task 127b8e07-fff9-c964-7471-000000001286 44071 1727204659.56716: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204659.56890: no more pending results, returning what we have 44071 1727204659.56894: results queue empty 44071 1727204659.56895: checking for any_errors_fatal 44071 1727204659.56902: done checking for any_errors_fatal 44071 1727204659.56902: checking for max_fail_percentage 44071 1727204659.56904: done checking for max_fail_percentage 44071 1727204659.56905: checking to see if all hosts have failed and the running result is not ok 44071 1727204659.56906: done checking to see if all hosts have failed 44071 1727204659.56907: getting the remaining hosts for this loop 44071 1727204659.56908: done getting the remaining hosts for this loop 44071 1727204659.56913: getting the next task for host managed-node2 44071 1727204659.56929: done getting next task for host managed-node2 44071 1727204659.56936: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204659.56941: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204659.56955: getting variables 44071 1727204659.56957: in VariableManager get_vars() 44071 1727204659.56998: Calling all_inventory to load vars for managed-node2 44071 1727204659.57001: Calling groups_inventory to load vars for managed-node2 44071 1727204659.57044: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204659.57057: Calling all_plugins_play to load vars for managed-node2 44071 1727204659.57059: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204659.57062: Calling groups_plugins_play to load vars for managed-node2 44071 1727204659.58247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204659.64612: done with get_vars() 44071 1727204659.64645: done getting variables 44071 1727204659.64690: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:04:19 -0400 (0:00:00.686) 0:01:11.963 ***** 44071 1727204659.64720: entering _queue_task() for managed-node2/service 44071 1727204659.65026: worker is 1 (out of 1 available) 44071 1727204659.65043: exiting _queue_task() for managed-node2/service 44071 1727204659.65057: done queuing things up, now waiting for results queue to drain 44071 1727204659.65058: waiting for pending results... 
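The entries above show the "Enable and start NetworkManager" task completing: the service action plugin dispatched ansible.legacy.systemd with name=NetworkManager, state=started, enabled=True, the remote tmp directory was removed, and only "changed": false was reported because no_log was in effect. A minimal task sketch that would produce an equivalent invocation; the role's real task text in roles/network/tasks/main.yml is not reproduced in this log, so the exact YAML below is an assumption:

    - name: Enable and start NetworkManager
      ansible.builtin.service:   # the service action plugin selects the systemd backend, matching the ansible.legacy.systemd call logged above
        name: NetworkManager
        state: started
        enabled: true
      no_log: true               # explains the "censored" field in the task result above

With no_log set, the full systemd property dump stays only in the raw module stdout shown earlier; the reported result is reduced to the censored placeholder plus changed=false.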
44071 1727204659.65281: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204659.65420: in run() - task 127b8e07-fff9-c964-7471-000000001287 44071 1727204659.65433: variable 'ansible_search_path' from source: unknown 44071 1727204659.65440: variable 'ansible_search_path' from source: unknown 44071 1727204659.65476: calling self._execute() 44071 1727204659.65571: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204659.65577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204659.65586: variable 'omit' from source: magic vars 44071 1727204659.66196: variable 'ansible_distribution_major_version' from source: facts 44071 1727204659.66201: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204659.66204: variable 'network_provider' from source: set_fact 44071 1727204659.66207: Evaluated conditional (network_provider == "nm"): True 44071 1727204659.66312: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204659.66416: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204659.66612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204659.68455: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204659.68520: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204659.68551: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204659.68584: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204659.68605: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204659.68670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204659.68697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204659.68717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204659.68747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204659.68758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204659.68801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204659.68822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 44071 1727204659.68841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204659.68869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204659.68881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204659.68917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204659.68937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204659.68953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204659.68983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204659.68993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204659.69270: variable 'network_connections' from source: include params 44071 1727204659.69274: variable 'interface' from source: play vars 44071 1727204659.69277: variable 'interface' from source: play vars 44071 1727204659.69323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204659.69514: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204659.69576: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204659.69612: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204659.69651: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204659.69705: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204659.69728: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204659.69753: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204659.69781: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
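At this point the wpa_supplicant task has already passed the distribution check (ansible_distribution_major_version != '6') and the provider check (network_provider == "nm"), and the role defaults __network_wpa_supplicant_required and __network_ieee802_1x_connections_defined are being resolved; in the entries that follow, __network_wpa_supplicant_required evaluates to False and the task is skipped. A sketch of the kind of when-gating that would produce exactly these evaluations; the conditions listed are the ones the log reports, while the module arguments are an assumption:

    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant     # assumed from the task title; not confirmed by the log, since the task is skipped
        state: started
        enabled: true
      when:
        - ansible_distribution_major_version != '6'
        - network_provider == "nm"
        - __network_wpa_supplicant_required

Because the last condition is false, the executor never contacts the host; it reports the skip with false_condition set to __network_wpa_supplicant_required.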
44071 1727204659.69833: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204659.70114: variable 'network_connections' from source: include params 44071 1727204659.70124: variable 'interface' from source: play vars 44071 1727204659.70192: variable 'interface' from source: play vars 44071 1727204659.70226: Evaluated conditional (__network_wpa_supplicant_required): False 44071 1727204659.70233: when evaluation is False, skipping this task 44071 1727204659.70239: _execute() done 44071 1727204659.70246: dumping result to json 44071 1727204659.70252: done dumping result, returning 44071 1727204659.70263: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-c964-7471-000000001287] 44071 1727204659.70286: sending task result for task 127b8e07-fff9-c964-7471-000000001287 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 44071 1727204659.70472: no more pending results, returning what we have 44071 1727204659.70475: results queue empty 44071 1727204659.70476: checking for any_errors_fatal 44071 1727204659.70501: done checking for any_errors_fatal 44071 1727204659.70502: checking for max_fail_percentage 44071 1727204659.70503: done checking for max_fail_percentage 44071 1727204659.70504: checking to see if all hosts have failed and the running result is not ok 44071 1727204659.70505: done checking to see if all hosts have failed 44071 1727204659.70506: getting the remaining hosts for this loop 44071 1727204659.70507: done getting the remaining hosts for this loop 44071 1727204659.70516: getting the next task for host managed-node2 44071 1727204659.70529: done getting next task for host managed-node2 44071 1727204659.70536: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204659.70541: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204659.70569: getting variables 44071 1727204659.70571: in VariableManager get_vars() 44071 1727204659.70612: Calling all_inventory to load vars for managed-node2 44071 1727204659.70615: Calling groups_inventory to load vars for managed-node2 44071 1727204659.70617: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204659.70740: done sending task result for task 127b8e07-fff9-c964-7471-000000001287 44071 1727204659.70744: WORKER PROCESS EXITING 44071 1727204659.70754: Calling all_plugins_play to load vars for managed-node2 44071 1727204659.70757: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204659.70760: Calling groups_plugins_play to load vars for managed-node2 44071 1727204659.72500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204659.74737: done with get_vars() 44071 1727204659.74782: done getting variables 44071 1727204659.74850: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:04:19 -0400 (0:00:00.101) 0:01:12.065 ***** 44071 1727204659.74893: entering _queue_task() for managed-node2/service 44071 1727204659.75302: worker is 1 (out of 1 available) 44071 1727204659.75316: exiting _queue_task() for managed-node2/service 44071 1727204659.75329: done queuing things up, now waiting for results queue to drain 44071 1727204659.75331: waiting for pending results... 
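
The task skipped just above ("Enable and start wpa_supplicant") and the "Enable network service" task announced here are both gated by when: conditionals that evaluate to False on this run: __network_wpa_supplicant_required and network_provider == "initscripts" respectively. A minimal sketch of that guard pattern, assuming ansible.builtin.service and illustrative service names (this is not the role's actual tasks/main.yml):

    # Sketch only; it mirrors the conditionals reported in the log above.
    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant          # assumed service name
        state: started
        enabled: true
      when: __network_wpa_supplicant_required

    - name: Enable network service
      ansible.builtin.service:
        name: network                 # hypothetical service name for the initscripts provider
        enabled: true
      when: network_provider == "initscripts"

With network_provider resolved to "nm" via set_fact earlier in the run, both conditions come out False, which is why both tasks report "Conditional result was False".
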
44071 1727204659.75686: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204659.75884: in run() - task 127b8e07-fff9-c964-7471-000000001288 44071 1727204659.75915: variable 'ansible_search_path' from source: unknown 44071 1727204659.75925: variable 'ansible_search_path' from source: unknown 44071 1727204659.75977: calling self._execute() 44071 1727204659.76107: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204659.76122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204659.76144: variable 'omit' from source: magic vars 44071 1727204659.76623: variable 'ansible_distribution_major_version' from source: facts 44071 1727204659.76673: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204659.76818: variable 'network_provider' from source: set_fact 44071 1727204659.76830: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204659.76893: when evaluation is False, skipping this task 44071 1727204659.76897: _execute() done 44071 1727204659.76900: dumping result to json 44071 1727204659.76903: done dumping result, returning 44071 1727204659.76906: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-c964-7471-000000001288] 44071 1727204659.76908: sending task result for task 127b8e07-fff9-c964-7471-000000001288 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204659.77117: no more pending results, returning what we have 44071 1727204659.77122: results queue empty 44071 1727204659.77123: checking for any_errors_fatal 44071 1727204659.77130: done checking for any_errors_fatal 44071 1727204659.77131: checking for max_fail_percentage 44071 1727204659.77132: done checking for max_fail_percentage 44071 1727204659.77134: checking to see if all hosts have failed and the running result is not ok 44071 1727204659.77134: done checking to see if all hosts have failed 44071 1727204659.77135: getting the remaining hosts for this loop 44071 1727204659.77137: done getting the remaining hosts for this loop 44071 1727204659.77142: getting the next task for host managed-node2 44071 1727204659.77152: done getting next task for host managed-node2 44071 1727204659.77157: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204659.77163: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204659.77199: getting variables 44071 1727204659.77201: in VariableManager get_vars() 44071 1727204659.77252: Calling all_inventory to load vars for managed-node2 44071 1727204659.77256: Calling groups_inventory to load vars for managed-node2 44071 1727204659.77258: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204659.77477: Calling all_plugins_play to load vars for managed-node2 44071 1727204659.77482: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204659.77490: done sending task result for task 127b8e07-fff9-c964-7471-000000001288 44071 1727204659.77493: WORKER PROCESS EXITING 44071 1727204659.77498: Calling groups_plugins_play to load vars for managed-node2 44071 1727204659.79605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204659.81754: done with get_vars() 44071 1727204659.81800: done getting variables 44071 1727204659.81868: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:04:19 -0400 (0:00:00.070) 0:01:12.135 ***** 44071 1727204659.81907: entering _queue_task() for managed-node2/copy 44071 1727204659.82325: worker is 1 (out of 1 available) 44071 1727204659.82339: exiting _queue_task() for managed-node2/copy 44071 1727204659.82354: done queuing things up, now waiting for results queue to drain 44071 1727204659.82355: waiting for pending results... 
44071 1727204659.83037: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204659.83543: in run() - task 127b8e07-fff9-c964-7471-000000001289 44071 1727204659.83548: variable 'ansible_search_path' from source: unknown 44071 1727204659.83552: variable 'ansible_search_path' from source: unknown 44071 1727204659.83692: calling self._execute() 44071 1727204659.83933: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204659.83948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204659.83989: variable 'omit' from source: magic vars 44071 1727204659.84946: variable 'ansible_distribution_major_version' from source: facts 44071 1727204659.85086: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204659.85347: variable 'network_provider' from source: set_fact 44071 1727204659.85411: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204659.85614: when evaluation is False, skipping this task 44071 1727204659.85620: _execute() done 44071 1727204659.85623: dumping result to json 44071 1727204659.85626: done dumping result, returning 44071 1727204659.85629: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-c964-7471-000000001289] 44071 1727204659.85634: sending task result for task 127b8e07-fff9-c964-7471-000000001289 44071 1727204659.85857: done sending task result for task 127b8e07-fff9-c964-7471-000000001289 44071 1727204659.85861: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 44071 1727204659.85967: no more pending results, returning what we have 44071 1727204659.85974: results queue empty 44071 1727204659.85975: checking for any_errors_fatal 44071 1727204659.85987: done checking for any_errors_fatal 44071 1727204659.85988: checking for max_fail_percentage 44071 1727204659.85993: done checking for max_fail_percentage 44071 1727204659.85995: checking to see if all hosts have failed and the running result is not ok 44071 1727204659.85996: done checking to see if all hosts have failed 44071 1727204659.85997: getting the remaining hosts for this loop 44071 1727204659.85999: done getting the remaining hosts for this loop 44071 1727204659.86007: getting the next task for host managed-node2 44071 1727204659.86028: done getting next task for host managed-node2 44071 1727204659.86036: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204659.86043: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204659.86486: getting variables 44071 1727204659.86489: in VariableManager get_vars() 44071 1727204659.86550: Calling all_inventory to load vars for managed-node2 44071 1727204659.86553: Calling groups_inventory to load vars for managed-node2 44071 1727204659.86556: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204659.86570: Calling all_plugins_play to load vars for managed-node2 44071 1727204659.86574: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204659.86581: Calling groups_plugins_play to load vars for managed-node2 44071 1727204659.89394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204659.93047: done with get_vars() 44071 1727204659.93131: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:04:19 -0400 (0:00:00.113) 0:01:12.249 ***** 44071 1727204659.93299: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204659.94309: worker is 1 (out of 1 available) 44071 1727204659.94324: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204659.94339: done queuing things up, now waiting for results queue to drain 44071 1727204659.94340: waiting for pending results... 
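
The "Configure networking connection profiles" task queued here is the one that actually applies the profile; its network_connections input comes from include params, with the profile name taken from the interface play variable, and the module arguments printed further below show a single profile, statebr, brought up through the nm provider. A minimal play-level sketch that would drive the role this way (only statebr, state up, and the nm provider are taken from this log; everything else is illustrative):

    # Illustrative play; not the test playbook that produced this log.
    - hosts: managed-node2
      vars:
        interface: statebr
      tasks:
        - name: Bring the profile up via the network role
          ansible.builtin.include_role:
            name: fedora.linux_system_roles.network
          vars:
            network_provider: nm
            network_connections:
              - name: "{{ interface }}"
                state: up
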
44071 1727204659.94869: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204659.95034: in run() - task 127b8e07-fff9-c964-7471-00000000128a 44071 1727204659.95039: variable 'ansible_search_path' from source: unknown 44071 1727204659.95043: variable 'ansible_search_path' from source: unknown 44071 1727204659.95046: calling self._execute() 44071 1727204659.95421: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204659.95426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204659.95442: variable 'omit' from source: magic vars 44071 1727204659.96269: variable 'ansible_distribution_major_version' from source: facts 44071 1727204659.96782: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204659.96786: variable 'omit' from source: magic vars 44071 1727204659.96862: variable 'omit' from source: magic vars 44071 1727204659.97673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204660.01642: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204660.01723: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204660.01804: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204660.01808: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204660.01830: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204660.02130: variable 'network_provider' from source: set_fact 44071 1727204660.02974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204660.02979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204660.02981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204660.03003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204660.03121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204660.03215: variable 'omit' from source: magic vars 44071 1727204660.03393: variable 'omit' from source: magic vars 44071 1727204660.03671: variable 'network_connections' from source: include params 44071 1727204660.03679: variable 'interface' from source: play vars 44071 1727204660.03753: variable 'interface' from source: play vars 44071 1727204660.04344: variable 'omit' from source: magic vars 44071 1727204660.04354: variable '__lsr_ansible_managed' from source: task vars 44071 1727204660.04453: variable '__lsr_ansible_managed' from source: 
task vars 44071 1727204660.04881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 44071 1727204660.05345: Loaded config def from plugin (lookup/template) 44071 1727204660.05349: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 44071 1727204660.05585: File lookup term: get_ansible_managed.j2 44071 1727204660.05589: variable 'ansible_search_path' from source: unknown 44071 1727204660.05593: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 44071 1727204660.05609: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 44071 1727204660.05630: variable 'ansible_search_path' from source: unknown 44071 1727204660.17131: variable 'ansible_managed' from source: unknown 44071 1727204660.17302: variable 'omit' from source: magic vars 44071 1727204660.17318: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204660.17454: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204660.17458: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204660.17460: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204660.17462: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204660.17703: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204660.17706: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204660.17709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204660.18063: Set connection var ansible_connection to ssh 44071 1727204660.18068: Set connection var ansible_timeout to 10 44071 1727204660.18070: Set connection var ansible_pipelining to False 44071 1727204660.18072: Set connection var ansible_shell_type to sh 44071 1727204660.18074: Set connection var ansible_shell_executable to /bin/sh 44071 1727204660.18076: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204660.18108: variable 'ansible_shell_executable' from source: unknown 44071 1727204660.18111: variable 'ansible_connection' from source: unknown 44071 1727204660.18114: variable 'ansible_module_compression' 
from source: unknown 44071 1727204660.18116: variable 'ansible_shell_type' from source: unknown 44071 1727204660.18119: variable 'ansible_shell_executable' from source: unknown 44071 1727204660.18122: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204660.18124: variable 'ansible_pipelining' from source: unknown 44071 1727204660.18126: variable 'ansible_timeout' from source: unknown 44071 1727204660.18217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204660.18546: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204660.18558: variable 'omit' from source: magic vars 44071 1727204660.18561: starting attempt loop 44071 1727204660.18563: running the handler 44071 1727204660.18569: _low_level_execute_command(): starting 44071 1727204660.18571: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204660.19790: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204660.19823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204660.19919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204660.19922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204660.20096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204660.21864: stdout chunk (state=3): >>>/root <<< 44071 1727204660.22110: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204660.22123: stdout chunk (state=3): >>><<< 44071 1727204660.22136: stderr chunk (state=3): >>><<< 44071 1727204660.22240: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204660.22369: _low_level_execute_command(): starting 44071 1727204660.22377: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204660.2224119-47841-184767933911094 `" && echo ansible-tmp-1727204660.2224119-47841-184767933911094="` echo /root/.ansible/tmp/ansible-tmp-1727204660.2224119-47841-184767933911094 `" ) && sleep 0' 44071 1727204660.23646: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204660.23870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204660.23876: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204660.23879: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204660.24156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204660.26125: stdout chunk (state=3): >>>ansible-tmp-1727204660.2224119-47841-184767933911094=/root/.ansible/tmp/ansible-tmp-1727204660.2224119-47841-184767933911094 <<< 44071 1727204660.26274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204660.26494: stderr chunk (state=3): >>><<< 44071 1727204660.26498: stdout chunk (state=3): >>><<< 44071 1727204660.26712: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204660.2224119-47841-184767933911094=/root/.ansible/tmp/ansible-tmp-1727204660.2224119-47841-184767933911094 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204660.26716: variable 'ansible_module_compression' from source: unknown 44071 1727204660.26848: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 44071 1727204660.26953: variable 'ansible_facts' from source: unknown 44071 1727204660.27119: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204660.2224119-47841-184767933911094/AnsiballZ_network_connections.py 44071 1727204660.27468: Sending initial data 44071 1727204660.27585: Sent initial data (168 bytes) 44071 1727204660.28544: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204660.28564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204660.28789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204660.28899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204660.28903: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204660.28919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204660.29063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204660.30641: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 44071 1727204660.30739: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 
44071 1727204660.30782: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204660.30892: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpesgxiaz3 /root/.ansible/tmp/ansible-tmp-1727204660.2224119-47841-184767933911094/AnsiballZ_network_connections.py <<< 44071 1727204660.30908: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204660.2224119-47841-184767933911094/AnsiballZ_network_connections.py" <<< 44071 1727204660.30986: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpesgxiaz3" to remote "/root/.ansible/tmp/ansible-tmp-1727204660.2224119-47841-184767933911094/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204660.2224119-47841-184767933911094/AnsiballZ_network_connections.py" <<< 44071 1727204660.33730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204660.33974: stderr chunk (state=3): >>><<< 44071 1727204660.33978: stdout chunk (state=3): >>><<< 44071 1727204660.34176: done transferring module to remote 44071 1727204660.34179: _low_level_execute_command(): starting 44071 1727204660.34182: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204660.2224119-47841-184767933911094/ /root/.ansible/tmp/ansible-tmp-1727204660.2224119-47841-184767933911094/AnsiballZ_network_connections.py && sleep 0' 44071 1727204660.35146: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204660.35199: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204660.35214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204660.35282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204660.35339: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204660.35360: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204660.35378: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204660.35485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204660.37674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204660.37678: stdout chunk (state=3): >>><<< 44071 1727204660.37680: stderr chunk (state=3): >>><<< 44071 1727204660.37683: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204660.37685: _low_level_execute_command(): starting 44071 1727204660.37687: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204660.2224119-47841-184767933911094/AnsiballZ_network_connections.py && sleep 0' 44071 1727204660.38701: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204660.38846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204660.38859: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204660.38904: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204660.39070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204660.66324: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, beccd2e1-72f3-4d73-aac6-77978c2859f8 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 44071 1727204660.68301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared 
connection to 10.31.47.73 closed. <<< 44071 1727204660.68305: stdout chunk (state=3): >>><<< 44071 1727204660.68307: stderr chunk (state=3): >>><<< 44071 1727204660.68461: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, beccd2e1-72f3-4d73-aac6-77978c2859f8 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
44071 1727204660.68470: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204660.2224119-47841-184767933911094/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204660.68473: _low_level_execute_command(): starting 44071 1727204660.68475: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204660.2224119-47841-184767933911094/ > /dev/null 2>&1 && sleep 0' 44071 1727204660.69088: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204660.69142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204660.69160: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204660.69190: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204660.69285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204660.71334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204660.71375: stdout chunk (state=3): >>><<< 44071 1727204660.71379: stderr chunk (state=3): >>><<< 44071 1727204660.71521: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204660.71525: handler run complete 44071 1727204660.71528: attempt loop complete, returning result 44071 1727204660.71530: _execute() done 44071 1727204660.71535: dumping result to json 44071 1727204660.71538: done dumping result, returning 44071 1727204660.71540: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-c964-7471-00000000128a] 44071 1727204660.71542: sending task result for task 127b8e07-fff9-c964-7471-00000000128a 44071 1727204660.71628: done sending task result for task 127b8e07-fff9-c964-7471-00000000128a 44071 1727204660.71634: WORKER PROCESS EXITING ok: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, beccd2e1-72f3-4d73-aac6-77978c2859f8 skipped because already active 44071 1727204660.71761: no more pending results, returning what we have 44071 1727204660.71768: results queue empty 44071 1727204660.71769: checking for any_errors_fatal 44071 1727204660.71776: done checking for any_errors_fatal 44071 1727204660.71778: checking for max_fail_percentage 44071 1727204660.71780: done checking for max_fail_percentage 44071 1727204660.71781: checking to see if all hosts have failed and the running result is not ok 44071 1727204660.71782: done checking to see if all hosts have failed 44071 1727204660.71788: getting the remaining hosts for this loop 44071 1727204660.71790: done getting the remaining hosts for this loop 44071 1727204660.71795: getting the next task for host managed-node2 44071 1727204660.71804: done getting next task for host managed-node2 44071 1727204660.71809: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204660.71814: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204660.71828: getting variables 44071 1727204660.71830: in VariableManager get_vars() 44071 1727204660.72080: Calling all_inventory to load vars for managed-node2 44071 1727204660.72084: Calling groups_inventory to load vars for managed-node2 44071 1727204660.72115: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204660.72128: Calling all_plugins_play to load vars for managed-node2 44071 1727204660.72134: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204660.72138: Calling groups_plugins_play to load vars for managed-node2 44071 1727204660.74360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204660.75605: done with get_vars() 44071 1727204660.75640: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:04:20 -0400 (0:00:00.824) 0:01:13.073 ***** 44071 1727204660.75714: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204660.76159: worker is 1 (out of 1 available) 44071 1727204660.76178: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204660.76198: done queuing things up, now waiting for results queue to drain 44071 1727204660.76200: waiting for pending results... 44071 1727204660.76614: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204660.76709: in run() - task 127b8e07-fff9-c964-7471-00000000128b 44071 1727204660.76713: variable 'ansible_search_path' from source: unknown 44071 1727204660.76716: variable 'ansible_search_path' from source: unknown 44071 1727204660.76740: calling self._execute() 44071 1727204660.76923: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204660.76946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204660.76950: variable 'omit' from source: magic vars 44071 1727204660.77343: variable 'ansible_distribution_major_version' from source: facts 44071 1727204660.77354: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204660.77450: variable 'network_state' from source: role '' defaults 44071 1727204660.77462: Evaluated conditional (network_state != {}): False 44071 1727204660.77466: when evaluation is False, skipping this task 44071 1727204660.77471: _execute() done 44071 1727204660.77473: dumping result to json 44071 1727204660.77476: done dumping result, returning 44071 1727204660.77482: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-c964-7471-00000000128b] 44071 1727204660.77488: sending task result for task 127b8e07-fff9-c964-7471-00000000128b 44071 1727204660.77589: done sending task result for task 127b8e07-fff9-c964-7471-00000000128b 44071 1727204660.77592: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204660.77649: no more pending results, returning what we have 44071 1727204660.77653: results queue empty 44071 1727204660.77654: checking for any_errors_fatal 
44071 1727204660.77664: done checking for any_errors_fatal 44071 1727204660.77666: checking for max_fail_percentage 44071 1727204660.77668: done checking for max_fail_percentage 44071 1727204660.77670: checking to see if all hosts have failed and the running result is not ok 44071 1727204660.77670: done checking to see if all hosts have failed 44071 1727204660.77671: getting the remaining hosts for this loop 44071 1727204660.77673: done getting the remaining hosts for this loop 44071 1727204660.77678: getting the next task for host managed-node2 44071 1727204660.77686: done getting next task for host managed-node2 44071 1727204660.77691: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204660.77696: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204660.77722: getting variables 44071 1727204660.77724: in VariableManager get_vars() 44071 1727204660.77764: Calling all_inventory to load vars for managed-node2 44071 1727204660.77774: Calling groups_inventory to load vars for managed-node2 44071 1727204660.77777: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204660.77789: Calling all_plugins_play to load vars for managed-node2 44071 1727204660.77791: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204660.77794: Calling groups_plugins_play to load vars for managed-node2 44071 1727204660.78956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204660.80854: done with get_vars() 44071 1727204660.80899: done getting variables 44071 1727204660.80974: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:04:20 -0400 (0:00:00.052) 0:01:13.126 ***** 44071 1727204660.81009: entering _queue_task() for managed-node2/debug 44071 1727204660.81337: worker is 1 (out of 1 available) 44071 1727204660.81351: exiting _queue_task() for managed-node2/debug 44071 1727204660.81368: done queuing things up, now waiting for results queue to drain 44071 1727204660.81370: waiting for pending results... 44071 1727204660.81584: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204660.81712: in run() - task 127b8e07-fff9-c964-7471-00000000128c 44071 1727204660.81723: variable 'ansible_search_path' from source: unknown 44071 1727204660.81727: variable 'ansible_search_path' from source: unknown 44071 1727204660.81764: calling self._execute() 44071 1727204660.81850: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204660.81855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204660.81864: variable 'omit' from source: magic vars 44071 1727204660.82199: variable 'ansible_distribution_major_version' from source: facts 44071 1727204660.82210: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204660.82217: variable 'omit' from source: magic vars 44071 1727204660.82275: variable 'omit' from source: magic vars 44071 1727204660.82302: variable 'omit' from source: magic vars 44071 1727204660.82340: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204660.82376: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204660.82393: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204660.82409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204660.82420: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204660.82447: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204660.82451: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204660.82454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204660.82530: Set connection var ansible_connection to ssh 44071 1727204660.82538: Set connection var ansible_timeout to 10 44071 1727204660.82544: Set connection var ansible_pipelining to False 44071 1727204660.82549: Set connection var ansible_shell_type to sh 44071 1727204660.82555: Set connection var ansible_shell_executable to /bin/sh 44071 1727204660.82564: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204660.82587: variable 'ansible_shell_executable' from source: unknown 44071 1727204660.82591: variable 'ansible_connection' from source: unknown 44071 1727204660.82594: variable 'ansible_module_compression' from source: unknown 44071 1727204660.82597: variable 'ansible_shell_type' from source: unknown 44071 1727204660.82599: variable 'ansible_shell_executable' from source: unknown 44071 1727204660.82601: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204660.82604: variable 'ansible_pipelining' from source: unknown 44071 1727204660.82606: variable 'ansible_timeout' from source: unknown 44071 1727204660.82612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204660.82730: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204660.82743: variable 'omit' from source: magic vars 44071 1727204660.82747: starting attempt loop 44071 1727204660.82750: running the handler 44071 1727204660.82892: variable '__network_connections_result' from source: set_fact 44071 1727204660.82971: handler run complete 44071 1727204660.82975: attempt loop complete, returning result 44071 1727204660.82978: _execute() done 44071 1727204660.82981: dumping result to json 44071 1727204660.82983: done dumping result, returning 44071 1727204660.82996: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-c964-7471-00000000128c] 44071 1727204660.83004: sending task result for task 127b8e07-fff9-c964-7471-00000000128c 44071 1727204660.83114: done sending task result for task 127b8e07-fff9-c964-7471-00000000128c 44071 1727204660.83117: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, beccd2e1-72f3-4d73-aac6-77978c2859f8 skipped because already active" ] } 44071 1727204660.83212: no more pending results, returning what we have 44071 1727204660.83218: results queue empty 44071 1727204660.83218: checking for any_errors_fatal 44071 1727204660.83227: done checking for any_errors_fatal 44071 1727204660.83227: checking for max_fail_percentage 44071 1727204660.83229: done checking for max_fail_percentage 44071 1727204660.83230: checking to see if all hosts have failed and the running result is not ok 44071 1727204660.83230: done checking to see if all hosts have failed 44071 1727204660.83233: getting the remaining hosts for this loop 44071 1727204660.83235: done getting the remaining hosts for this loop 
44071 1727204660.83240: getting the next task for host managed-node2 44071 1727204660.83248: done getting next task for host managed-node2 44071 1727204660.83253: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204660.83259: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204660.83274: getting variables 44071 1727204660.83276: in VariableManager get_vars() 44071 1727204660.83319: Calling all_inventory to load vars for managed-node2 44071 1727204660.83322: Calling groups_inventory to load vars for managed-node2 44071 1727204660.83325: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204660.83339: Calling all_plugins_play to load vars for managed-node2 44071 1727204660.83342: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204660.83344: Calling groups_plugins_play to load vars for managed-node2 44071 1727204660.85315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204660.86553: done with get_vars() 44071 1727204660.86581: done getting variables 44071 1727204660.86630: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:04:20 -0400 (0:00:00.056) 0:01:13.183 ***** 44071 1727204660.86669: entering _queue_task() for managed-node2/debug 44071 1727204660.86962: worker is 1 (out of 1 available) 44071 1727204660.86979: exiting _queue_task() for managed-node2/debug 44071 1727204660.86993: done queuing things up, now waiting for results queue to drain 44071 1727204660.86995: waiting for pending results... 
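The task just queued from tasks/main.yml:181 dumps the full registered result rather than only its stderr lines. A plausible sketch, again inferred from the task name and the variable printed in the result that follows:

    - name: Show debug messages for the network_connections
      debug:
        var: __network_connections_result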
44071 1727204660.87206: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204660.87312: in run() - task 127b8e07-fff9-c964-7471-00000000128d 44071 1727204660.87327: variable 'ansible_search_path' from source: unknown 44071 1727204660.87330: variable 'ansible_search_path' from source: unknown 44071 1727204660.87390: calling self._execute() 44071 1727204660.87573: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204660.87579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204660.87582: variable 'omit' from source: magic vars 44071 1727204660.88041: variable 'ansible_distribution_major_version' from source: facts 44071 1727204660.88063: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204660.88078: variable 'omit' from source: magic vars 44071 1727204660.88180: variable 'omit' from source: magic vars 44071 1727204660.88224: variable 'omit' from source: magic vars 44071 1727204660.88292: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204660.88357: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204660.88379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204660.88474: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204660.88477: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204660.88479: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204660.88482: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204660.88484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204660.88613: Set connection var ansible_connection to ssh 44071 1727204660.88616: Set connection var ansible_timeout to 10 44071 1727204660.88618: Set connection var ansible_pipelining to False 44071 1727204660.88620: Set connection var ansible_shell_type to sh 44071 1727204660.88622: Set connection var ansible_shell_executable to /bin/sh 44071 1727204660.88624: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204660.88646: variable 'ansible_shell_executable' from source: unknown 44071 1727204660.88655: variable 'ansible_connection' from source: unknown 44071 1727204660.88658: variable 'ansible_module_compression' from source: unknown 44071 1727204660.88660: variable 'ansible_shell_type' from source: unknown 44071 1727204660.88667: variable 'ansible_shell_executable' from source: unknown 44071 1727204660.88669: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204660.88674: variable 'ansible_pipelining' from source: unknown 44071 1727204660.88676: variable 'ansible_timeout' from source: unknown 44071 1727204660.88681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204660.88801: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 
1727204660.88812: variable 'omit' from source: magic vars 44071 1727204660.88818: starting attempt loop 44071 1727204660.88822: running the handler 44071 1727204660.88865: variable '__network_connections_result' from source: set_fact 44071 1727204660.88938: variable '__network_connections_result' from source: set_fact 44071 1727204660.89029: handler run complete 44071 1727204660.89049: attempt loop complete, returning result 44071 1727204660.89052: _execute() done 44071 1727204660.89055: dumping result to json 44071 1727204660.89060: done dumping result, returning 44071 1727204660.89069: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-c964-7471-00000000128d] 44071 1727204660.89074: sending task result for task 127b8e07-fff9-c964-7471-00000000128d 44071 1727204660.89177: done sending task result for task 127b8e07-fff9-c964-7471-00000000128d 44071 1727204660.89181: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, beccd2e1-72f3-4d73-aac6-77978c2859f8 skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, beccd2e1-72f3-4d73-aac6-77978c2859f8 skipped because already active" ] } } 44071 1727204660.89303: no more pending results, returning what we have 44071 1727204660.89307: results queue empty 44071 1727204660.89307: checking for any_errors_fatal 44071 1727204660.89313: done checking for any_errors_fatal 44071 1727204660.89313: checking for max_fail_percentage 44071 1727204660.89315: done checking for max_fail_percentage 44071 1727204660.89316: checking to see if all hosts have failed and the running result is not ok 44071 1727204660.89316: done checking to see if all hosts have failed 44071 1727204660.89317: getting the remaining hosts for this loop 44071 1727204660.89319: done getting the remaining hosts for this loop 44071 1727204660.89322: getting the next task for host managed-node2 44071 1727204660.89329: done getting next task for host managed-node2 44071 1727204660.89335: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204660.89340: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204660.89353: getting variables 44071 1727204660.89354: in VariableManager get_vars() 44071 1727204660.89395: Calling all_inventory to load vars for managed-node2 44071 1727204660.89398: Calling groups_inventory to load vars for managed-node2 44071 1727204660.89406: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204660.89415: Calling all_plugins_play to load vars for managed-node2 44071 1727204660.89418: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204660.89420: Calling groups_plugins_play to load vars for managed-node2 44071 1727204660.90472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204660.91716: done with get_vars() 44071 1727204660.91750: done getting variables 44071 1727204660.91803: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:04:20 -0400 (0:00:00.051) 0:01:13.234 ***** 44071 1727204660.91834: entering _queue_task() for managed-node2/debug 44071 1727204660.92137: worker is 1 (out of 1 available) 44071 1727204660.92153: exiting _queue_task() for managed-node2/debug 44071 1727204660.92169: done queuing things up, now waiting for results queue to drain 44071 1727204660.92171: waiting for pending results... 
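The task queued from tasks/main.yml:186 is gated on the role's network_state variable; the trace below shows it being skipped because network_state is still the role default of {}. A hedged sketch of the shape of that task (the conditional matches the false_condition in the skip result; the variable it would print is assumed here):

    - name: Show debug messages for the network_state
      debug:
        var: __network_state_result   # assumed variable name, for illustration only
      when: network_state != {}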
44071 1727204660.92375: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204660.92497: in run() - task 127b8e07-fff9-c964-7471-00000000128e 44071 1727204660.92514: variable 'ansible_search_path' from source: unknown 44071 1727204660.92517: variable 'ansible_search_path' from source: unknown 44071 1727204660.92555: calling self._execute() 44071 1727204660.92647: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204660.92651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204660.92661: variable 'omit' from source: magic vars 44071 1727204660.92999: variable 'ansible_distribution_major_version' from source: facts 44071 1727204660.93009: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204660.93113: variable 'network_state' from source: role '' defaults 44071 1727204660.93123: Evaluated conditional (network_state != {}): False 44071 1727204660.93127: when evaluation is False, skipping this task 44071 1727204660.93130: _execute() done 44071 1727204660.93133: dumping result to json 44071 1727204660.93139: done dumping result, returning 44071 1727204660.93146: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-c964-7471-00000000128e] 44071 1727204660.93153: sending task result for task 127b8e07-fff9-c964-7471-00000000128e 44071 1727204660.93253: done sending task result for task 127b8e07-fff9-c964-7471-00000000128e 44071 1727204660.93257: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 44071 1727204660.93315: no more pending results, returning what we have 44071 1727204660.93320: results queue empty 44071 1727204660.93321: checking for any_errors_fatal 44071 1727204660.93329: done checking for any_errors_fatal 44071 1727204660.93330: checking for max_fail_percentage 44071 1727204660.93331: done checking for max_fail_percentage 44071 1727204660.93332: checking to see if all hosts have failed and the running result is not ok 44071 1727204660.93333: done checking to see if all hosts have failed 44071 1727204660.93334: getting the remaining hosts for this loop 44071 1727204660.93336: done getting the remaining hosts for this loop 44071 1727204660.93340: getting the next task for host managed-node2 44071 1727204660.93349: done getting next task for host managed-node2 44071 1727204660.93353: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204660.93359: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204660.93388: getting variables 44071 1727204660.93390: in VariableManager get_vars() 44071 1727204660.93428: Calling all_inventory to load vars for managed-node2 44071 1727204660.93431: Calling groups_inventory to load vars for managed-node2 44071 1727204660.93433: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204660.93446: Calling all_plugins_play to load vars for managed-node2 44071 1727204660.93449: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204660.93452: Calling groups_plugins_play to load vars for managed-node2 44071 1727204660.94653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204660.95856: done with get_vars() 44071 1727204660.95892: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:04:20 -0400 (0:00:00.041) 0:01:13.276 ***** 44071 1727204660.95977: entering _queue_task() for managed-node2/ping 44071 1727204660.96280: worker is 1 (out of 1 available) 44071 1727204660.96295: exiting _queue_task() for managed-node2/ping 44071 1727204660.96310: done queuing things up, now waiting for results queue to drain 44071 1727204660.96311: waiting for pending results... 44071 1727204660.96524: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204660.96642: in run() - task 127b8e07-fff9-c964-7471-00000000128f 44071 1727204660.96658: variable 'ansible_search_path' from source: unknown 44071 1727204660.96661: variable 'ansible_search_path' from source: unknown 44071 1727204660.96694: calling self._execute() 44071 1727204660.96780: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204660.96786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204660.96795: variable 'omit' from source: magic vars 44071 1727204660.97116: variable 'ansible_distribution_major_version' from source: facts 44071 1727204660.97126: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204660.97133: variable 'omit' from source: magic vars 44071 1727204660.97187: variable 'omit' from source: magic vars 44071 1727204660.97216: variable 'omit' from source: magic vars 44071 1727204660.97255: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204660.97287: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204660.97308: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204660.97325: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204660.97338: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204660.97364: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204660.97369: variable 'ansible_host' from source: host vars for 
'managed-node2' 44071 1727204660.97372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204660.97451: Set connection var ansible_connection to ssh 44071 1727204660.97455: Set connection var ansible_timeout to 10 44071 1727204660.97461: Set connection var ansible_pipelining to False 44071 1727204660.97468: Set connection var ansible_shell_type to sh 44071 1727204660.97474: Set connection var ansible_shell_executable to /bin/sh 44071 1727204660.97481: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204660.97500: variable 'ansible_shell_executable' from source: unknown 44071 1727204660.97503: variable 'ansible_connection' from source: unknown 44071 1727204660.97506: variable 'ansible_module_compression' from source: unknown 44071 1727204660.97509: variable 'ansible_shell_type' from source: unknown 44071 1727204660.97511: variable 'ansible_shell_executable' from source: unknown 44071 1727204660.97516: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204660.97518: variable 'ansible_pipelining' from source: unknown 44071 1727204660.97520: variable 'ansible_timeout' from source: unknown 44071 1727204660.97527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204660.97703: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204660.97713: variable 'omit' from source: magic vars 44071 1727204660.97717: starting attempt loop 44071 1727204660.97720: running the handler 44071 1727204660.97733: _low_level_execute_command(): starting 44071 1727204660.97744: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204660.98308: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204660.98312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204660.98316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204660.98375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204660.98378: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204660.98380: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204660.98460: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204661.00235: stdout chunk (state=3): >>>/root <<< 44071 1727204661.00342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 44071 1727204661.00406: stderr chunk (state=3): >>><<< 44071 1727204661.00410: stdout chunk (state=3): >>><<< 44071 1727204661.00432: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204661.00449: _low_level_execute_command(): starting 44071 1727204661.00456: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204661.0043504-47931-221087765760375 `" && echo ansible-tmp-1727204661.0043504-47931-221087765760375="` echo /root/.ansible/tmp/ansible-tmp-1727204661.0043504-47931-221087765760375 `" ) && sleep 0' 44071 1727204661.00944: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204661.00976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204661.00979: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204661.00990: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204661.00992: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204661.01049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204661.01053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204661.01057: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204661.01119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204661.03116: stdout chunk (state=3): 
>>>ansible-tmp-1727204661.0043504-47931-221087765760375=/root/.ansible/tmp/ansible-tmp-1727204661.0043504-47931-221087765760375 <<< 44071 1727204661.03238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204661.03299: stderr chunk (state=3): >>><<< 44071 1727204661.03302: stdout chunk (state=3): >>><<< 44071 1727204661.03321: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204661.0043504-47931-221087765760375=/root/.ansible/tmp/ansible-tmp-1727204661.0043504-47931-221087765760375 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204661.03372: variable 'ansible_module_compression' from source: unknown 44071 1727204661.03409: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 44071 1727204661.03446: variable 'ansible_facts' from source: unknown 44071 1727204661.03502: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204661.0043504-47931-221087765760375/AnsiballZ_ping.py 44071 1727204661.03623: Sending initial data 44071 1727204661.03627: Sent initial data (153 bytes) 44071 1727204661.04136: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204661.04140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204661.04143: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204661.04145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204661.04147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204661.04206: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' <<< 44071 1727204661.04210: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204661.04213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204661.04299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204661.05913: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204661.05975: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204661.06047: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpfzcjkno5 /root/.ansible/tmp/ansible-tmp-1727204661.0043504-47931-221087765760375/AnsiballZ_ping.py <<< 44071 1727204661.06053: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204661.0043504-47931-221087765760375/AnsiballZ_ping.py" <<< 44071 1727204661.06118: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpfzcjkno5" to remote "/root/.ansible/tmp/ansible-tmp-1727204661.0043504-47931-221087765760375/AnsiballZ_ping.py" <<< 44071 1727204661.06122: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204661.0043504-47931-221087765760375/AnsiballZ_ping.py" <<< 44071 1727204661.06784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204661.06861: stderr chunk (state=3): >>><<< 44071 1727204661.06864: stdout chunk (state=3): >>><<< 44071 1727204661.06889: done transferring module to remote 44071 1727204661.06900: _low_level_execute_command(): starting 44071 1727204661.06903: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204661.0043504-47931-221087765760375/ /root/.ansible/tmp/ansible-tmp-1727204661.0043504-47931-221087765760375/AnsiballZ_ping.py && sleep 0' 44071 1727204661.07372: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204661.07403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204661.07407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204661.07409: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204661.07412: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204661.07418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204661.07472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204661.07476: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204661.07486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204661.07553: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204661.09398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204661.09453: stderr chunk (state=3): >>><<< 44071 1727204661.09457: stdout chunk (state=3): >>><<< 44071 1727204661.09472: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204661.09475: _low_level_execute_command(): starting 44071 1727204661.09481: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204661.0043504-47931-221087765760375/AnsiballZ_ping.py && sleep 0' 44071 1727204661.09954: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204661.09959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204661.09987: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204661.09991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204661.10047: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204661.10051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204661.10136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204661.27183: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 44071 1727204661.28623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204661.28628: stdout chunk (state=3): >>><<< 44071 1727204661.28630: stderr chunk (state=3): >>><<< 44071 1727204661.28657: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
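The "Re-test connectivity" step is the role's final sanity check: it runs the stock ping module over the existing SSH multiplexed connection, which is why the low-level trace above shows the temporary directory creation, the AnsiballZ_ping.py transfer and execution, and (just below) the cleanup of the temp directory. The task itself is most likely nothing more than the following sketch, based on the module name seen in _execute_module (the role may use the FQCN ansible.builtin.ping):

    - name: Re-test connectivity
      ping: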
44071 1727204661.28730: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204661.0043504-47931-221087765760375/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204661.28737: _low_level_execute_command(): starting 44071 1727204661.28740: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204661.0043504-47931-221087765760375/ > /dev/null 2>&1 && sleep 0' 44071 1727204661.29720: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204661.29746: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204661.29762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204661.29784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204661.29803: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204661.29868: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204661.29927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204661.29949: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204661.29987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204661.30093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204661.32168: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204661.32176: stdout chunk (state=3): >>><<< 44071 1727204661.32179: stderr chunk (state=3): >>><<< 44071 1727204661.32198: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204661.32208: handler run complete 44071 1727204661.32373: attempt loop complete, returning result 44071 1727204661.32377: _execute() done 44071 1727204661.32380: dumping result to json 44071 1727204661.32382: done dumping result, returning 44071 1727204661.32384: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-c964-7471-00000000128f] 44071 1727204661.32386: sending task result for task 127b8e07-fff9-c964-7471-00000000128f 44071 1727204661.32479: done sending task result for task 127b8e07-fff9-c964-7471-00000000128f 44071 1727204661.32482: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 44071 1727204661.32584: no more pending results, returning what we have 44071 1727204661.32590: results queue empty 44071 1727204661.32591: checking for any_errors_fatal 44071 1727204661.32599: done checking for any_errors_fatal 44071 1727204661.32600: checking for max_fail_percentage 44071 1727204661.32602: done checking for max_fail_percentage 44071 1727204661.32603: checking to see if all hosts have failed and the running result is not ok 44071 1727204661.32604: done checking to see if all hosts have failed 44071 1727204661.32604: getting the remaining hosts for this loop 44071 1727204661.32606: done getting the remaining hosts for this loop 44071 1727204661.32612: getting the next task for host managed-node2 44071 1727204661.32624: done getting next task for host managed-node2 44071 1727204661.32627: ^ task is: TASK: meta (role_complete) 44071 1727204661.32637: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204661.32654: getting variables 44071 1727204661.32656: in VariableManager get_vars() 44071 1727204661.32820: Calling all_inventory to load vars for managed-node2 44071 1727204661.32823: Calling groups_inventory to load vars for managed-node2 44071 1727204661.32826: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204661.32842: Calling all_plugins_play to load vars for managed-node2 44071 1727204661.32845: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204661.32849: Calling groups_plugins_play to load vars for managed-node2 44071 1727204661.37255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204661.40228: done with get_vars() 44071 1727204661.40377: done getting variables 44071 1727204661.40581: done queuing things up, now waiting for results queue to drain 44071 1727204661.40583: results queue empty 44071 1727204661.40584: checking for any_errors_fatal 44071 1727204661.40588: done checking for any_errors_fatal 44071 1727204661.40589: checking for max_fail_percentage 44071 1727204661.40590: done checking for max_fail_percentage 44071 1727204661.40591: checking to see if all hosts have failed and the running result is not ok 44071 1727204661.40592: done checking to see if all hosts have failed 44071 1727204661.40593: getting the remaining hosts for this loop 44071 1727204661.40594: done getting the remaining hosts for this loop 44071 1727204661.40597: getting the next task for host managed-node2 44071 1727204661.40603: done getting next task for host managed-node2 44071 1727204661.40606: ^ task is: TASK: Test 44071 1727204661.40609: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204661.40612: getting variables 44071 1727204661.40613: in VariableManager get_vars() 44071 1727204661.40627: Calling all_inventory to load vars for managed-node2 44071 1727204661.40629: Calling groups_inventory to load vars for managed-node2 44071 1727204661.40639: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204661.40646: Calling all_plugins_play to load vars for managed-node2 44071 1727204661.40648: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204661.40651: Calling groups_plugins_play to load vars for managed-node2 44071 1727204661.43287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204661.46158: done with get_vars() 44071 1727204661.46211: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Tuesday 24 September 2024 15:04:21 -0400 (0:00:00.504) 0:01:13.781 ***** 44071 1727204661.46461: entering _queue_task() for managed-node2/include_tasks 44071 1727204661.47471: worker is 1 (out of 1 available) 44071 1727204661.47548: exiting _queue_task() for managed-node2/include_tasks 44071 1727204661.47564: done queuing things up, now waiting for results queue to drain 44071 1727204661.47568: waiting for pending results... 44071 1727204661.48421: running TaskExecutor() for managed-node2/TASK: Test 44071 1727204661.48453: in run() - task 127b8e07-fff9-c964-7471-000000001009 44071 1727204661.48469: variable 'ansible_search_path' from source: unknown 44071 1727204661.48474: variable 'ansible_search_path' from source: unknown 44071 1727204661.48527: variable 'lsr_test' from source: include params 44071 1727204661.48769: variable 'lsr_test' from source: include params 44071 1727204661.48844: variable 'omit' from source: magic vars 44071 1727204661.49040: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204661.49057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204661.49070: variable 'omit' from source: magic vars 44071 1727204661.49640: variable 'ansible_distribution_major_version' from source: facts 44071 1727204661.49645: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204661.49699: variable 'item' from source: unknown 44071 1727204661.49734: variable 'item' from source: unknown 44071 1727204661.49791: variable 'item' from source: unknown 44071 1727204661.49858: variable 'item' from source: unknown 44071 1727204661.50331: dumping result to json 44071 1727204661.50334: done dumping result, returning 44071 1727204661.50337: done running TaskExecutor() for managed-node2/TASK: Test [127b8e07-fff9-c964-7471-000000001009] 44071 1727204661.50339: sending task result for task 127b8e07-fff9-c964-7471-000000001009 44071 1727204661.50417: no more pending results, returning what we have 44071 1727204661.50422: in VariableManager get_vars() 44071 1727204661.50459: Calling all_inventory to load vars for managed-node2 44071 1727204661.50467: Calling groups_inventory to load vars for managed-node2 44071 1727204661.50471: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204661.50485: Calling all_plugins_play to load vars for managed-node2 44071 1727204661.50488: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204661.50491: Calling groups_plugins_play to 
load vars for managed-node2 44071 1727204661.51235: done sending task result for task 127b8e07-fff9-c964-7471-000000001009 44071 1727204661.51240: WORKER PROCESS EXITING 44071 1727204661.55749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204661.58457: done with get_vars() 44071 1727204661.58499: variable 'ansible_search_path' from source: unknown 44071 1727204661.58500: variable 'ansible_search_path' from source: unknown 44071 1727204661.58553: we have included files to process 44071 1727204661.58554: generating all_blocks data 44071 1727204661.58557: done generating all_blocks data 44071 1727204661.58563: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 44071 1727204661.58567: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 44071 1727204661.58570: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 44071 1727204661.58787: done processing included file 44071 1727204661.58790: iterating over new_blocks loaded from include file 44071 1727204661.58792: in VariableManager get_vars() 44071 1727204661.58811: done with get_vars() 44071 1727204661.58812: filtering new block on tags 44071 1727204661.58849: done filtering new block on tags 44071 1727204661.58853: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml for managed-node2 => (item=tasks/remove_profile.yml) 44071 1727204661.58858: extending task lists for all hosts with included blocks 44071 1727204661.59979: done extending task lists 44071 1727204661.59982: done processing included files 44071 1727204661.59982: results queue empty 44071 1727204661.59983: checking for any_errors_fatal 44071 1727204661.59985: done checking for any_errors_fatal 44071 1727204661.59986: checking for max_fail_percentage 44071 1727204661.59987: done checking for max_fail_percentage 44071 1727204661.59988: checking to see if all hosts have failed and the running result is not ok 44071 1727204661.59989: done checking to see if all hosts have failed 44071 1727204661.59990: getting the remaining hosts for this loop 44071 1727204661.59991: done getting the remaining hosts for this loop 44071 1727204661.59994: getting the next task for host managed-node2 44071 1727204661.59999: done getting next task for host managed-node2 44071 1727204661.60001: ^ task is: TASK: Include network role 44071 1727204661.60005: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 44071 1727204661.60007: getting variables 44071 1727204661.60013: in VariableManager get_vars() 44071 1727204661.60029: Calling all_inventory to load vars for managed-node2 44071 1727204661.60035: Calling groups_inventory to load vars for managed-node2 44071 1727204661.60038: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204661.60045: Calling all_plugins_play to load vars for managed-node2 44071 1727204661.60048: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204661.60051: Calling groups_plugins_play to load vars for managed-node2 44071 1727204661.61409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204661.63355: done with get_vars() 44071 1727204661.63908: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml:3 Tuesday 24 September 2024 15:04:21 -0400 (0:00:00.175) 0:01:13.956 ***** 44071 1727204661.64034: entering _queue_task() for managed-node2/include_role 44071 1727204661.65046: worker is 1 (out of 1 available) 44071 1727204661.65061: exiting _queue_task() for managed-node2/include_role 44071 1727204661.65078: done queuing things up, now waiting for results queue to drain 44071 1727204661.65080: waiting for pending results... 44071 1727204661.65488: running TaskExecutor() for managed-node2/TASK: Include network role 44071 1727204661.65755: in run() - task 127b8e07-fff9-c964-7471-0000000013e8 44071 1727204661.65797: variable 'ansible_search_path' from source: unknown 44071 1727204661.65807: variable 'ansible_search_path' from source: unknown 44071 1727204661.65861: calling self._execute() 44071 1727204661.66040: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204661.66056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204661.66100: variable 'omit' from source: magic vars 44071 1727204661.67262: variable 'ansible_distribution_major_version' from source: facts 44071 1727204661.67277: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204661.67370: _execute() done 44071 1727204661.67375: dumping result to json 44071 1727204661.67378: done dumping result, returning 44071 1727204661.67381: done running TaskExecutor() for managed-node2/TASK: Include network role [127b8e07-fff9-c964-7471-0000000013e8] 44071 1727204661.67383: sending task result for task 127b8e07-fff9-c964-7471-0000000013e8 44071 1727204661.67842: no more pending results, returning what we have 44071 1727204661.67849: in VariableManager get_vars() 44071 1727204661.67903: Calling all_inventory to load vars for managed-node2 44071 1727204661.68026: Calling groups_inventory to load vars for managed-node2 44071 1727204661.68034: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204661.68151: Calling all_plugins_play to load vars for managed-node2 44071 1727204661.68156: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204661.68161: Calling groups_plugins_play to load vars for managed-node2 44071 1727204661.68977: done sending task result for task 127b8e07-fff9-c964-7471-0000000013e8 44071 1727204661.68981: WORKER PROCESS EXITING 44071 1727204661.70799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 44071 1727204661.72199: done with get_vars() 44071 1727204661.72238: variable 'ansible_search_path' from source: unknown 44071 1727204661.72239: variable 'ansible_search_path' from source: unknown 44071 1727204661.72717: variable 'omit' from source: magic vars 44071 1727204661.72771: variable 'omit' from source: magic vars 44071 1727204661.72789: variable 'omit' from source: magic vars 44071 1727204661.72793: we have included files to process 44071 1727204661.72794: generating all_blocks data 44071 1727204661.72796: done generating all_blocks data 44071 1727204661.72797: processing included file: fedora.linux_system_roles.network 44071 1727204661.72821: in VariableManager get_vars() 44071 1727204661.72843: done with get_vars() 44071 1727204661.72876: in VariableManager get_vars() 44071 1727204661.72896: done with get_vars() 44071 1727204661.72942: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 44071 1727204661.73298: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 44071 1727204661.73396: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 44071 1727204661.74109: in VariableManager get_vars() 44071 1727204661.74136: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204661.76092: iterating over new_blocks loaded from include file 44071 1727204661.76094: in VariableManager get_vars() 44071 1727204661.76111: done with get_vars() 44071 1727204661.76112: filtering new block on tags 44071 1727204661.76319: done filtering new block on tags 44071 1727204661.76322: in VariableManager get_vars() 44071 1727204661.76336: done with get_vars() 44071 1727204661.76337: filtering new block on tags 44071 1727204661.76350: done filtering new block on tags 44071 1727204661.76351: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 44071 1727204661.76355: extending task lists for all hosts with included blocks 44071 1727204661.76436: done extending task lists 44071 1727204661.76437: done processing included files 44071 1727204661.76437: results queue empty 44071 1727204661.76438: checking for any_errors_fatal 44071 1727204661.76441: done checking for any_errors_fatal 44071 1727204661.76442: checking for max_fail_percentage 44071 1727204661.76442: done checking for max_fail_percentage 44071 1727204661.76443: checking to see if all hosts have failed and the running result is not ok 44071 1727204661.76444: done checking to see if all hosts have failed 44071 1727204661.76444: getting the remaining hosts for this loop 44071 1727204661.76445: done getting the remaining hosts for this loop 44071 1727204661.76447: getting the next task for host managed-node2 44071 1727204661.76450: done getting next task for host managed-node2 44071 1727204661.76452: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204661.76455: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204661.76464: getting variables 44071 1727204661.76464: in VariableManager get_vars() 44071 1727204661.76476: Calling all_inventory to load vars for managed-node2 44071 1727204661.76478: Calling groups_inventory to load vars for managed-node2 44071 1727204661.76479: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204661.76484: Calling all_plugins_play to load vars for managed-node2 44071 1727204661.76486: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204661.76487: Calling groups_plugins_play to load vars for managed-node2 44071 1727204661.78314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204661.80827: done with get_vars() 44071 1727204661.80878: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:04:21 -0400 (0:00:00.170) 0:01:14.126 ***** 44071 1727204661.81044: entering _queue_task() for managed-node2/include_tasks 44071 1727204661.81605: worker is 1 (out of 1 available) 44071 1727204661.81622: exiting _queue_task() for managed-node2/include_tasks 44071 1727204661.81638: done queuing things up, now waiting for results queue to drain 44071 1727204661.81640: waiting for pending results... 
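The entries above show remove_profile.yml:3 pulling in the fedora.linux_system_roles.network role once the ansible_distribution_major_version != '6' conditional evaluates to True; the role's defaults/main.yml, meta/main.yml and tasks/main.yml are then loaded and the new blocks are filtered on tags. A minimal sketch of what such an include task could look like, reconstructed only from the task name, path and conditional visible in the log (the actual file contents are not reproduced here, and the conditional may be inherited from an enclosing block rather than written on the task itself):

  # Hypothetical reconstruction based on the log, not the actual file contents.
  - name: Include network role
    include_role:
      name: fedora.linux_system_roles.network
    when: ansible_distribution_major_version != '6'
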
44071 1727204661.81995: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204661.82216: in run() - task 127b8e07-fff9-c964-7471-00000000145f 44071 1727204661.82244: variable 'ansible_search_path' from source: unknown 44071 1727204661.82251: variable 'ansible_search_path' from source: unknown 44071 1727204661.82296: calling self._execute() 44071 1727204661.82481: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204661.82486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204661.82488: variable 'omit' from source: magic vars 44071 1727204661.83044: variable 'ansible_distribution_major_version' from source: facts 44071 1727204661.83094: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204661.83098: _execute() done 44071 1727204661.83101: dumping result to json 44071 1727204661.83104: done dumping result, returning 44071 1727204661.83107: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-c964-7471-00000000145f] 44071 1727204661.83109: sending task result for task 127b8e07-fff9-c964-7471-00000000145f 44071 1727204661.83299: done sending task result for task 127b8e07-fff9-c964-7471-00000000145f 44071 1727204661.83303: WORKER PROCESS EXITING 44071 1727204661.83394: no more pending results, returning what we have 44071 1727204661.83400: in VariableManager get_vars() 44071 1727204661.83459: Calling all_inventory to load vars for managed-node2 44071 1727204661.83463: Calling groups_inventory to load vars for managed-node2 44071 1727204661.83467: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204661.83486: Calling all_plugins_play to load vars for managed-node2 44071 1727204661.83490: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204661.83496: Calling groups_plugins_play to load vars for managed-node2 44071 1727204661.86944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204661.89656: done with get_vars() 44071 1727204661.89701: variable 'ansible_search_path' from source: unknown 44071 1727204661.89703: variable 'ansible_search_path' from source: unknown 44071 1727204661.89756: we have included files to process 44071 1727204661.89757: generating all_blocks data 44071 1727204661.89759: done generating all_blocks data 44071 1727204661.89763: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204661.89767: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204661.89770: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204661.91169: done processing included file 44071 1727204661.91172: iterating over new_blocks loaded from include file 44071 1727204661.91174: in VariableManager get_vars() 44071 1727204661.91217: done with get_vars() 44071 1727204661.91220: filtering new block on tags 44071 1727204661.91295: done filtering new block on tags 44071 1727204661.91299: in VariableManager get_vars() 44071 1727204661.91393: done with get_vars() 44071 1727204661.91395: filtering new block on tags 44071 1727204661.91570: done filtering new block on tags 44071 1727204661.91575: in 
VariableManager get_vars() 44071 1727204661.91610: done with get_vars() 44071 1727204661.91612: filtering new block on tags 44071 1727204661.91699: done filtering new block on tags 44071 1727204661.91702: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 44071 1727204661.91708: extending task lists for all hosts with included blocks 44071 1727204661.94072: done extending task lists 44071 1727204661.94073: done processing included files 44071 1727204661.94074: results queue empty 44071 1727204661.94074: checking for any_errors_fatal 44071 1727204661.94078: done checking for any_errors_fatal 44071 1727204661.94078: checking for max_fail_percentage 44071 1727204661.94079: done checking for max_fail_percentage 44071 1727204661.94080: checking to see if all hosts have failed and the running result is not ok 44071 1727204661.94080: done checking to see if all hosts have failed 44071 1727204661.94081: getting the remaining hosts for this loop 44071 1727204661.94082: done getting the remaining hosts for this loop 44071 1727204661.94084: getting the next task for host managed-node2 44071 1727204661.94088: done getting next task for host managed-node2 44071 1727204661.94090: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204661.94093: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204661.94103: getting variables 44071 1727204661.94103: in VariableManager get_vars() 44071 1727204661.94117: Calling all_inventory to load vars for managed-node2 44071 1727204661.94119: Calling groups_inventory to load vars for managed-node2 44071 1727204661.94120: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204661.94125: Calling all_plugins_play to load vars for managed-node2 44071 1727204661.94126: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204661.94128: Calling groups_plugins_play to load vars for managed-node2 44071 1727204662.01425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204662.03497: done with get_vars() 44071 1727204662.03539: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:04:22 -0400 (0:00:00.225) 0:01:14.352 ***** 44071 1727204662.03643: entering _queue_task() for managed-node2/setup 44071 1727204662.04290: worker is 1 (out of 1 available) 44071 1727204662.04302: exiting _queue_task() for managed-node2/setup 44071 1727204662.04315: done queuing things up, now waiting for results queue to drain 44071 1727204662.04317: waiting for pending results... 44071 1727204662.04948: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204662.04955: in run() - task 127b8e07-fff9-c964-7471-0000000014b6 44071 1727204662.04960: variable 'ansible_search_path' from source: unknown 44071 1727204662.04963: variable 'ansible_search_path' from source: unknown 44071 1727204662.05016: calling self._execute() 44071 1727204662.05021: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204662.05024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204662.05027: variable 'omit' from source: magic vars 44071 1727204662.05384: variable 'ansible_distribution_major_version' from source: facts 44071 1727204662.05388: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204662.05613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204662.08292: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204662.08297: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204662.08300: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204662.08303: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204662.08318: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204662.08408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204662.08438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 44071 1727204662.08470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204662.08516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204662.08529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204662.08727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204662.08734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204662.08737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204662.08740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204662.08742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204662.08873: variable '__network_required_facts' from source: role '' defaults 44071 1727204662.08890: variable 'ansible_facts' from source: unknown 44071 1727204662.09961: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 44071 1727204662.09967: when evaluation is False, skipping this task 44071 1727204662.09971: _execute() done 44071 1727204662.09974: dumping result to json 44071 1727204662.09978: done dumping result, returning 44071 1727204662.09986: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-c964-7471-0000000014b6] 44071 1727204662.10006: sending task result for task 127b8e07-fff9-c964-7471-0000000014b6 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204662.10351: no more pending results, returning what we have 44071 1727204662.10355: results queue empty 44071 1727204662.10356: checking for any_errors_fatal 44071 1727204662.10360: done checking for any_errors_fatal 44071 1727204662.10361: checking for max_fail_percentage 44071 1727204662.10362: done checking for max_fail_percentage 44071 1727204662.10363: checking to see if all hosts have failed and the running result is not ok 44071 1727204662.10364: done checking to see if all hosts have failed 44071 1727204662.10365: getting the remaining hosts for this loop 44071 1727204662.10368: done getting the remaining hosts for this loop 44071 1727204662.10373: getting the next task for host managed-node2 44071 1727204662.10388: done getting next task for host 
managed-node2 44071 1727204662.10393: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204662.10400: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204662.10412: done sending task result for task 127b8e07-fff9-c964-7471-0000000014b6 44071 1727204662.10417: WORKER PROCESS EXITING 44071 1727204662.10483: getting variables 44071 1727204662.10485: in VariableManager get_vars() 44071 1727204662.10525: Calling all_inventory to load vars for managed-node2 44071 1727204662.10529: Calling groups_inventory to load vars for managed-node2 44071 1727204662.10531: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204662.10541: Calling all_plugins_play to load vars for managed-node2 44071 1727204662.10544: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204662.10554: Calling groups_plugins_play to load vars for managed-node2 44071 1727204662.12483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204662.14767: done with get_vars() 44071 1727204662.14805: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:04:22 -0400 (0:00:00.112) 0:01:14.465 ***** 44071 1727204662.14933: entering _queue_task() for managed-node2/stat 44071 1727204662.15568: worker is 1 (out of 1 available) 44071 1727204662.15582: exiting _queue_task() for managed-node2/stat 44071 1727204662.15595: done queuing things up, now waiting for results queue to drain 44071 1727204662.15596: waiting for pending results... 
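The "Ensure ansible_facts used by role are present" task above is skipped because __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 evaluates to False, i.e. every fact the role needs is already present in the cached facts, and its result is censored because no_log is in effect. A sketch of what such a guarded fact-gathering task plausibly looks like, assuming a plain setup call; only the task name, conditional and no_log behaviour are taken from the log:

  # Hypothetical sketch; gather_subset is an assumption, not taken from the log.
  - name: Ensure ansible_facts used by role are present
    setup:
      gather_subset: min   # assumption: the real task may request a different subset
    when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
    no_log: true
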
44071 1727204662.15824: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204662.16116: in run() - task 127b8e07-fff9-c964-7471-0000000014b8 44071 1727204662.16121: variable 'ansible_search_path' from source: unknown 44071 1727204662.16124: variable 'ansible_search_path' from source: unknown 44071 1727204662.16132: calling self._execute() 44071 1727204662.16186: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204662.16190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204662.16194: variable 'omit' from source: magic vars 44071 1727204662.16671: variable 'ansible_distribution_major_version' from source: facts 44071 1727204662.16691: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204662.16911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204662.17306: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204662.17310: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204662.17887: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204662.17942: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204662.18057: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204662.18089: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204662.18116: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204662.18149: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204662.18271: variable '__network_is_ostree' from source: set_fact 44071 1727204662.18286: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204662.18290: when evaluation is False, skipping this task 44071 1727204662.18293: _execute() done 44071 1727204662.18300: dumping result to json 44071 1727204662.18303: done dumping result, returning 44071 1727204662.18307: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-c964-7471-0000000014b8] 44071 1727204662.18309: sending task result for task 127b8e07-fff9-c964-7471-0000000014b8 44071 1727204662.18656: done sending task result for task 127b8e07-fff9-c964-7471-0000000014b8 44071 1727204662.18664: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204662.18726: no more pending results, returning what we have 44071 1727204662.18730: results queue empty 44071 1727204662.18731: checking for any_errors_fatal 44071 1727204662.18739: done checking for any_errors_fatal 44071 1727204662.18739: checking for 
max_fail_percentage 44071 1727204662.18741: done checking for max_fail_percentage 44071 1727204662.18742: checking to see if all hosts have failed and the running result is not ok 44071 1727204662.18743: done checking to see if all hosts have failed 44071 1727204662.18744: getting the remaining hosts for this loop 44071 1727204662.18745: done getting the remaining hosts for this loop 44071 1727204662.18749: getting the next task for host managed-node2 44071 1727204662.18757: done getting next task for host managed-node2 44071 1727204662.18762: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204662.18771: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204662.18793: getting variables 44071 1727204662.18795: in VariableManager get_vars() 44071 1727204662.18842: Calling all_inventory to load vars for managed-node2 44071 1727204662.18849: Calling groups_inventory to load vars for managed-node2 44071 1727204662.18852: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204662.18864: Calling all_plugins_play to load vars for managed-node2 44071 1727204662.18944: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204662.18949: Calling groups_plugins_play to load vars for managed-node2 44071 1727204662.21136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204662.23391: done with get_vars() 44071 1727204662.23441: done getting variables 44071 1727204662.23502: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:04:22 -0400 (0:00:00.086) 0:01:14.551 ***** 44071 1727204662.23548: entering _queue_task() for managed-node2/set_fact 44071 1727204662.23855: worker is 1 (out of 1 available) 44071 1727204662.23870: exiting _queue_task() for managed-node2/set_fact 44071 1727204662.23885: done queuing things up, now waiting for results queue to drain 44071 1727204662.23887: waiting for pending results... 44071 1727204662.24101: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204662.24240: in run() - task 127b8e07-fff9-c964-7471-0000000014b9 44071 1727204662.24253: variable 'ansible_search_path' from source: unknown 44071 1727204662.24257: variable 'ansible_search_path' from source: unknown 44071 1727204662.24294: calling self._execute() 44071 1727204662.24383: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204662.24389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204662.24401: variable 'omit' from source: magic vars 44071 1727204662.24735: variable 'ansible_distribution_major_version' from source: facts 44071 1727204662.24749: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204662.24891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204662.25116: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204662.25155: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204662.25225: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204662.25258: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204662.25333: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204662.25355: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204662.25409: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204662.25420: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204662.25599: variable '__network_is_ostree' from source: set_fact 44071 1727204662.25603: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204662.25606: when evaluation is False, skipping this task 44071 1727204662.25609: _execute() done 44071 1727204662.25611: dumping result to json 44071 1727204662.25615: done dumping result, returning 44071 1727204662.25618: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-c964-7471-0000000014b9] 44071 1727204662.25621: sending task result for task 127b8e07-fff9-c964-7471-0000000014b9 44071 1727204662.25705: done sending task result for task 127b8e07-fff9-c964-7471-0000000014b9 44071 1727204662.25708: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204662.25775: no more pending results, returning what we have 44071 1727204662.25779: results queue empty 44071 1727204662.25780: checking for any_errors_fatal 44071 1727204662.25785: done checking for any_errors_fatal 44071 1727204662.25786: checking for max_fail_percentage 44071 1727204662.25787: done checking for max_fail_percentage 44071 1727204662.25788: checking to see if all hosts have failed and the running result is not ok 44071 1727204662.25789: done checking to see if all hosts have failed 44071 1727204662.25790: getting the remaining hosts for this loop 44071 1727204662.25792: done getting the remaining hosts for this loop 44071 1727204662.25796: getting the next task for host managed-node2 44071 1727204662.25806: done getting next task for host managed-node2 44071 1727204662.25810: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204662.25816: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204662.25841: getting variables 44071 1727204662.25843: in VariableManager get_vars() 44071 1727204662.25883: Calling all_inventory to load vars for managed-node2 44071 1727204662.25886: Calling groups_inventory to load vars for managed-node2 44071 1727204662.25888: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204662.25897: Calling all_plugins_play to load vars for managed-node2 44071 1727204662.25900: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204662.25902: Calling groups_plugins_play to load vars for managed-node2 44071 1727204662.28176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204662.31400: done with get_vars() 44071 1727204662.31457: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:04:22 -0400 (0:00:00.080) 0:01:14.632 ***** 44071 1727204662.31591: entering _queue_task() for managed-node2/service_facts 44071 1727204662.32325: worker is 1 (out of 1 available) 44071 1727204662.32339: exiting _queue_task() for managed-node2/service_facts 44071 1727204662.32356: done queuing things up, now waiting for results queue to drain 44071 1727204662.32358: waiting for pending results... 
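Both ostree-related tasks above (set_facts.yml:12 and set_facts.yml:17) are skipped because not __network_is_ostree is defined evaluates to False: the flag was already set earlier in the run. A sketch of such a stat-plus-set_fact pair, where only the task names, the stat/set_fact actions and the conditional come from the log; the marker path and the registered variable name are assumptions:

  # Hypothetical sketch of the ostree detection pair; the path and the
  # registered variable name (other than __network_is_ostree) are assumptions.
  - name: Check if system is ostree
    stat:
      path: /run/ostree-booted   # assumption
    register: __ostree_booted_stat
    when: not __network_is_ostree is defined

  - name: Set flag to indicate system is ostree
    set_fact:
      __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
    when: not __network_is_ostree is defined
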
44071 1727204662.32543: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204662.32703: in run() - task 127b8e07-fff9-c964-7471-0000000014bb 44071 1727204662.32716: variable 'ansible_search_path' from source: unknown 44071 1727204662.32719: variable 'ansible_search_path' from source: unknown 44071 1727204662.32757: calling self._execute() 44071 1727204662.32845: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204662.32849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204662.32860: variable 'omit' from source: magic vars 44071 1727204662.33191: variable 'ansible_distribution_major_version' from source: facts 44071 1727204662.33204: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204662.33211: variable 'omit' from source: magic vars 44071 1727204662.33275: variable 'omit' from source: magic vars 44071 1727204662.33304: variable 'omit' from source: magic vars 44071 1727204662.33341: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204662.33372: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204662.33391: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204662.33409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204662.33420: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204662.33445: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204662.33449: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204662.33454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204662.33535: Set connection var ansible_connection to ssh 44071 1727204662.33539: Set connection var ansible_timeout to 10 44071 1727204662.33544: Set connection var ansible_pipelining to False 44071 1727204662.33585: Set connection var ansible_shell_type to sh 44071 1727204662.33589: Set connection var ansible_shell_executable to /bin/sh 44071 1727204662.33591: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204662.33596: variable 'ansible_shell_executable' from source: unknown 44071 1727204662.33599: variable 'ansible_connection' from source: unknown 44071 1727204662.33602: variable 'ansible_module_compression' from source: unknown 44071 1727204662.33605: variable 'ansible_shell_type' from source: unknown 44071 1727204662.33608: variable 'ansible_shell_executable' from source: unknown 44071 1727204662.33611: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204662.33616: variable 'ansible_pipelining' from source: unknown 44071 1727204662.33618: variable 'ansible_timeout' from source: unknown 44071 1727204662.33627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204662.33951: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204662.33956: variable 'omit' from source: magic vars 44071 
1727204662.33959: starting attempt loop 44071 1727204662.33961: running the handler 44071 1727204662.33964: _low_level_execute_command(): starting 44071 1727204662.33968: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204662.34722: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204662.34798: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204662.34872: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204662.34982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204662.36719: stdout chunk (state=3): >>>/root <<< 44071 1727204662.36828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204662.36897: stderr chunk (state=3): >>><<< 44071 1727204662.36901: stdout chunk (state=3): >>><<< 44071 1727204662.36917: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204662.36931: _low_level_execute_command(): starting 44071 1727204662.36946: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204662.3691864-48115-271896802023835 `" && echo ansible-tmp-1727204662.3691864-48115-271896802023835="` echo /root/.ansible/tmp/ansible-tmp-1727204662.3691864-48115-271896802023835 `" ) && sleep 0' 
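Unlike the skipped checks, the "Check which services are running" task actually executes: the connection vars above select the persistent ssh connection with /bin/sh and ZIP_DEFLATED module compression, and the two _low_level_execute_command() calls discover the remote home directory and create a per-task temporary directory before the module is shipped over. A sketch of the task being run, assuming nothing beyond the module name visible in the log (service_facts, per the queue entry above and the ANSIBALLZ cache entry further down):

  # Hypothetical sketch; only the task name and the service_facts module are taken from the log.
  - name: Check which services are running
    service_facts:

The services dictionary returned in the stdout chunk below (note NetworkManager.service reported as running) is presumably what the role's later provider checks consume.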
44071 1727204662.37454: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204662.37458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204662.37461: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204662.37476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204662.37517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204662.37521: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204662.37528: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204662.37604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204662.39594: stdout chunk (state=3): >>>ansible-tmp-1727204662.3691864-48115-271896802023835=/root/.ansible/tmp/ansible-tmp-1727204662.3691864-48115-271896802023835 <<< 44071 1727204662.39700: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204662.39768: stderr chunk (state=3): >>><<< 44071 1727204662.39772: stdout chunk (state=3): >>><<< 44071 1727204662.39790: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204662.3691864-48115-271896802023835=/root/.ansible/tmp/ansible-tmp-1727204662.3691864-48115-271896802023835 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204662.39837: variable 'ansible_module_compression' from source: unknown 44071 1727204662.39889: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 44071 1727204662.39922: variable 'ansible_facts' from source: unknown 44071 1727204662.39983: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204662.3691864-48115-271896802023835/AnsiballZ_service_facts.py 44071 1727204662.40102: Sending initial data 44071 1727204662.40106: Sent initial data (162 bytes) 44071 1727204662.40619: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204662.40624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204662.40626: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204662.40629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204662.40684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204662.40688: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204662.40764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204662.42375: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204662.42475: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204662.42550: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpw_sjyqs9 /root/.ansible/tmp/ansible-tmp-1727204662.3691864-48115-271896802023835/AnsiballZ_service_facts.py <<< 44071 1727204662.42554: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204662.3691864-48115-271896802023835/AnsiballZ_service_facts.py" <<< 44071 1727204662.42618: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpw_sjyqs9" to remote "/root/.ansible/tmp/ansible-tmp-1727204662.3691864-48115-271896802023835/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204662.3691864-48115-271896802023835/AnsiballZ_service_facts.py" <<< 44071 1727204662.43305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204662.43379: stderr chunk (state=3): >>><<< 44071 1727204662.43383: stdout chunk (state=3): >>><<< 44071 1727204662.43405: done transferring module to remote 44071 1727204662.43415: _low_level_execute_command(): starting 44071 1727204662.43420: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204662.3691864-48115-271896802023835/ /root/.ansible/tmp/ansible-tmp-1727204662.3691864-48115-271896802023835/AnsiballZ_service_facts.py && sleep 0' 44071 1727204662.43924: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204662.43929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204662.43932: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204662.43938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204662.43990: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204662.44000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204662.44002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204662.44075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204662.45899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204662.45956: stderr chunk (state=3): >>><<< 44071 1727204662.45960: stdout chunk (state=3): >>><<< 44071 1727204662.45975: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204662.45978: _low_level_execute_command(): starting 44071 1727204662.45984: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204662.3691864-48115-271896802023835/AnsiballZ_service_facts.py && sleep 0' 44071 1727204662.46878: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204662.46995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204662.47038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204662.47042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204662.47045: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204662.47157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204664.73758: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": 
{"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": 
"systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev<<< 44071 1727204664.73782: stdout chunk (state=3): >>>-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": 
{"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": 
"systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 44071 1727204664.75393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204664.75428: stderr chunk (state=3): >>><<< 44071 1727204664.75431: stdout chunk (state=3): >>><<< 44071 1727204664.75476: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": 
"disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", 
"state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, 
"systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": 
"dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": 
"systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
44071 1727204664.76072: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204662.3691864-48115-271896802023835/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204664.76079: _low_level_execute_command(): starting 44071 1727204664.76087: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204662.3691864-48115-271896802023835/ > /dev/null 2>&1 && sleep 0' 44071 1727204664.76638: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204664.76642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204664.76739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204664.76742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204664.76815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204664.78871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204664.78876: stdout chunk (state=3): >>><<< 44071 1727204664.78878: stderr chunk (state=3): >>><<< 44071 1727204664.78905: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204664.78912: handler run complete 44071 1727204664.79194: variable 'ansible_facts' from source: unknown 44071 1727204664.79519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204664.80448: variable 'ansible_facts' from source: unknown 44071 1727204664.80967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204664.81812: attempt loop complete, returning result 44071 1727204664.81832: _execute() done 44071 1727204664.81841: dumping result to json 44071 1727204664.81970: done dumping result, returning 44071 1727204664.82072: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-c964-7471-0000000014bb] 44071 1727204664.82179: sending task result for task 127b8e07-fff9-c964-7471-0000000014bb 44071 1727204664.85246: done sending task result for task 127b8e07-fff9-c964-7471-0000000014bb 44071 1727204664.85250: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204664.85383: no more pending results, returning what we have 44071 1727204664.85386: results queue empty 44071 1727204664.85387: checking for any_errors_fatal 44071 1727204664.85393: done checking for any_errors_fatal 44071 1727204664.85394: checking for max_fail_percentage 44071 1727204664.85395: done checking for max_fail_percentage 44071 1727204664.85396: checking to see if all hosts have failed and the running result is not ok 44071 1727204664.85397: done checking to see if all hosts have failed 44071 1727204664.85398: getting the remaining hosts for this loop 44071 1727204664.85399: done getting the remaining hosts for this loop 44071 1727204664.85403: getting the next task for host managed-node2 44071 1727204664.85410: done getting next task for host managed-node2 44071 1727204664.85414: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204664.85421: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204664.85434: getting variables 44071 1727204664.85436: in VariableManager get_vars() 44071 1727204664.85879: Calling all_inventory to load vars for managed-node2 44071 1727204664.85884: Calling groups_inventory to load vars for managed-node2 44071 1727204664.85886: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204664.85898: Calling all_plugins_play to load vars for managed-node2 44071 1727204664.85902: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204664.85905: Calling groups_plugins_play to load vars for managed-node2 44071 1727204664.90402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204664.95214: done with get_vars() 44071 1727204664.95262: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:04:24 -0400 (0:00:02.641) 0:01:17.273 ***** 44071 1727204664.95706: entering _queue_task() for managed-node2/package_facts 44071 1727204664.96513: worker is 1 (out of 1 available) 44071 1727204664.96528: exiting _queue_task() for managed-node2/package_facts 44071 1727204664.96542: done queuing things up, now waiting for results queue to drain 44071 1727204664.96544: waiting for pending results... 44071 1727204664.97263: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204664.97468: in run() - task 127b8e07-fff9-c964-7471-0000000014bc 44071 1727204664.97497: variable 'ansible_search_path' from source: unknown 44071 1727204664.97506: variable 'ansible_search_path' from source: unknown 44071 1727204664.97555: calling self._execute() 44071 1727204664.97878: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204664.97961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204664.98049: variable 'omit' from source: magic vars 44071 1727204664.99813: variable 'ansible_distribution_major_version' from source: facts 44071 1727204664.99818: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204664.99821: variable 'omit' from source: magic vars 44071 1727204665.00052: variable 'omit' from source: magic vars 44071 1727204665.00372: variable 'omit' from source: magic vars 44071 1727204665.00522: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204665.00692: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204665.00799: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204665.00919: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204665.01035: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204665.01226: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204665.01544: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204665.01548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204665.01714: Set connection var ansible_connection to ssh 44071 1727204665.01740: Set connection var ansible_timeout to 10 44071 1727204665.01842: Set connection var ansible_pipelining to False 44071 1727204665.01855: Set connection var ansible_shell_type to sh 44071 1727204665.01907: Set connection var ansible_shell_executable to /bin/sh 44071 1727204665.01952: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204665.02060: variable 'ansible_shell_executable' from source: unknown 44071 1727204665.02104: variable 'ansible_connection' from source: unknown 44071 1727204665.02220: variable 'ansible_module_compression' from source: unknown 44071 1727204665.02224: variable 'ansible_shell_type' from source: unknown 44071 1727204665.02226: variable 'ansible_shell_executable' from source: unknown 44071 1727204665.02229: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204665.02234: variable 'ansible_pipelining' from source: unknown 44071 1727204665.02237: variable 'ansible_timeout' from source: unknown 44071 1727204665.02239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204665.02913: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204665.02943: variable 'omit' from source: magic vars 44071 1727204665.02956: starting attempt loop 44071 1727204665.02983: running the handler 44071 1727204665.03148: _low_level_execute_command(): starting 44071 1727204665.03152: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204665.04793: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204665.05224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204665.05317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204665.07108: stdout chunk (state=3): >>>/root <<< 44071 1727204665.07492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204665.07525: stdout chunk (state=3): >>><<< 44071 1727204665.07534: 
stderr chunk (state=3): >>><<< 44071 1727204665.07538: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204665.07546: _low_level_execute_command(): starting 44071 1727204665.07550: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204665.073512-48374-270678178582591 `" && echo ansible-tmp-1727204665.073512-48374-270678178582591="` echo /root/.ansible/tmp/ansible-tmp-1727204665.073512-48374-270678178582591 `" ) && sleep 0' 44071 1727204665.09680: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204665.09770: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204665.09843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204665.09986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204665.10005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204665.10112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204665.12256: stdout chunk (state=3): >>>ansible-tmp-1727204665.073512-48374-270678178582591=/root/.ansible/tmp/ansible-tmp-1727204665.073512-48374-270678178582591 <<< 44071 1727204665.12416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204665.12612: stderr chunk (state=3): >>><<< 44071 1727204665.12626: stdout chunk (state=3): 
>>><<< 44071 1727204665.12653: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204665.073512-48374-270678178582591=/root/.ansible/tmp/ansible-tmp-1727204665.073512-48374-270678178582591 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204665.12975: variable 'ansible_module_compression' from source: unknown 44071 1727204665.12980: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 44071 1727204665.13204: variable 'ansible_facts' from source: unknown 44071 1727204665.13652: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204665.073512-48374-270678178582591/AnsiballZ_package_facts.py 44071 1727204665.14312: Sending initial data 44071 1727204665.14316: Sent initial data (161 bytes) 44071 1727204665.15747: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204665.15840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204665.15918: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204665.15983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204665.16039: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204665.16146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204665.16206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204665.17996: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports 
extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204665.18129: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204665.18176: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpxo04cszw /root/.ansible/tmp/ansible-tmp-1727204665.073512-48374-270678178582591/AnsiballZ_package_facts.py <<< 44071 1727204665.18180: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204665.073512-48374-270678178582591/AnsiballZ_package_facts.py" <<< 44071 1727204665.18256: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpxo04cszw" to remote "/root/.ansible/tmp/ansible-tmp-1727204665.073512-48374-270678178582591/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204665.073512-48374-270678178582591/AnsiballZ_package_facts.py" <<< 44071 1727204665.21790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204665.22060: stderr chunk (state=3): >>><<< 44071 1727204665.22067: stdout chunk (state=3): >>><<< 44071 1727204665.22070: done transferring module to remote 44071 1727204665.22076: _low_level_execute_command(): starting 44071 1727204665.22078: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204665.073512-48374-270678178582591/ /root/.ansible/tmp/ansible-tmp-1727204665.073512-48374-270678178582591/AnsiballZ_package_facts.py && sleep 0' 44071 1727204665.23811: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204665.24008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204665.24014: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204665.24196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204665.24211: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 44071 1727204665.24315: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204665.26386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204665.26390: stdout chunk (state=3): >>><<< 44071 1727204665.26393: stderr chunk (state=3): >>><<< 44071 1727204665.26413: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204665.26438: _low_level_execute_command(): starting 44071 1727204665.26481: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204665.073512-48374-270678178582591/AnsiballZ_package_facts.py && sleep 0' 44071 1727204665.28295: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204665.28494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204665.28588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204665.28701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204665.92377: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"na<<< 44071 1727204665.92398: stdout chunk (state=3): >>>me": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", 
"epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40",<<< 44071 1727204665.92474: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", 
"release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", 
"version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "li<<< 44071 1727204665.92487: stdout chunk (state=3): >>>breport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": 
[{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": 
"41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", 
"release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarc<<< 44071 1727204665.92580: stdout chunk (state=3): >>>h", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": 
"2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoc<<< 44071 1727204665.92687: stdout chunk (state=3): >>>h": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": 
"0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": 
["auto"], "strategy": "first"}}} <<< 44071 1727204665.94439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204665.94624: stderr chunk (state=3): >>><<< 44071 1727204665.94775: stdout chunk (state=3): >>><<< 44071 1727204665.94905: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": 
"2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": 
"hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": 
"7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": 
"libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", 
"release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", 
"version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": 
"libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": 
[{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": 
[{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": 
"perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": 
[{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": 
"5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": 
"sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204666.02192: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204665.073512-48374-270678178582591/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204666.02486: _low_level_execute_command(): starting 44071 1727204666.02490: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204665.073512-48374-270678178582591/ > /dev/null 2>&1 && sleep 0' 44071 1727204666.03721: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204666.03824: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204666.03901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204666.03929: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204666.04090: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204666.04197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204666.06282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204666.06296: stdout chunk (state=3): >>><<< 44071 1727204666.06320: stderr chunk (state=3): >>><<< 44071 1727204666.06527: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204666.06531: handler run complete 44071 1727204666.08763: variable 'ansible_facts' from source: unknown 44071 1727204666.10278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204666.16351: variable 'ansible_facts' from source: unknown 44071 1727204666.17761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204666.19760: attempt loop complete, returning result 44071 1727204666.19852: _execute() done 44071 1727204666.19938: dumping result to json 44071 1727204666.20701: done dumping result, returning 44071 1727204666.20706: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-c964-7471-0000000014bc] 44071 1727204666.20708: sending task result for task 127b8e07-fff9-c964-7471-0000000014bc 44071 1727204666.28138: done sending task result for task 127b8e07-fff9-c964-7471-0000000014bc 44071 1727204666.28143: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204666.28318: no more pending results, returning what we have 44071 1727204666.28322: results queue empty 44071 1727204666.28323: checking for any_errors_fatal 44071 1727204666.28330: done checking for any_errors_fatal 44071 1727204666.28331: checking for max_fail_percentage 44071 1727204666.28335: done checking for max_fail_percentage 44071 
1727204666.28336: checking to see if all hosts have failed and the running result is not ok 44071 1727204666.28336: done checking to see if all hosts have failed 44071 1727204666.28337: getting the remaining hosts for this loop 44071 1727204666.28339: done getting the remaining hosts for this loop 44071 1727204666.28343: getting the next task for host managed-node2 44071 1727204666.28351: done getting next task for host managed-node2 44071 1727204666.28355: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204666.28361: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204666.28376: getting variables 44071 1727204666.28378: in VariableManager get_vars() 44071 1727204666.28412: Calling all_inventory to load vars for managed-node2 44071 1727204666.28416: Calling groups_inventory to load vars for managed-node2 44071 1727204666.28418: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204666.28428: Calling all_plugins_play to load vars for managed-node2 44071 1727204666.28434: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204666.28438: Calling groups_plugins_play to load vars for managed-node2 44071 1727204666.32125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204666.34926: done with get_vars() 44071 1727204666.34980: done getting variables 44071 1727204666.35061: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:04:26 -0400 (0:00:01.394) 0:01:18.667 ***** 44071 1727204666.35120: entering _queue_task() for managed-node2/debug 44071 1727204666.35901: worker is 1 (out of 1 available) 44071 1727204666.35916: exiting _queue_task() for managed-node2/debug 44071 1727204666.35932: done queuing things up, now waiting for results queue to drain 44071 1727204666.35934: waiting for pending results... 
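[Editor's note] The entry above queues the role's "Print network provider" step (tasks/main.yml:7), and the trace that follows resolves the debug action and reports "Using network provider: nm". As a minimal sketch, a task producing exactly this trace could look like the YAML below; the task name and message text are taken from this log, while the surrounding layout of the role source is an assumption.

# Minimal sketch of a debug task matching the trace that follows; the
# "Using network provider" message mirrors the MSG line in the log, and
# network_provider is the set_fact variable the trace reports reading.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"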
44071 1727204666.36492: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204666.36498: in run() - task 127b8e07-fff9-c964-7471-000000001460 44071 1727204666.36502: variable 'ansible_search_path' from source: unknown 44071 1727204666.36506: variable 'ansible_search_path' from source: unknown 44071 1727204666.36573: calling self._execute() 44071 1727204666.36655: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204666.36772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204666.36777: variable 'omit' from source: magic vars 44071 1727204666.37178: variable 'ansible_distribution_major_version' from source: facts 44071 1727204666.37192: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204666.37198: variable 'omit' from source: magic vars 44071 1727204666.37270: variable 'omit' from source: magic vars 44071 1727204666.37390: variable 'network_provider' from source: set_fact 44071 1727204666.37770: variable 'omit' from source: magic vars 44071 1727204666.37774: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204666.37777: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204666.37780: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204666.37782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204666.37785: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204666.37787: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204666.37790: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204666.37792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204666.37794: Set connection var ansible_connection to ssh 44071 1727204666.37797: Set connection var ansible_timeout to 10 44071 1727204666.37799: Set connection var ansible_pipelining to False 44071 1727204666.37801: Set connection var ansible_shell_type to sh 44071 1727204666.37804: Set connection var ansible_shell_executable to /bin/sh 44071 1727204666.37806: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204666.37809: variable 'ansible_shell_executable' from source: unknown 44071 1727204666.37811: variable 'ansible_connection' from source: unknown 44071 1727204666.37814: variable 'ansible_module_compression' from source: unknown 44071 1727204666.37816: variable 'ansible_shell_type' from source: unknown 44071 1727204666.37819: variable 'ansible_shell_executable' from source: unknown 44071 1727204666.37821: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204666.37823: variable 'ansible_pipelining' from source: unknown 44071 1727204666.37825: variable 'ansible_timeout' from source: unknown 44071 1727204666.37941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204666.38176: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 44071 1727204666.38180: variable 'omit' from source: magic vars 44071 1727204666.38183: starting attempt loop 44071 1727204666.38185: running the handler 44071 1727204666.38188: handler run complete 44071 1727204666.38372: attempt loop complete, returning result 44071 1727204666.38375: _execute() done 44071 1727204666.38378: dumping result to json 44071 1727204666.38381: done dumping result, returning 44071 1727204666.38383: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-c964-7471-000000001460] 44071 1727204666.38386: sending task result for task 127b8e07-fff9-c964-7471-000000001460 44071 1727204666.38462: done sending task result for task 127b8e07-fff9-c964-7471-000000001460 44071 1727204666.38467: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 44071 1727204666.38555: no more pending results, returning what we have 44071 1727204666.38559: results queue empty 44071 1727204666.38560: checking for any_errors_fatal 44071 1727204666.38572: done checking for any_errors_fatal 44071 1727204666.38573: checking for max_fail_percentage 44071 1727204666.38574: done checking for max_fail_percentage 44071 1727204666.38575: checking to see if all hosts have failed and the running result is not ok 44071 1727204666.38576: done checking to see if all hosts have failed 44071 1727204666.38577: getting the remaining hosts for this loop 44071 1727204666.38579: done getting the remaining hosts for this loop 44071 1727204666.38584: getting the next task for host managed-node2 44071 1727204666.38592: done getting next task for host managed-node2 44071 1727204666.38597: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204666.38603: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204666.38616: getting variables 44071 1727204666.38618: in VariableManager get_vars() 44071 1727204666.38663: Calling all_inventory to load vars for managed-node2 44071 1727204666.38869: Calling groups_inventory to load vars for managed-node2 44071 1727204666.38873: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204666.38884: Calling all_plugins_play to load vars for managed-node2 44071 1727204666.38896: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204666.38900: Calling groups_plugins_play to load vars for managed-node2 44071 1727204666.41514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204666.44412: done with get_vars() 44071 1727204666.44467: done getting variables 44071 1727204666.44539: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:04:26 -0400 (0:00:00.094) 0:01:18.762 ***** 44071 1727204666.44595: entering _queue_task() for managed-node2/fail 44071 1727204666.45426: worker is 1 (out of 1 available) 44071 1727204666.45440: exiting _queue_task() for managed-node2/fail 44071 1727204666.45458: done queuing things up, now waiting for results queue to drain 44071 1727204666.45460: waiting for pending results... 
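[Editor's note] The next trace runs the first guard in the role, a fail task at tasks/main.yml:11 that is skipped because its condition network_state != {} evaluates to False (the run uses the role default, an empty network_state). A hedged sketch of such a guard follows: the network_state condition is copied from the false_condition reported below, while the initscripts comparison and the failure message are assumptions inferred from the task name, not read from the role source.

# Hedged sketch of the guard traced below; only the first when item is
# confirmed by the log, the second is an assumption from the task name.
- name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported with the initscripts provider
  when:
    - network_state != {}
    - network_provider == "initscripts"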
44071 1727204666.46295: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204666.46749: in run() - task 127b8e07-fff9-c964-7471-000000001461 44071 1727204666.46773: variable 'ansible_search_path' from source: unknown 44071 1727204666.46778: variable 'ansible_search_path' from source: unknown 44071 1727204666.47024: calling self._execute() 44071 1727204666.47390: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204666.47395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204666.47677: variable 'omit' from source: magic vars 44071 1727204666.48333: variable 'ansible_distribution_major_version' from source: facts 44071 1727204666.48344: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204666.48526: variable 'network_state' from source: role '' defaults 44071 1727204666.48538: Evaluated conditional (network_state != {}): False 44071 1727204666.48542: when evaluation is False, skipping this task 44071 1727204666.48545: _execute() done 44071 1727204666.48547: dumping result to json 44071 1727204666.48550: done dumping result, returning 44071 1727204666.48687: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-c964-7471-000000001461] 44071 1727204666.48691: sending task result for task 127b8e07-fff9-c964-7471-000000001461 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204666.48856: no more pending results, returning what we have 44071 1727204666.48862: results queue empty 44071 1727204666.48863: checking for any_errors_fatal 44071 1727204666.48875: done checking for any_errors_fatal 44071 1727204666.48876: checking for max_fail_percentage 44071 1727204666.48877: done checking for max_fail_percentage 44071 1727204666.48879: checking to see if all hosts have failed and the running result is not ok 44071 1727204666.48879: done checking to see if all hosts have failed 44071 1727204666.48880: getting the remaining hosts for this loop 44071 1727204666.48882: done getting the remaining hosts for this loop 44071 1727204666.48890: getting the next task for host managed-node2 44071 1727204666.48901: done getting next task for host managed-node2 44071 1727204666.48909: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204666.48919: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204666.49216: done sending task result for task 127b8e07-fff9-c964-7471-000000001461 44071 1727204666.49222: WORKER PROCESS EXITING 44071 1727204666.49241: getting variables 44071 1727204666.49243: in VariableManager get_vars() 44071 1727204666.49299: Calling all_inventory to load vars for managed-node2 44071 1727204666.49302: Calling groups_inventory to load vars for managed-node2 44071 1727204666.49304: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204666.49316: Calling all_plugins_play to load vars for managed-node2 44071 1727204666.49318: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204666.49321: Calling groups_plugins_play to load vars for managed-node2 44071 1727204666.51645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204666.56544: done with get_vars() 44071 1727204666.56594: done getting variables 44071 1727204666.56815: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:04:26 -0400 (0:00:00.122) 0:01:18.886 ***** 44071 1727204666.56984: entering _queue_task() for managed-node2/fail 44071 1727204666.57933: worker is 1 (out of 1 available) 44071 1727204666.57950: exiting _queue_task() for managed-node2/fail 44071 1727204666.57963: done queuing things up, now waiting for results queue to drain 44071 1727204666.57964: waiting for pending results... 
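[Editor's note] The second guard (tasks/main.yml:18) is skipped for the same reason: only network_state != {} shows up in the trace, because it is the condition recorded as false_condition and the remaining when items are never reached. A hedged sketch, assuming a version comparison implied by the task name; the message and the exact version check are illustrative only.

# Hedged sketch of the version guard; the network_state condition comes
# from the trace, the "< 8" comparison is an assumption from the task name.
- name: Abort applying the network state configuration if the system version of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying network_state requires a managed host at major version 8 or later
  when:
    - network_state != {}
    - ansible_distribution_major_version | int < 8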
44071 1727204666.58590: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204666.58945: in run() - task 127b8e07-fff9-c964-7471-000000001462 44071 1727204666.58950: variable 'ansible_search_path' from source: unknown 44071 1727204666.58953: variable 'ansible_search_path' from source: unknown 44071 1727204666.59094: calling self._execute() 44071 1727204666.59319: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204666.59372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204666.59376: variable 'omit' from source: magic vars 44071 1727204666.60675: variable 'ansible_distribution_major_version' from source: facts 44071 1727204666.60681: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204666.61084: variable 'network_state' from source: role '' defaults 44071 1727204666.61149: Evaluated conditional (network_state != {}): False 44071 1727204666.61172: when evaluation is False, skipping this task 44071 1727204666.61175: _execute() done 44071 1727204666.61178: dumping result to json 44071 1727204666.61180: done dumping result, returning 44071 1727204666.61182: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-c964-7471-000000001462] 44071 1727204666.61185: sending task result for task 127b8e07-fff9-c964-7471-000000001462 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204666.61558: no more pending results, returning what we have 44071 1727204666.61563: results queue empty 44071 1727204666.61567: checking for any_errors_fatal 44071 1727204666.61579: done checking for any_errors_fatal 44071 1727204666.61581: checking for max_fail_percentage 44071 1727204666.61582: done checking for max_fail_percentage 44071 1727204666.61584: checking to see if all hosts have failed and the running result is not ok 44071 1727204666.61584: done checking to see if all hosts have failed 44071 1727204666.61585: getting the remaining hosts for this loop 44071 1727204666.61587: done getting the remaining hosts for this loop 44071 1727204666.61593: getting the next task for host managed-node2 44071 1727204666.61604: done getting next task for host managed-node2 44071 1727204666.61609: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204666.61622: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204666.61653: getting variables 44071 1727204666.61660: in VariableManager get_vars() 44071 1727204666.61737: Calling all_inventory to load vars for managed-node2 44071 1727204666.61740: Calling groups_inventory to load vars for managed-node2 44071 1727204666.61743: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204666.61749: done sending task result for task 127b8e07-fff9-c964-7471-000000001462 44071 1727204666.61752: WORKER PROCESS EXITING 44071 1727204666.61767: Calling all_plugins_play to load vars for managed-node2 44071 1727204666.61770: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204666.61773: Calling groups_plugins_play to load vars for managed-node2 44071 1727204666.64743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204666.67956: done with get_vars() 44071 1727204666.68012: done getting variables 44071 1727204666.68086: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:04:26 -0400 (0:00:00.111) 0:01:18.997 ***** 44071 1727204666.68129: entering _queue_task() for managed-node2/fail 44071 1727204666.68801: worker is 1 (out of 1 available) 44071 1727204666.68813: exiting _queue_task() for managed-node2/fail 44071 1727204666.68826: done queuing things up, now waiting for results queue to drain 44071 1727204666.68828: waiting for pending results... 
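[Editor's note] The teaming guard (tasks/main.yml:25) is the first conditional in this stretch that gets past the version check: the trace below shows ansible_distribution_major_version | int > 9 evaluating True on this Fedora 40 host, and the skip then comes from ansible_distribution in __network_rh_distros being False. Both conditions in the sketch are copied from the "Evaluated conditional" lines; the ordering and the failure message are assumptions.

# Hedged sketch of the teaming guard; conditions taken from the trace,
# message invented for illustration.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on EL10 or later
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros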
44071 1727204666.68982: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204666.69274: in run() - task 127b8e07-fff9-c964-7471-000000001463 44071 1727204666.69280: variable 'ansible_search_path' from source: unknown 44071 1727204666.69283: variable 'ansible_search_path' from source: unknown 44071 1727204666.69287: calling self._execute() 44071 1727204666.69366: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204666.69386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204666.69404: variable 'omit' from source: magic vars 44071 1727204666.69852: variable 'ansible_distribution_major_version' from source: facts 44071 1727204666.69877: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204666.70229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204666.72812: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204666.72901: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204666.72949: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204666.73284: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204666.73288: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204666.73329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204666.74474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204666.74508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204666.74562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204666.74828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204666.74901: variable 'ansible_distribution_major_version' from source: facts 44071 1727204666.74958: Evaluated conditional (ansible_distribution_major_version | int > 9): True 44071 1727204666.75204: variable 'ansible_distribution' from source: facts 44071 1727204666.75273: variable '__network_rh_distros' from source: role '' defaults 44071 1727204666.75292: Evaluated conditional (ansible_distribution in __network_rh_distros): False 44071 1727204666.75300: when evaluation is False, skipping this task 44071 1727204666.75308: _execute() done 44071 1727204666.75316: dumping result to json 44071 1727204666.75483: done dumping result, returning 44071 1727204666.75487: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-c964-7471-000000001463] 44071 1727204666.75489: sending task result for task 127b8e07-fff9-c964-7471-000000001463 44071 1727204666.75576: done sending task result for task 127b8e07-fff9-c964-7471-000000001463 44071 1727204666.75580: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 44071 1727204666.75641: no more pending results, returning what we have 44071 1727204666.75645: results queue empty 44071 1727204666.75647: checking for any_errors_fatal 44071 1727204666.75655: done checking for any_errors_fatal 44071 1727204666.75656: checking for max_fail_percentage 44071 1727204666.75658: done checking for max_fail_percentage 44071 1727204666.75659: checking to see if all hosts have failed and the running result is not ok 44071 1727204666.75660: done checking to see if all hosts have failed 44071 1727204666.75661: getting the remaining hosts for this loop 44071 1727204666.75662: done getting the remaining hosts for this loop 44071 1727204666.75670: getting the next task for host managed-node2 44071 1727204666.75680: done getting next task for host managed-node2 44071 1727204666.75685: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204666.75691: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204666.75716: getting variables 44071 1727204666.75719: in VariableManager get_vars() 44071 1727204666.75972: Calling all_inventory to load vars for managed-node2 44071 1727204666.75976: Calling groups_inventory to load vars for managed-node2 44071 1727204666.75979: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204666.75993: Calling all_plugins_play to load vars for managed-node2 44071 1727204666.75996: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204666.76000: Calling groups_plugins_play to load vars for managed-node2 44071 1727204666.80438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204666.85621: done with get_vars() 44071 1727204666.85872: done getting variables 44071 1727204666.85943: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:04:26 -0400 (0:00:00.178) 0:01:19.176 ***** 44071 1727204666.85982: entering _queue_task() for managed-node2/dnf 44071 1727204666.86939: worker is 1 (out of 1 available) 44071 1727204666.86953: exiting _queue_task() for managed-node2/dnf 44071 1727204666.87169: done queuing things up, now waiting for results queue to drain 44071 1727204666.87172: waiting for pending results... 
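[Editor's note] The DNF check queued above (tasks/main.yml:36) is a package task gated on whether any wireless or team connections are defined; in this run neither is, so it is skipped with false_condition __network_wireless_connections_defined or __network_team_connections_defined. A rough sketch follows: the two when conditions are the ones evaluated in the trace, while the module arguments (the network_packages variable, state, check_mode) are assumptions for illustration only.

# Hedged sketch of the DNF update check; only the when conditions are
# confirmed by the trace, the module arguments are illustrative.
- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"   # hypothetical variable for illustration
    state: latest
  check_mode: true
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined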
44071 1727204666.87792: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204666.88132: in run() - task 127b8e07-fff9-c964-7471-000000001464 44071 1727204666.88137: variable 'ansible_search_path' from source: unknown 44071 1727204666.88139: variable 'ansible_search_path' from source: unknown 44071 1727204666.88238: calling self._execute() 44071 1727204666.88406: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204666.88446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204666.88679: variable 'omit' from source: magic vars 44071 1727204666.90074: variable 'ansible_distribution_major_version' from source: facts 44071 1727204666.90082: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204666.90873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204666.99227: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204666.99374: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204666.99617: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204666.99866: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204666.99873: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204667.00029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204667.00277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204667.00281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204667.00502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204667.00633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204667.00839: variable 'ansible_distribution' from source: facts 44071 1727204667.00849: variable 'ansible_distribution_major_version' from source: facts 44071 1727204667.00852: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 44071 1727204667.01626: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204667.02161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204667.02176: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204667.02204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204667.02376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204667.02595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204667.02769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204667.02775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204667.02777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204667.02944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204667.02958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204667.03123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204667.03249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204667.03253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204667.03309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204667.03324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204667.03670: variable 'network_connections' from source: include params 44071 1727204667.03746: variable 'interface' from source: play vars 44071 1727204667.03899: variable 'interface' from source: play vars 44071 1727204667.04101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204667.04496: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204667.04545: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204667.04774: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204667.04778: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204667.04871: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204667.04896: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204667.04922: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204667.05067: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204667.05125: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204667.05818: variable 'network_connections' from source: include params 44071 1727204667.05823: variable 'interface' from source: play vars 44071 1727204667.06089: variable 'interface' from source: play vars 44071 1727204667.06093: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204667.06096: when evaluation is False, skipping this task 44071 1727204667.06098: _execute() done 44071 1727204667.06100: dumping result to json 44071 1727204667.06102: done dumping result, returning 44071 1727204667.06104: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000001464] 44071 1727204667.06107: sending task result for task 127b8e07-fff9-c964-7471-000000001464 44071 1727204667.06358: done sending task result for task 127b8e07-fff9-c964-7471-000000001464 44071 1727204667.06362: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204667.06425: no more pending results, returning what we have 44071 1727204667.06430: results queue empty 44071 1727204667.06431: checking for any_errors_fatal 44071 1727204667.06442: done checking for any_errors_fatal 44071 1727204667.06443: checking for max_fail_percentage 44071 1727204667.06444: done checking for max_fail_percentage 44071 1727204667.06445: checking to see if all hosts have failed and the running result is not ok 44071 1727204667.06446: done checking to see if all hosts have failed 44071 1727204667.06447: getting the remaining hosts for this loop 44071 1727204667.06449: done getting the remaining hosts for this loop 44071 1727204667.06455: getting the next task for host managed-node2 44071 1727204667.06468: done getting next task for host managed-node2 44071 1727204667.06473: ^ task is: TASK: fedora.linux_system_roles.network : Check if 
updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204667.06478: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204667.06505: getting variables 44071 1727204667.06507: in VariableManager get_vars() 44071 1727204667.06552: Calling all_inventory to load vars for managed-node2 44071 1727204667.06555: Calling groups_inventory to load vars for managed-node2 44071 1727204667.06557: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204667.06775: Calling all_plugins_play to load vars for managed-node2 44071 1727204667.06780: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204667.06784: Calling groups_plugins_play to load vars for managed-node2 44071 1727204667.10645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204667.15561: done with get_vars() 44071 1727204667.15606: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204667.15896: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:04:27 -0400 (0:00:00.299) 0:01:19.475 ***** 44071 1727204667.15934: entering _queue_task() for managed-node2/yum 44071 1727204667.16779: worker is 1 (out of 1 available) 44071 1727204667.16792: exiting _queue_task() for managed-node2/yum 44071 1727204667.16805: done queuing things up, now waiting for results queue to drain 44071 1727204667.16807: waiting for pending results... 
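The block above shows the DNF-based update check being skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined is true for this run, and its YUM counterpart being queued next, with ansible.builtin.yum transparently redirected to the dnf action plugin on this node. A minimal sketch of a task with this shape, reconstructed only from the task title and the conditions visible in the log (the module arguments are illustrative, not the role's actual source):

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"   # placeholder; the real argument list is not shown in the log
    state: latest
  check_mode: true
  when: __network_wireless_connections_defined or __network_team_connections_defined

When the when expression is false, the task executor never invokes the module and simply records skip_reason "Conditional result was False", exactly as in the JSON result above.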
44071 1727204667.17592: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204667.17955: in run() - task 127b8e07-fff9-c964-7471-000000001465 44071 1727204667.18063: variable 'ansible_search_path' from source: unknown 44071 1727204667.18074: variable 'ansible_search_path' from source: unknown 44071 1727204667.18134: calling self._execute() 44071 1727204667.18493: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204667.18502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204667.18507: variable 'omit' from source: magic vars 44071 1727204667.19586: variable 'ansible_distribution_major_version' from source: facts 44071 1727204667.19614: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204667.20257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204667.23363: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204667.23463: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204667.23521: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204667.23687: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204667.23691: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204667.23973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204667.24104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204667.24141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204667.24309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204667.24316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204667.24601: variable 'ansible_distribution_major_version' from source: facts 44071 1727204667.24663: Evaluated conditional (ansible_distribution_major_version | int < 8): False 44071 1727204667.24746: when evaluation is False, skipping this task 44071 1727204667.24754: _execute() done 44071 1727204667.24758: dumping result to json 44071 1727204667.24760: done dumping result, returning 44071 1727204667.24763: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000001465] 44071 
1727204667.24858: sending task result for task 127b8e07-fff9-c964-7471-000000001465 44071 1727204667.24954: done sending task result for task 127b8e07-fff9-c964-7471-000000001465 44071 1727204667.25148: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 44071 1727204667.25209: no more pending results, returning what we have 44071 1727204667.25213: results queue empty 44071 1727204667.25215: checking for any_errors_fatal 44071 1727204667.25223: done checking for any_errors_fatal 44071 1727204667.25224: checking for max_fail_percentage 44071 1727204667.25226: done checking for max_fail_percentage 44071 1727204667.25227: checking to see if all hosts have failed and the running result is not ok 44071 1727204667.25227: done checking to see if all hosts have failed 44071 1727204667.25228: getting the remaining hosts for this loop 44071 1727204667.25230: done getting the remaining hosts for this loop 44071 1727204667.25235: getting the next task for host managed-node2 44071 1727204667.25244: done getting next task for host managed-node2 44071 1727204667.25248: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204667.25253: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204667.25282: getting variables 44071 1727204667.25284: in VariableManager get_vars() 44071 1727204667.25329: Calling all_inventory to load vars for managed-node2 44071 1727204667.25332: Calling groups_inventory to load vars for managed-node2 44071 1727204667.25335: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204667.25347: Calling all_plugins_play to load vars for managed-node2 44071 1727204667.25351: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204667.25354: Calling groups_plugins_play to load vars for managed-node2 44071 1727204667.27247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204667.29856: done with get_vars() 44071 1727204667.30111: done getting variables 44071 1727204667.30182: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:04:27 -0400 (0:00:00.143) 0:01:19.619 ***** 44071 1727204667.30326: entering _queue_task() for managed-node2/fail 44071 1727204667.30794: worker is 1 (out of 1 available) 44071 1727204667.30808: exiting _queue_task() for managed-node2/fail 44071 1727204667.30825: done queuing things up, now waiting for results queue to drain 44071 1727204667.30827: waiting for pending results... 
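Directly above, the YUM variant of the check is skipped on a different gate: ansible_distribution_major_version | int < 8 is False for this managed node, so the yum code path is never exercised even though the action would have been redirected to dnf anyway. A hedged sketch of how such version gating is commonly combined with the wireless/team gate (illustrative only; the arguments are assumptions):

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ network_packages }}"   # assumed for the sketch
    state: latest
  check_mode: true
  when:
    - ansible_distribution_major_version | int < 8
    - __network_wireless_connections_defined or __network_team_connections_defined

On newer distributions the first condition fails and Ansible reports false_condition "ansible_distribution_major_version | int < 8" in the skip result, which is what the trace shows here.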
44071 1727204667.31484: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204667.31847: in run() - task 127b8e07-fff9-c964-7471-000000001466 44071 1727204667.31862: variable 'ansible_search_path' from source: unknown 44071 1727204667.31896: variable 'ansible_search_path' from source: unknown 44071 1727204667.32086: calling self._execute() 44071 1727204667.32239: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204667.32243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204667.32257: variable 'omit' from source: magic vars 44071 1727204667.33205: variable 'ansible_distribution_major_version' from source: facts 44071 1727204667.33220: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204667.33607: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204667.33975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204667.36382: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204667.36452: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204667.36493: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204667.36536: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204667.36556: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204667.36647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204667.37407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204667.37495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204667.37618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204667.37621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204667.37688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204667.37691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204667.37715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204667.37775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204667.38077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204667.38133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204667.38158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204667.38185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204667.38229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204667.38267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204667.38635: variable 'network_connections' from source: include params 44071 1727204667.38647: variable 'interface' from source: play vars 44071 1727204667.38874: variable 'interface' from source: play vars 44071 1727204667.38878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204667.39078: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204667.39083: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204667.39096: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204667.39119: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204667.39193: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204667.39197: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204667.39298: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204667.39301: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204667.39304: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204667.39783: variable 'network_connections' 
from source: include params 44071 1727204667.39786: variable 'interface' from source: play vars 44071 1727204667.39788: variable 'interface' from source: play vars 44071 1727204667.39790: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204667.39792: when evaluation is False, skipping this task 44071 1727204667.39794: _execute() done 44071 1727204667.39796: dumping result to json 44071 1727204667.39797: done dumping result, returning 44071 1727204667.39799: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000001466] 44071 1727204667.39801: sending task result for task 127b8e07-fff9-c964-7471-000000001466 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204667.39979: no more pending results, returning what we have 44071 1727204667.39982: results queue empty 44071 1727204667.39983: checking for any_errors_fatal 44071 1727204667.39990: done checking for any_errors_fatal 44071 1727204667.39991: checking for max_fail_percentage 44071 1727204667.39992: done checking for max_fail_percentage 44071 1727204667.39993: checking to see if all hosts have failed and the running result is not ok 44071 1727204667.39994: done checking to see if all hosts have failed 44071 1727204667.39994: getting the remaining hosts for this loop 44071 1727204667.39996: done getting the remaining hosts for this loop 44071 1727204667.40000: getting the next task for host managed-node2 44071 1727204667.40008: done getting next task for host managed-node2 44071 1727204667.40012: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 44071 1727204667.40018: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204667.40040: getting variables 44071 1727204667.40041: in VariableManager get_vars() 44071 1727204667.40285: Calling all_inventory to load vars for managed-node2 44071 1727204667.40288: Calling groups_inventory to load vars for managed-node2 44071 1727204667.40290: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204667.40297: done sending task result for task 127b8e07-fff9-c964-7471-000000001466 44071 1727204667.40299: WORKER PROCESS EXITING 44071 1727204667.40311: Calling all_plugins_play to load vars for managed-node2 44071 1727204667.40314: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204667.40316: Calling groups_plugins_play to load vars for managed-node2 44071 1727204667.43751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204667.46209: done with get_vars() 44071 1727204667.46386: done getting variables 44071 1727204667.46505: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:04:27 -0400 (0:00:00.162) 0:01:19.782 ***** 44071 1727204667.46630: entering _queue_task() for managed-node2/package 44071 1727204667.47740: worker is 1 (out of 1 available) 44071 1727204667.47756: exiting _queue_task() for managed-node2/package 44071 1727204667.47776: done queuing things up, now waiting for results queue to drain 44071 1727204667.47778: waiting for pending results... 
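The consent task above loads the fail action plugin and is gated on the same wireless/team conditions, so it too is skipped in this run. A plausible shape for such a consent guard, assumed only from the plugin and conditions in the log; the network_allow_restart flag is a hypothetical name used for this sketch:

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: >-
      The requested connections need NetworkManager to be restarted; set a
      consent variable (network_allow_restart is a hypothetical example) to proceed.
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined
    - not (network_allow_restart | default(false))   # hypothetical flag, not from the log

A fail task used this way acts as a hard stop: it only fires when restart-requiring connections are present and no explicit consent has been given.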
44071 1727204667.48082: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 44071 1727204667.48428: in run() - task 127b8e07-fff9-c964-7471-000000001467 44071 1727204667.48458: variable 'ansible_search_path' from source: unknown 44071 1727204667.48463: variable 'ansible_search_path' from source: unknown 44071 1727204667.48536: calling self._execute() 44071 1727204667.48644: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204667.48657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204667.48709: variable 'omit' from source: magic vars 44071 1727204667.49159: variable 'ansible_distribution_major_version' from source: facts 44071 1727204667.49183: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204667.49451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204667.49805: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204667.49901: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204667.49923: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204667.50035: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204667.50227: variable 'network_packages' from source: role '' defaults 44071 1727204667.50351: variable '__network_provider_setup' from source: role '' defaults 44071 1727204667.50373: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204667.50457: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204667.50474: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204667.50558: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204667.50809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204667.53740: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204667.53848: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204667.53908: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204667.54013: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204667.54211: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204667.54261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204667.54354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204667.54420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204667.54554: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204667.54581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204667.54762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204667.54768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204667.54770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204667.54822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204667.54920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204667.55400: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204667.55764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204667.55904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204667.55978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204667.56129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204667.56134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204667.56250: variable 'ansible_python' from source: facts 44071 1727204667.56355: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204667.56493: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204667.56606: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204667.56912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204667.56964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 44071 1727204667.57055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204667.57147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204667.57220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204667.57264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204667.57367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204667.57439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204667.57553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204667.57565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204667.57778: variable 'network_connections' from source: include params 44071 1727204667.57791: variable 'interface' from source: play vars 44071 1727204667.58090: variable 'interface' from source: play vars 44071 1727204667.58134: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204667.58178: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204667.58246: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204667.58294: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204667.58374: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204667.58880: variable 'network_connections' from source: include params 44071 1727204667.58967: variable 'interface' from source: play vars 44071 1727204667.59501: variable 'interface' from source: play vars 44071 1727204667.59505: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204667.59508: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204667.59920: variable 'network_connections' from source: include params 44071 
1727204667.59941: variable 'interface' from source: play vars 44071 1727204667.60026: variable 'interface' from source: play vars 44071 1727204667.60072: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204667.60181: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204667.60577: variable 'network_connections' from source: include params 44071 1727204667.60592: variable 'interface' from source: play vars 44071 1727204667.60667: variable 'interface' from source: play vars 44071 1727204667.60780: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204667.60887: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204667.60895: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204667.60967: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204667.61260: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204667.61917: variable 'network_connections' from source: include params 44071 1727204667.61929: variable 'interface' from source: play vars 44071 1727204667.62105: variable 'interface' from source: play vars 44071 1727204667.62111: variable 'ansible_distribution' from source: facts 44071 1727204667.62114: variable '__network_rh_distros' from source: role '' defaults 44071 1727204667.62116: variable 'ansible_distribution_major_version' from source: facts 44071 1727204667.62119: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204667.62323: variable 'ansible_distribution' from source: facts 44071 1727204667.62326: variable '__network_rh_distros' from source: role '' defaults 44071 1727204667.62329: variable 'ansible_distribution_major_version' from source: facts 44071 1727204667.62334: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204667.62520: variable 'ansible_distribution' from source: facts 44071 1727204667.62541: variable '__network_rh_distros' from source: role '' defaults 44071 1727204667.62561: variable 'ansible_distribution_major_version' from source: facts 44071 1727204667.62685: variable 'network_provider' from source: set_fact 44071 1727204667.62773: variable 'ansible_facts' from source: unknown 44071 1727204667.64584: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 44071 1727204667.64594: when evaluation is False, skipping this task 44071 1727204667.64602: _execute() done 44071 1727204667.64613: dumping result to json 44071 1727204667.64630: done dumping result, returning 44071 1727204667.64659: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-c964-7471-000000001467] 44071 1727204667.64674: sending task result for task 127b8e07-fff9-c964-7471-000000001467 44071 1727204667.64957: done sending task result for task 127b8e07-fff9-c964-7471-000000001467 44071 1727204667.64960: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 44071 1727204667.65272: no more pending results, returning what we have 44071 1727204667.65277: results queue empty 44071 1727204667.65278: checking for any_errors_fatal 44071 1727204667.65290: done checking for 
any_errors_fatal 44071 1727204667.65291: checking for max_fail_percentage 44071 1727204667.65293: done checking for max_fail_percentage 44071 1727204667.65294: checking to see if all hosts have failed and the running result is not ok 44071 1727204667.65295: done checking to see if all hosts have failed 44071 1727204667.65296: getting the remaining hosts for this loop 44071 1727204667.65298: done getting the remaining hosts for this loop 44071 1727204667.65303: getting the next task for host managed-node2 44071 1727204667.65313: done getting next task for host managed-node2 44071 1727204667.65318: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204667.65324: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204667.65353: getting variables 44071 1727204667.65355: in VariableManager get_vars() 44071 1727204667.65436: Calling all_inventory to load vars for managed-node2 44071 1727204667.65439: Calling groups_inventory to load vars for managed-node2 44071 1727204667.65442: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204667.65455: Calling all_plugins_play to load vars for managed-node2 44071 1727204667.65458: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204667.65462: Calling groups_plugins_play to load vars for managed-node2 44071 1727204667.68088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204667.71024: done with get_vars() 44071 1727204667.71059: done getting variables 44071 1727204667.71228: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:04:27 -0400 (0:00:00.246) 0:01:20.029 ***** 44071 1727204667.71277: entering _queue_task() for managed-node2/package 44071 1727204667.71930: worker is 1 (out of 1 available) 44071 1727204667.71947: exiting _queue_task() for managed-node2/package 44071 1727204667.71962: done queuing things up, now waiting for results queue to drain 44071 1727204667.71964: waiting for pending results... 
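The Install packages task above is skipped because not network_packages is subset(ansible_facts.packages.keys()) evaluates to False: every package named in network_packages already appears in the gathered package facts, so there is nothing to install. A minimal sketch of that idempotency gate, using the exact condition from the log (the module arguments are assumed):

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())

The subset test only works because ansible_facts.packages was populated earlier in the run (typically via the package_facts module), which lets the role decide about installation without calling the package manager at all.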
44071 1727204667.72623: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204667.72637: in run() - task 127b8e07-fff9-c964-7471-000000001468 44071 1727204667.72660: variable 'ansible_search_path' from source: unknown 44071 1727204667.72672: variable 'ansible_search_path' from source: unknown 44071 1727204667.72760: calling self._execute() 44071 1727204667.72910: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204667.72997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204667.73001: variable 'omit' from source: magic vars 44071 1727204667.73737: variable 'ansible_distribution_major_version' from source: facts 44071 1727204667.73764: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204667.74000: variable 'network_state' from source: role '' defaults 44071 1727204667.74011: Evaluated conditional (network_state != {}): False 44071 1727204667.74028: when evaluation is False, skipping this task 44071 1727204667.74058: _execute() done 44071 1727204667.74144: dumping result to json 44071 1727204667.74148: done dumping result, returning 44071 1727204667.74151: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-c964-7471-000000001468] 44071 1727204667.74156: sending task result for task 127b8e07-fff9-c964-7471-000000001468 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204667.74569: no more pending results, returning what we have 44071 1727204667.74575: results queue empty 44071 1727204667.74576: checking for any_errors_fatal 44071 1727204667.74589: done checking for any_errors_fatal 44071 1727204667.74590: checking for max_fail_percentage 44071 1727204667.74592: done checking for max_fail_percentage 44071 1727204667.74593: checking to see if all hosts have failed and the running result is not ok 44071 1727204667.74594: done checking to see if all hosts have failed 44071 1727204667.74595: getting the remaining hosts for this loop 44071 1727204667.74597: done getting the remaining hosts for this loop 44071 1727204667.74602: getting the next task for host managed-node2 44071 1727204667.74616: done getting next task for host managed-node2 44071 1727204667.74622: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204667.74628: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204667.74660: getting variables 44071 1727204667.74892: in VariableManager get_vars() 44071 1727204667.74944: Calling all_inventory to load vars for managed-node2 44071 1727204667.74947: Calling groups_inventory to load vars for managed-node2 44071 1727204667.74950: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204667.74969: Calling all_plugins_play to load vars for managed-node2 44071 1727204667.74974: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204667.74978: Calling groups_plugins_play to load vars for managed-node2 44071 1727204667.75588: done sending task result for task 127b8e07-fff9-c964-7471-000000001468 44071 1727204667.75594: WORKER PROCESS EXITING 44071 1727204667.93137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204667.95629: done with get_vars() 44071 1727204667.95796: done getting variables 44071 1727204667.95856: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:04:27 -0400 (0:00:00.247) 0:01:20.276 ***** 44071 1727204667.96022: entering _queue_task() for managed-node2/package 44071 1727204667.96941: worker is 1 (out of 1 available) 44071 1727204667.96957: exiting _queue_task() for managed-node2/package 44071 1727204667.97030: done queuing things up, now waiting for results queue to drain 44071 1727204667.97035: waiting for pending results... 
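Both nmstate-related install tasks in this stretch are gated on network_state != {}. This play drives the role through network_connections rather than network_state, so the dictionary stays at its empty default and both tasks are skipped. The gate in sketch form (package names are inferred from the task title, not copied from the role source):

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}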
44071 1727204667.97476: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204667.97890: in run() - task 127b8e07-fff9-c964-7471-000000001469 44071 1727204667.97922: variable 'ansible_search_path' from source: unknown 44071 1727204667.97926: variable 'ansible_search_path' from source: unknown 44071 1727204667.97950: calling self._execute() 44071 1727204667.98403: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204667.98410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204667.98422: variable 'omit' from source: magic vars 44071 1727204667.99304: variable 'ansible_distribution_major_version' from source: facts 44071 1727204667.99331: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204667.99692: variable 'network_state' from source: role '' defaults 44071 1727204667.99786: Evaluated conditional (network_state != {}): False 44071 1727204667.99791: when evaluation is False, skipping this task 44071 1727204667.99799: _execute() done 44071 1727204667.99803: dumping result to json 44071 1727204667.99806: done dumping result, returning 44071 1727204667.99814: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-c964-7471-000000001469] 44071 1727204667.99845: sending task result for task 127b8e07-fff9-c964-7471-000000001469 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204668.00196: no more pending results, returning what we have 44071 1727204668.00201: results queue empty 44071 1727204668.00202: checking for any_errors_fatal 44071 1727204668.00213: done checking for any_errors_fatal 44071 1727204668.00214: checking for max_fail_percentage 44071 1727204668.00216: done checking for max_fail_percentage 44071 1727204668.00218: checking to see if all hosts have failed and the running result is not ok 44071 1727204668.00218: done checking to see if all hosts have failed 44071 1727204668.00219: getting the remaining hosts for this loop 44071 1727204668.00221: done getting the remaining hosts for this loop 44071 1727204668.00227: getting the next task for host managed-node2 44071 1727204668.00242: done getting next task for host managed-node2 44071 1727204668.00247: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204668.00254: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204668.00289: getting variables 44071 1727204668.00292: in VariableManager get_vars() 44071 1727204668.00527: Calling all_inventory to load vars for managed-node2 44071 1727204668.00531: Calling groups_inventory to load vars for managed-node2 44071 1727204668.00536: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204668.00545: done sending task result for task 127b8e07-fff9-c964-7471-000000001469 44071 1727204668.00548: WORKER PROCESS EXITING 44071 1727204668.00559: Calling all_plugins_play to load vars for managed-node2 44071 1727204668.00563: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204668.00569: Calling groups_plugins_play to load vars for managed-node2 44071 1727204668.02787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204668.06723: done with get_vars() 44071 1727204668.06986: done getting variables 44071 1727204668.07064: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:04:28 -0400 (0:00:00.110) 0:01:20.387 ***** 44071 1727204668.07114: entering _queue_task() for managed-node2/service 44071 1727204668.08234: worker is 1 (out of 1 available) 44071 1727204668.08251: exiting _queue_task() for managed-node2/service 44071 1727204668.08269: done queuing things up, now waiting for results queue to drain 44071 1727204668.08271: waiting for pending results... 
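The restart task queued above resolves to the service action plugin and reuses the wireless/team gate, so NetworkManager is not restarted during this run. A hedged sketch of the pattern, with the service name inferred from the task title:

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined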
44071 1727204668.08904: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204668.09228: in run() - task 127b8e07-fff9-c964-7471-00000000146a 44071 1727204668.09244: variable 'ansible_search_path' from source: unknown 44071 1727204668.09249: variable 'ansible_search_path' from source: unknown 44071 1727204668.09378: calling self._execute() 44071 1727204668.09449: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204668.09454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204668.09468: variable 'omit' from source: magic vars 44071 1727204668.10386: variable 'ansible_distribution_major_version' from source: facts 44071 1727204668.10391: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204668.10394: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204668.10978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204668.14945: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204668.15057: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204668.15111: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204668.15171: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204668.15205: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204668.15316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204668.15370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204668.15403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204668.15461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204668.15489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204668.15549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204668.15668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204668.15674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 44071 1727204668.15677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204668.15705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204668.15762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204668.15808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204668.15843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204668.15902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204668.15971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204668.16186: variable 'network_connections' from source: include params 44071 1727204668.16211: variable 'interface' from source: play vars 44071 1727204668.16305: variable 'interface' from source: play vars 44071 1727204668.16410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204668.16640: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204668.16711: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204668.16772: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204668.16803: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204668.16883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204668.16910: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204668.16966: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204668.16990: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204668.17073: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204668.17406: variable 'network_connections' from source: include params 44071 1727204668.17426: variable 'interface' 
from source: play vars 44071 1727204668.17537: variable 'interface' from source: play vars 44071 1727204668.17557: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204668.17570: when evaluation is False, skipping this task 44071 1727204668.17620: _execute() done 44071 1727204668.17623: dumping result to json 44071 1727204668.17625: done dumping result, returning 44071 1727204668.17628: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-00000000146a] 44071 1727204668.17630: sending task result for task 127b8e07-fff9-c964-7471-00000000146a skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204668.17935: no more pending results, returning what we have 44071 1727204668.17939: results queue empty 44071 1727204668.17940: checking for any_errors_fatal 44071 1727204668.17949: done checking for any_errors_fatal 44071 1727204668.17950: checking for max_fail_percentage 44071 1727204668.17952: done checking for max_fail_percentage 44071 1727204668.17953: checking to see if all hosts have failed and the running result is not ok 44071 1727204668.17954: done checking to see if all hosts have failed 44071 1727204668.17955: getting the remaining hosts for this loop 44071 1727204668.17956: done getting the remaining hosts for this loop 44071 1727204668.17962: getting the next task for host managed-node2 44071 1727204668.17973: done getting next task for host managed-node2 44071 1727204668.17981: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204668.17986: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204668.18013: getting variables 44071 1727204668.18016: in VariableManager get_vars() 44071 1727204668.18187: Calling all_inventory to load vars for managed-node2 44071 1727204668.18190: Calling groups_inventory to load vars for managed-node2 44071 1727204668.18193: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204668.18201: done sending task result for task 127b8e07-fff9-c964-7471-00000000146a 44071 1727204668.18204: WORKER PROCESS EXITING 44071 1727204668.18215: Calling all_plugins_play to load vars for managed-node2 44071 1727204668.18219: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204668.18223: Calling groups_plugins_play to load vars for managed-node2 44071 1727204668.20423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204668.23081: done with get_vars() 44071 1727204668.23114: done getting variables 44071 1727204668.23198: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:04:28 -0400 (0:00:00.161) 0:01:20.548 ***** 44071 1727204668.23239: entering _queue_task() for managed-node2/service 44071 1727204668.23873: worker is 1 (out of 1 available) 44071 1727204668.23886: exiting _queue_task() for managed-node2/service 44071 1727204668.23899: done queuing things up, now waiting for results queue to drain 44071 1727204668.23901: waiting for pending results... 
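Before the next task runs, it is worth pausing on the skip recorded just above: the "Restart NetworkManager due to wireless or team interfaces" task was bypassed because the conditional __network_wireless_connections_defined or __network_team_connections_defined evaluated to False for this host. The following is a minimal sketch, not Ansible's actual conditional evaluator, of how a bare when: expression like that one gets templated against the available variables and short-circuits the task; the boolean values assigned here simply mirror the outcome reported in the log (the real role defaults are themselves Jinja expressions over network_connections).

import jinja2

def evaluate_when(expression: str, variables: dict) -> bool:
    # Ansible effectively wraps a bare `when:` expression in {{ ... }} and
    # templates it against the available variables; this mirrors only that idea.
    rendered = jinja2.Environment().from_string("{{ " + expression + " }}").render(**variables)
    return rendered == "True"

# Simplified stand-ins for the role defaults: this run defines neither
# wireless nor team connections, as the skip result above shows.
role_defaults = {
    "__network_wireless_connections_defined": False,
    "__network_team_connections_defined": False,
}
condition = "__network_wireless_connections_defined or __network_team_connections_defined"
if not evaluate_when(condition, role_defaults):
    print("skipping: Conditional result was False")  # matches the skip_reason above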
44071 1727204668.24103: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204668.24297: in run() - task 127b8e07-fff9-c964-7471-00000000146b 44071 1727204668.24327: variable 'ansible_search_path' from source: unknown 44071 1727204668.24349: variable 'ansible_search_path' from source: unknown 44071 1727204668.24404: calling self._execute() 44071 1727204668.24534: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204668.24554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204668.24580: variable 'omit' from source: magic vars 44071 1727204668.25079: variable 'ansible_distribution_major_version' from source: facts 44071 1727204668.25170: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204668.25337: variable 'network_provider' from source: set_fact 44071 1727204668.25350: variable 'network_state' from source: role '' defaults 44071 1727204668.25367: Evaluated conditional (network_provider == "nm" or network_state != {}): True 44071 1727204668.25380: variable 'omit' from source: magic vars 44071 1727204668.25484: variable 'omit' from source: magic vars 44071 1727204668.25522: variable 'network_service_name' from source: role '' defaults 44071 1727204668.25619: variable 'network_service_name' from source: role '' defaults 44071 1727204668.25772: variable '__network_provider_setup' from source: role '' defaults 44071 1727204668.25784: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204668.25877: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204668.25880: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204668.25986: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204668.26248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204668.28999: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204668.29128: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204668.29178: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204668.29238: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204668.29269: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204668.29571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204668.29576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204668.29578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204668.29580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 44071 1727204668.29583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204668.29584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204668.29609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204668.29645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204668.29705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204668.29728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204668.30029: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204668.30188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204668.30220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204668.30266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204668.30315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204668.30356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204668.30464: variable 'ansible_python' from source: facts 44071 1727204668.30575: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204668.30600: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204668.30705: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204668.30872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204668.30918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204668.31012: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204668.31021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204668.31041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204668.31106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204668.31159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204668.31194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204668.31255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204668.31341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204668.31474: variable 'network_connections' from source: include params 44071 1727204668.31489: variable 'interface' from source: play vars 44071 1727204668.31592: variable 'interface' from source: play vars 44071 1727204668.31734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204668.32011: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204668.32079: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204668.32145: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204668.32197: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204668.32323: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204668.32336: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204668.32381: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204668.32572: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204668.32608: variable '__network_wireless_connections_defined' from source: 
role '' defaults 44071 1727204668.33000: variable 'network_connections' from source: include params 44071 1727204668.33013: variable 'interface' from source: play vars 44071 1727204668.33118: variable 'interface' from source: play vars 44071 1727204668.33162: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204668.33274: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204668.33868: variable 'network_connections' from source: include params 44071 1727204668.33882: variable 'interface' from source: play vars 44071 1727204668.34041: variable 'interface' from source: play vars 44071 1727204668.34045: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204668.34177: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204668.34543: variable 'network_connections' from source: include params 44071 1727204668.34555: variable 'interface' from source: play vars 44071 1727204668.34828: variable 'interface' from source: play vars 44071 1727204668.34935: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204668.35063: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204668.35080: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204668.35253: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204668.35787: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204668.37260: variable 'network_connections' from source: include params 44071 1727204668.37341: variable 'interface' from source: play vars 44071 1727204668.37589: variable 'interface' from source: play vars 44071 1727204668.37597: variable 'ansible_distribution' from source: facts 44071 1727204668.37600: variable '__network_rh_distros' from source: role '' defaults 44071 1727204668.37603: variable 'ansible_distribution_major_version' from source: facts 44071 1727204668.37642: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204668.38048: variable 'ansible_distribution' from source: facts 44071 1727204668.38059: variable '__network_rh_distros' from source: role '' defaults 44071 1727204668.38159: variable 'ansible_distribution_major_version' from source: facts 44071 1727204668.38163: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204668.38323: variable 'ansible_distribution' from source: facts 44071 1727204668.38337: variable '__network_rh_distros' from source: role '' defaults 44071 1727204668.38353: variable 'ansible_distribution_major_version' from source: facts 44071 1727204668.38406: variable 'network_provider' from source: set_fact 44071 1727204668.38440: variable 'omit' from source: magic vars 44071 1727204668.38489: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204668.38525: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204668.38553: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204668.38592: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204668.38670: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204668.38674: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204668.38677: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204668.38679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204668.39400: Set connection var ansible_connection to ssh 44071 1727204668.39404: Set connection var ansible_timeout to 10 44071 1727204668.39407: Set connection var ansible_pipelining to False 44071 1727204668.39409: Set connection var ansible_shell_type to sh 44071 1727204668.39411: Set connection var ansible_shell_executable to /bin/sh 44071 1727204668.39413: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204668.39641: variable 'ansible_shell_executable' from source: unknown 44071 1727204668.39645: variable 'ansible_connection' from source: unknown 44071 1727204668.39647: variable 'ansible_module_compression' from source: unknown 44071 1727204668.39649: variable 'ansible_shell_type' from source: unknown 44071 1727204668.39723: variable 'ansible_shell_executable' from source: unknown 44071 1727204668.39727: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204668.39729: variable 'ansible_pipelining' from source: unknown 44071 1727204668.39734: variable 'ansible_timeout' from source: unknown 44071 1727204668.39736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204668.40375: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204668.40384: variable 'omit' from source: magic vars 44071 1727204668.40388: starting attempt loop 44071 1727204668.40390: running the handler 44071 1727204668.40457: variable 'ansible_facts' from source: unknown 44071 1727204668.42790: _low_level_execute_command(): starting 44071 1727204668.42810: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204668.43927: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44071 1727204668.43979: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204668.44198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204668.44216: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 <<< 44071 1727204668.44323: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204668.46109: stdout chunk (state=3): >>>/root <<< 44071 1727204668.46212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204668.46523: stderr chunk (state=3): >>><<< 44071 1727204668.46527: stdout chunk (state=3): >>><<< 44071 1727204668.46530: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204668.46539: _low_level_execute_command(): starting 44071 1727204668.46553: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204668.4650319-48478-128451691473679 `" && echo ansible-tmp-1727204668.4650319-48478-128451691473679="` echo /root/.ansible/tmp/ansible-tmp-1727204668.4650319-48478-128451691473679 `" ) && sleep 0' 44071 1727204668.48117: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204668.48289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204668.48457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204668.48484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204668.48596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204668.51234: stdout chunk (state=3): 
>>>ansible-tmp-1727204668.4650319-48478-128451691473679=/root/.ansible/tmp/ansible-tmp-1727204668.4650319-48478-128451691473679 <<< 44071 1727204668.51239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204668.51241: stdout chunk (state=3): >>><<< 44071 1727204668.51244: stderr chunk (state=3): >>><<< 44071 1727204668.51246: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204668.4650319-48478-128451691473679=/root/.ansible/tmp/ansible-tmp-1727204668.4650319-48478-128451691473679 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204668.51248: variable 'ansible_module_compression' from source: unknown 44071 1727204668.51476: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 44071 1727204668.51678: variable 'ansible_facts' from source: unknown 44071 1727204668.52129: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204668.4650319-48478-128451691473679/AnsiballZ_systemd.py 44071 1727204668.52587: Sending initial data 44071 1727204668.52598: Sent initial data (156 bytes) 44071 1727204668.53999: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204668.54019: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204668.54039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204668.54102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204668.54211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204668.54316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' <<< 44071 1727204668.54330: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204668.54440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204668.54505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204668.56211: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204668.56400: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204668.56653: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp8i_9ykmy /root/.ansible/tmp/ansible-tmp-1727204668.4650319-48478-128451691473679/AnsiballZ_systemd.py <<< 44071 1727204668.56800: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204668.4650319-48478-128451691473679/AnsiballZ_systemd.py" <<< 44071 1727204668.57146: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp8i_9ykmy" to remote "/root/.ansible/tmp/ansible-tmp-1727204668.4650319-48478-128451691473679/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204668.4650319-48478-128451691473679/AnsiballZ_systemd.py" <<< 44071 1727204668.60892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204668.61057: stderr chunk (state=3): >>><<< 44071 1727204668.61077: stdout chunk (state=3): >>><<< 44071 1727204668.61112: done transferring module to remote 44071 1727204668.61188: _low_level_execute_command(): starting 44071 1727204668.61284: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204668.4650319-48478-128451691473679/ /root/.ansible/tmp/ansible-tmp-1727204668.4650319-48478-128451691473679/AnsiballZ_systemd.py && sleep 0' 44071 1727204668.62520: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204668.62551: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204668.62784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204668.62890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204668.63078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204668.65158: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204668.65300: stderr chunk (state=3): >>><<< 44071 1727204668.65305: stdout chunk (state=3): >>><<< 44071 1727204668.65408: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204668.65412: _low_level_execute_command(): starting 44071 1727204668.65415: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204668.4650319-48478-128451691473679/AnsiballZ_systemd.py && sleep 0' 44071 1727204668.67206: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204668.67250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204668.67293: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204668.67371: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
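The stretch above traces the full remote lifecycle of the service task's module payload over the persistent SSH connection: a temporary directory is created under /root/.ansible/tmp, AnsiballZ_systemd.py is uploaded over sftp, made executable, and executed with the remote Python; further down, the same directory is removed again. The sketch below is a simplified, hypothetical re-creation of that sequence, not Ansible's implementation: the host name and payload path are placeholders, plain ssh/scp stand in for the multiplexed ControlMaster connection and sftp seen in the log, and the temporary directory name is fixed rather than timestamped.

import subprocess

HOST = "managed-node2"                          # placeholder inventory host
PAYLOAD = "AnsiballZ_systemd.py"                # locally built module payload (placeholder path)
REMOTE_TMP = ".ansible/tmp/ansible-tmp-example" # relative to the remote home; real runs use a timestamped name

def ssh(command: str) -> str:
    # One remote shell invocation per step, like the log's /bin/sh -c calls.
    return subprocess.run(["ssh", HOST, command], check=True,
                          capture_output=True, text=True).stdout

ssh(f"umask 77 && mkdir -p {REMOTE_TMP}")                              # create the tmp dir
subprocess.run(["scp", PAYLOAD, f"{HOST}:{REMOTE_TMP}/"], check=True)  # transfer the module payload
ssh(f"chmod u+x {REMOTE_TMP} {REMOTE_TMP}/{PAYLOAD}")                  # make it executable
result_json = ssh(f"python3 {REMOTE_TMP}/{PAYLOAD}")                   # run it; prints a JSON result
ssh(f"rm -rf {REMOTE_TMP}")                                            # clean up afterwards
print(result_json)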
44071 1727204668.67494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204668.99440: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4591616", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3508576256", "CPUUsageNSec": "1567943000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": 
"infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitC<<< 44071 1727204668.99498: stdout chunk (state=3): >>>ORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 44071 1727204669.01438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204669.01443: stdout chunk (state=3): >>><<< 44071 1727204669.01445: stderr chunk (state=3): >>><<< 44071 1727204669.01675: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4591616", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3508576256", "CPUUsageNSec": "1567943000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": 
"infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204669.01723: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204668.4650319-48478-128451691473679/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204669.01756: _low_level_execute_command(): starting 44071 1727204669.01770: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204668.4650319-48478-128451691473679/ > /dev/null 2>&1 && sleep 0' 44071 1727204669.02447: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204669.02464: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204669.02483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204669.02502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204669.02518: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204669.02529: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204669.02544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204669.02563: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204669.02587: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204669.02597: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44071 1727204669.02608: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204669.02619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204669.02636: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204669.02648: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204669.02729: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204669.02753: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204669.02862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204669.04883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204669.04901: stdout chunk (state=3): >>><<< 44071 1727204669.04914: stderr chunk (state=3): >>><<< 44071 1727204669.04938: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204669.04953: handler run complete 44071 1727204669.05040: attempt loop complete, returning result 44071 1727204669.05048: _execute() done 44071 1727204669.05057: dumping result to json 44071 1727204669.05087: done dumping result, returning 44071 1727204669.05104: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-c964-7471-00000000146b] 44071 1727204669.05118: sending task result for task 127b8e07-fff9-c964-7471-00000000146b ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204669.05796: done sending task result for task 127b8e07-fff9-c964-7471-00000000146b 44071 1727204669.05800: WORKER PROCESS EXITING 44071 1727204669.05857: no more pending results, returning what we have 44071 1727204669.05861: results queue empty 44071 1727204669.05862: checking for any_errors_fatal 44071 1727204669.05872: done checking for any_errors_fatal 44071 1727204669.05873: checking for max_fail_percentage 44071 1727204669.05874: done checking for max_fail_percentage 44071 1727204669.05876: checking to see if all hosts have failed and the running result is not ok 44071 1727204669.05877: done checking to see if all hosts have failed 44071 1727204669.05877: getting the remaining hosts for this loop 44071 1727204669.05879: done getting the 
remaining hosts for this loop 44071 1727204669.05883: getting the next task for host managed-node2 44071 1727204669.05891: done getting next task for host managed-node2 44071 1727204669.05895: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204669.05901: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204669.05916: getting variables 44071 1727204669.05918: in VariableManager get_vars() 44071 1727204669.06179: Calling all_inventory to load vars for managed-node2 44071 1727204669.06182: Calling groups_inventory to load vars for managed-node2 44071 1727204669.06184: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204669.06196: Calling all_plugins_play to load vars for managed-node2 44071 1727204669.06199: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204669.06202: Calling groups_plugins_play to load vars for managed-node2 44071 1727204669.08836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204669.11957: done with get_vars() 44071 1727204669.12011: done getting variables 44071 1727204669.12092: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:04:29 -0400 (0:00:00.888) 0:01:21.437 ***** 44071 1727204669.12142: entering _queue_task() for managed-node2/service 44071 1727204669.12621: worker is 1 (out of 1 available) 44071 1727204669.12640: exiting _queue_task() for managed-node2/service 44071 1727204669.12876: done queuing things up, now waiting for results queue to drain 44071 1727204669.12879: waiting for pending results... 
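Annotation: the records above show the "Enable and start NetworkManager" task finishing with its result censored because no_log was set; the raw module invocation is still visible in the log (module ansible.legacy.systemd with name=NetworkManager, state=started, enabled=true). A minimal sketch of the kind of task that would produce that invocation is given below, assuming the systemd module is called directly; the role itself may use the generic service action, which the log shows being resolved to the systemd module.

    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true
      no_log: true   # matches '_ansible_no_log': True and the "censored" result above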
44071 1727204669.13090: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204669.13269: in run() - task 127b8e07-fff9-c964-7471-00000000146c 44071 1727204669.13397: variable 'ansible_search_path' from source: unknown 44071 1727204669.13420: variable 'ansible_search_path' from source: unknown 44071 1727204669.13643: calling self._execute() 44071 1727204669.13814: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204669.13856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204669.13877: variable 'omit' from source: magic vars 44071 1727204669.14574: variable 'ansible_distribution_major_version' from source: facts 44071 1727204669.14600: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204669.14778: variable 'network_provider' from source: set_fact 44071 1727204669.14799: Evaluated conditional (network_provider == "nm"): True 44071 1727204669.15008: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204669.15083: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204669.15315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204669.19419: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204669.19540: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204669.19945: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204669.19950: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204669.19953: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204669.20241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204669.20282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204669.20308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204669.20436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204669.20440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204669.20605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204669.20700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 44071 1727204669.20759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204669.20886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204669.20947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204669.21164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204669.21250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204669.21428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204669.21479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204669.21499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204669.22152: variable 'network_connections' from source: include params 44071 1727204669.22164: variable 'interface' from source: play vars 44071 1727204669.22571: variable 'interface' from source: play vars 44071 1727204669.22897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204669.23470: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204669.23515: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204669.23725: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204669.23756: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204669.24001: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204669.24034: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204669.24060: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204669.24111: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
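Annotation: the records around this point evaluate the "Enable and start wpa_supplicant" task. The role only manages wpa_supplicant when the provider is NetworkManager and __network_wpa_supplicant_required is true; in the records that follow, that flag evaluates to False and the task is skipped. A minimal sketch of a service task guarded by those conditions, using the variable names visible in the log; the actual conditions and defaults in the role may differ.

    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant
        state: started
        enabled: true
      when:
        - network_provider == "nm"
        - __network_wpa_supplicant_required | bool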
44071 1727204669.24341: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204669.24909: variable 'network_connections' from source: include params 44071 1727204669.24917: variable 'interface' from source: play vars 44071 1727204669.25157: variable 'interface' from source: play vars 44071 1727204669.25238: Evaluated conditional (__network_wpa_supplicant_required): False 44071 1727204669.25246: when evaluation is False, skipping this task 44071 1727204669.25251: _execute() done 44071 1727204669.25256: dumping result to json 44071 1727204669.25258: done dumping result, returning 44071 1727204669.25269: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-c964-7471-00000000146c] 44071 1727204669.25282: sending task result for task 127b8e07-fff9-c964-7471-00000000146c 44071 1727204669.25548: done sending task result for task 127b8e07-fff9-c964-7471-00000000146c 44071 1727204669.25551: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 44071 1727204669.25691: no more pending results, returning what we have 44071 1727204669.25694: results queue empty 44071 1727204669.25726: checking for any_errors_fatal 44071 1727204669.25752: done checking for any_errors_fatal 44071 1727204669.25753: checking for max_fail_percentage 44071 1727204669.25756: done checking for max_fail_percentage 44071 1727204669.25757: checking to see if all hosts have failed and the running result is not ok 44071 1727204669.25758: done checking to see if all hosts have failed 44071 1727204669.25759: getting the remaining hosts for this loop 44071 1727204669.25760: done getting the remaining hosts for this loop 44071 1727204669.25838: getting the next task for host managed-node2 44071 1727204669.25846: done getting next task for host managed-node2 44071 1727204669.25851: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204669.25856: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204669.25882: getting variables 44071 1727204669.25884: in VariableManager get_vars() 44071 1727204669.26027: Calling all_inventory to load vars for managed-node2 44071 1727204669.26042: Calling groups_inventory to load vars for managed-node2 44071 1727204669.26055: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204669.26082: Calling all_plugins_play to load vars for managed-node2 44071 1727204669.26095: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204669.26106: Calling groups_plugins_play to load vars for managed-node2 44071 1727204669.28622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204669.31384: done with get_vars() 44071 1727204669.31436: done getting variables 44071 1727204669.31546: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:04:29 -0400 (0:00:00.194) 0:01:21.632 ***** 44071 1727204669.31605: entering _queue_task() for managed-node2/service 44071 1727204669.32296: worker is 1 (out of 1 available) 44071 1727204669.32312: exiting _queue_task() for managed-node2/service 44071 1727204669.32326: done queuing things up, now waiting for results queue to drain 44071 1727204669.32327: waiting for pending results... 44071 1727204669.32586: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204669.32789: in run() - task 127b8e07-fff9-c964-7471-00000000146d 44071 1727204669.32797: variable 'ansible_search_path' from source: unknown 44071 1727204669.32802: variable 'ansible_search_path' from source: unknown 44071 1727204669.32837: calling self._execute() 44071 1727204669.32948: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204669.33051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204669.33056: variable 'omit' from source: magic vars 44071 1727204669.33672: variable 'ansible_distribution_major_version' from source: facts 44071 1727204669.33701: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204669.33880: variable 'network_provider' from source: set_fact 44071 1727204669.33943: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204669.33946: when evaluation is False, skipping this task 44071 1727204669.33948: _execute() done 44071 1727204669.33951: dumping result to json 44071 1727204669.33953: done dumping result, returning 44071 1727204669.33956: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-c964-7471-00000000146d] 44071 1727204669.33960: sending task result for task 127b8e07-fff9-c964-7471-00000000146d 44071 1727204669.34051: done sending task result for task 127b8e07-fff9-c964-7471-00000000146d 44071 1727204669.34055: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 
1727204669.34113: no more pending results, returning what we have 44071 1727204669.34118: results queue empty 44071 1727204669.34119: checking for any_errors_fatal 44071 1727204669.34132: done checking for any_errors_fatal 44071 1727204669.34133: checking for max_fail_percentage 44071 1727204669.34135: done checking for max_fail_percentage 44071 1727204669.34136: checking to see if all hosts have failed and the running result is not ok 44071 1727204669.34137: done checking to see if all hosts have failed 44071 1727204669.34138: getting the remaining hosts for this loop 44071 1727204669.34141: done getting the remaining hosts for this loop 44071 1727204669.34146: getting the next task for host managed-node2 44071 1727204669.34157: done getting next task for host managed-node2 44071 1727204669.34163: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204669.34171: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204669.34204: getting variables 44071 1727204669.34206: in VariableManager get_vars() 44071 1727204669.34252: Calling all_inventory to load vars for managed-node2 44071 1727204669.34463: Calling groups_inventory to load vars for managed-node2 44071 1727204669.34469: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204669.34525: Calling all_plugins_play to load vars for managed-node2 44071 1727204669.34529: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204669.34533: Calling groups_plugins_play to load vars for managed-node2 44071 1727204669.37084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204669.39722: done with get_vars() 44071 1727204669.39769: done getting variables 44071 1727204669.39841: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:04:29 -0400 (0:00:00.082) 0:01:21.715 ***** 44071 1727204669.39889: entering _queue_task() for managed-node2/copy 44071 1727204669.40488: worker is 1 (out of 1 available) 44071 1727204669.40501: exiting _queue_task() for managed-node2/copy 44071 1727204669.40515: done queuing things up, now waiting for results queue to drain 44071 1727204669.40516: waiting for pending results... 44071 1727204669.40999: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204669.41013: in run() - task 127b8e07-fff9-c964-7471-00000000146e 44071 1727204669.41031: variable 'ansible_search_path' from source: unknown 44071 1727204669.41037: variable 'ansible_search_path' from source: unknown 44071 1727204669.41091: calling self._execute() 44071 1727204669.41260: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204669.41269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204669.41287: variable 'omit' from source: magic vars 44071 1727204669.41812: variable 'ansible_distribution_major_version' from source: facts 44071 1727204669.41826: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204669.41992: variable 'network_provider' from source: set_fact 44071 1727204669.41996: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204669.41999: when evaluation is False, skipping this task 44071 1727204669.42002: _execute() done 44071 1727204669.42074: dumping result to json 44071 1727204669.42078: done dumping result, returning 44071 1727204669.42081: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-c964-7471-00000000146e] 44071 1727204669.42084: sending task result for task 127b8e07-fff9-c964-7471-00000000146e 44071 1727204669.42169: done sending task result for task 127b8e07-fff9-c964-7471-00000000146e 44071 1727204669.42172: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 44071 1727204669.42232: no more pending results, returning what we have 44071 1727204669.42237: results queue empty 44071 1727204669.42238: checking for any_errors_fatal 44071 1727204669.42245: done checking for any_errors_fatal 44071 1727204669.42246: checking for max_fail_percentage 44071 1727204669.42248: done checking for max_fail_percentage 44071 1727204669.42249: checking to see if all hosts have failed and the running result is not ok 44071 1727204669.42250: done checking to see if all hosts have failed 44071 1727204669.42251: getting the remaining hosts for this loop 44071 1727204669.42253: done getting the remaining hosts for this loop 44071 1727204669.42258: getting the next task for host managed-node2 44071 1727204669.42458: done getting next task for host managed-node2 44071 1727204669.42463: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204669.42471: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204669.42497: getting variables 44071 1727204669.42499: in VariableManager get_vars() 44071 1727204669.42541: Calling all_inventory to load vars for managed-node2 44071 1727204669.42544: Calling groups_inventory to load vars for managed-node2 44071 1727204669.42547: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204669.42558: Calling all_plugins_play to load vars for managed-node2 44071 1727204669.42561: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204669.42564: Calling groups_plugins_play to load vars for managed-node2 44071 1727204669.44949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204669.47531: done with get_vars() 44071 1727204669.47584: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:04:29 -0400 (0:00:00.077) 0:01:21.793 ***** 44071 1727204669.47682: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204669.48271: worker is 1 (out of 1 available) 44071 1727204669.48286: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204669.48300: done queuing things up, now waiting for results queue to drain 44071 1727204669.48301: waiting for pending results... 44071 1727204669.48664: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204669.48718: in run() - task 127b8e07-fff9-c964-7471-00000000146f 44071 1727204669.48759: variable 'ansible_search_path' from source: unknown 44071 1727204669.48763: variable 'ansible_search_path' from source: unknown 44071 1727204669.48820: calling self._execute() 44071 1727204669.48929: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204669.48973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204669.49038: variable 'omit' from source: magic vars 44071 1727204669.49559: variable 'ansible_distribution_major_version' from source: facts 44071 1727204669.49575: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204669.49582: variable 'omit' from source: magic vars 44071 1727204669.49738: variable 'omit' from source: magic vars 44071 1727204669.50002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204669.54573: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204669.54641: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204669.54713: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204669.54749: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204669.54777: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204669.54877: variable 'network_provider' from source: set_fact 44071 1727204669.55183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 44071 1727204669.55212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204669.55239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204669.55367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204669.55375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204669.55390: variable 'omit' from source: magic vars 44071 1727204669.55517: variable 'omit' from source: magic vars 44071 1727204669.55638: variable 'network_connections' from source: include params 44071 1727204669.55650: variable 'interface' from source: play vars 44071 1727204669.55729: variable 'interface' from source: play vars 44071 1727204669.55935: variable 'omit' from source: magic vars 44071 1727204669.55941: variable '__lsr_ansible_managed' from source: task vars 44071 1727204669.56029: variable '__lsr_ansible_managed' from source: task vars 44071 1727204669.56270: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 44071 1727204669.56562: Loaded config def from plugin (lookup/template) 44071 1727204669.56574: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 44071 1727204669.56598: File lookup term: get_ansible_managed.j2 44071 1727204669.56601: variable 'ansible_search_path' from source: unknown 44071 1727204669.56604: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 44071 1727204669.56671: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 44071 1727204669.56683: variable 'ansible_search_path' from source: unknown 44071 1727204669.67217: variable 'ansible_managed' from source: unknown 44071 1727204669.67604: variable 'omit' from source: magic vars 44071 1727204669.67640: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204669.67687: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204669.67707: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204669.67730: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204669.67737: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204669.67770: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204669.67773: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204669.67776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204669.67885: Set connection var ansible_connection to ssh 44071 1727204669.67903: Set connection var ansible_timeout to 10 44071 1727204669.67906: Set connection var ansible_pipelining to False 44071 1727204669.67926: Set connection var ansible_shell_type to sh 44071 1727204669.67929: Set connection var ansible_shell_executable to /bin/sh 44071 1727204669.67932: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204669.67967: variable 'ansible_shell_executable' from source: unknown 44071 1727204669.67970: variable 'ansible_connection' from source: unknown 44071 1727204669.67973: variable 'ansible_module_compression' from source: unknown 44071 1727204669.67976: variable 'ansible_shell_type' from source: unknown 44071 1727204669.67979: variable 'ansible_shell_executable' from source: unknown 44071 1727204669.67981: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204669.67984: variable 'ansible_pipelining' from source: unknown 44071 1727204669.67987: variable 'ansible_timeout' from source: unknown 44071 1727204669.67991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204669.68109: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204669.68122: variable 'omit' from source: magic vars 44071 1727204669.68125: starting attempt loop 44071 1727204669.68128: running the handler 44071 1727204669.68142: _low_level_execute_command(): starting 44071 1727204669.68148: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204669.68724: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204669.68729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204669.68735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204669.68781: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204669.68784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204669.68786: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204669.68868: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204669.70618: stdout chunk (state=3): >>>/root <<< 44071 1727204669.70789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204669.70833: stderr chunk (state=3): >>><<< 44071 1727204669.70864: stdout chunk (state=3): >>><<< 44071 1727204669.70897: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204669.70989: _low_level_execute_command(): starting 44071 1727204669.70994: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204669.7090256-48545-99775747895596 `" && echo ansible-tmp-1727204669.7090256-48545-99775747895596="` echo /root/.ansible/tmp/ansible-tmp-1727204669.7090256-48545-99775747895596 `" ) && sleep 0' 44071 1727204669.71610: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204669.71672: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
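Annotation: every low-level command in this log reuses an existing SSH master ("auto-mux: Trying existing master at '/root/.ansible/cp/...'"), i.e. OpenSSH connection multiplexing driven by Ansible's default ControlMaster/ControlPersist settings. A minimal sketch of how equivalent options could be set explicitly through inventory variables, assuming a hypothetical group_vars file; the option values are illustrative and not taken from this run.

    # group_vars/all.yml (hypothetical) - explicit SSH multiplexing options
    ansible_ssh_common_args: >-
      -o ControlMaster=auto
      -o ControlPersist=60s
      -o ControlPath=~/.ansible/cp/%C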
44071 1727204669.71719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204669.71760: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204669.71817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204669.71894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204669.73860: stdout chunk (state=3): >>>ansible-tmp-1727204669.7090256-48545-99775747895596=/root/.ansible/tmp/ansible-tmp-1727204669.7090256-48545-99775747895596 <<< 44071 1727204669.73987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204669.74049: stderr chunk (state=3): >>><<< 44071 1727204669.74052: stdout chunk (state=3): >>><<< 44071 1727204669.74065: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204669.7090256-48545-99775747895596=/root/.ansible/tmp/ansible-tmp-1727204669.7090256-48545-99775747895596 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204669.74114: variable 'ansible_module_compression' from source: unknown 44071 1727204669.74156: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 44071 1727204669.74186: variable 'ansible_facts' from source: unknown 44071 1727204669.74254: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204669.7090256-48545-99775747895596/AnsiballZ_network_connections.py 44071 1727204669.74378: Sending initial data 44071 1727204669.74382: Sent initial data (167 bytes) 44071 1727204669.74872: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204669.74875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204669.74924: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204669.74981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204669.74990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204669.74993: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204669.75090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204669.76792: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204669.76913: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp1exsuid3 /root/.ansible/tmp/ansible-tmp-1727204669.7090256-48545-99775747895596/AnsiballZ_network_connections.py <<< 44071 1727204669.76918: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204669.7090256-48545-99775747895596/AnsiballZ_network_connections.py" <<< 44071 1727204669.77007: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp1exsuid3" to remote "/root/.ansible/tmp/ansible-tmp-1727204669.7090256-48545-99775747895596/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204669.7090256-48545-99775747895596/AnsiballZ_network_connections.py" <<< 44071 1727204669.78820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204669.78950: stderr chunk (state=3): >>><<< 44071 1727204669.78963: stdout chunk (state=3): >>><<< 44071 1727204669.78999: done transferring module to remote 44071 1727204669.79020: _low_level_execute_command(): starting 44071 1727204669.79030: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204669.7090256-48545-99775747895596/ /root/.ansible/tmp/ansible-tmp-1727204669.7090256-48545-99775747895596/AnsiballZ_network_connections.py && sleep 0' 44071 1727204669.79716: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204669.79738: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204669.79755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204669.79776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 
1727204669.79794: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204669.79885: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204669.79908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204669.79926: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204669.79957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204669.80172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204669.82005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204669.82124: stderr chunk (state=3): >>><<< 44071 1727204669.82145: stdout chunk (state=3): >>><<< 44071 1727204669.82170: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204669.82180: _low_level_execute_command(): starting 44071 1727204669.82190: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204669.7090256-48545-99775747895596/AnsiballZ_network_connections.py && sleep 0' 44071 1727204669.82912: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204669.82946: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204669.82961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204669.83060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204669.83093: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204669.83112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204669.83138: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204669.83280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204670.12202: stdout chunk (state=3): >>>Traceback (most recent call last):<<< 44071 1727204670.12216: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_dbjk86yr/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_dbjk86yr/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/beccd2e1-72f3-4d73-aac6-77978c2859f8: error=unknown <<< 44071 1727204670.12410: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 44071 1727204670.14456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204670.14520: stderr chunk (state=3): >>><<< 44071 1727204670.14524: stdout chunk (state=3): >>><<< 44071 1727204670.14545: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_dbjk86yr/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_dbjk86yr/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/beccd2e1-72f3-4d73-aac6-77978c2859f8: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
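Note on the result above: the module's stdout carries a Python traceback from the NetworkManager helper (LsrNetworkNmError: "Connection volatilize aborted on statebr/...: error=unknown") followed by the module's JSON result, which still reports "changed": true and no "failed" key, so the task ends up reported as changed rather than failed. Below is a minimal sketch, assuming only what this log shows, of pulling that trailing JSON document out of mixed stdout; it is an illustration, not Ansible's actual result parser, and the helper name and search heuristic are invented for the example.

    import json

    def extract_trailing_json(stdout):
        # Hypothetical helper: locate the JSON result that follows the
        # traceback text, as in the stdout captured above.
        start = stdout.find('{"changed"')
        if start == -1:
            raise ValueError("no JSON result found in module stdout")
        return json.loads(stdout[start:])

    mixed = ('Traceback (most recent call last): ... '
             'LsrNetworkNmError: Connection volatilize aborted ... error=unknown '
             '{"changed": true, "warnings": [], "stderr": "\\n"}')
    result = extract_trailing_json(mixed)
    assert result["changed"] is True and "failed" not in result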
44071 1727204670.14581: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204669.7090256-48545-99775747895596/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204670.14589: _low_level_execute_command(): starting 44071 1727204670.14594: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204669.7090256-48545-99775747895596/ > /dev/null 2>&1 && sleep 0' 44071 1727204670.15072: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204670.15107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204670.15112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204670.15116: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204670.15118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204670.15120: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204670.15175: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204670.15178: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204670.15180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204670.15256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204670.17328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204670.17374: stderr chunk (state=3): >>><<< 44071 1727204670.17377: stdout chunk (state=3): >>><<< 44071 1727204670.17418: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204670.17426: handler run complete 44071 1727204670.17510: attempt loop complete, returning result 44071 1727204670.17517: _execute() done 44071 1727204670.17520: dumping result to json 44071 1727204670.17522: done dumping result, returning 44071 1727204670.17529: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-c964-7471-00000000146f] 44071 1727204670.17531: sending task result for task 127b8e07-fff9-c964-7471-00000000146f 44071 1727204670.17659: done sending task result for task 127b8e07-fff9-c964-7471-00000000146f 44071 1727204670.17662: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 44071 1727204670.17948: no more pending results, returning what we have 44071 1727204670.17952: results queue empty 44071 1727204670.17953: checking for any_errors_fatal 44071 1727204670.17960: done checking for any_errors_fatal 44071 1727204670.17960: checking for max_fail_percentage 44071 1727204670.17962: done checking for max_fail_percentage 44071 1727204670.17963: checking to see if all hosts have failed and the running result is not ok 44071 1727204670.17963: done checking to see if all hosts have failed 44071 1727204670.17964: getting the remaining hosts for this loop 44071 1727204670.17984: done getting the remaining hosts for this loop 44071 1727204670.18016: getting the next task for host managed-node2 44071 1727204670.18038: done getting next task for host managed-node2 44071 1727204670.18043: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204670.18057: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204670.18091: getting variables 44071 1727204670.18094: in VariableManager get_vars() 44071 1727204670.18222: Calling all_inventory to load vars for managed-node2 44071 1727204670.18245: Calling groups_inventory to load vars for managed-node2 44071 1727204670.18249: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204670.18259: Calling all_plugins_play to load vars for managed-node2 44071 1727204670.18262: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204670.18266: Calling groups_plugins_play to load vars for managed-node2 44071 1727204670.20086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204670.21689: done with get_vars() 44071 1727204670.21721: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:04:30 -0400 (0:00:00.741) 0:01:22.534 ***** 44071 1727204670.21800: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204670.22108: worker is 1 (out of 1 available) 44071 1727204670.22123: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204670.22141: done queuing things up, now waiting for results queue to drain 44071 1727204670.22143: waiting for pending results... 
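For reference, the _invocation block echoed in the "changed" result above pins down the inputs this role run received. A hedged reconstruction of the play-level variables that would produce exactly those module_args follows; the play itself is not visible in this log, so the variable names and values are inferred from the _invocation, not quoted from source.

    # Inferred from module_args: provider "nm", one connection named "statebr"
    # with persistent_state "absent"; the __header comment is added by the role.
    network_provider = "nm"
    network_connections = [
        {"name": "statebr", "persistent_state": "absent"},
    ]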
44071 1727204670.22359: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204670.22520: in run() - task 127b8e07-fff9-c964-7471-000000001470 44071 1727204670.22537: variable 'ansible_search_path' from source: unknown 44071 1727204670.22541: variable 'ansible_search_path' from source: unknown 44071 1727204670.22574: calling self._execute() 44071 1727204670.22660: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204670.22664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204670.22675: variable 'omit' from source: magic vars 44071 1727204670.23002: variable 'ansible_distribution_major_version' from source: facts 44071 1727204670.23013: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204670.23112: variable 'network_state' from source: role '' defaults 44071 1727204670.23122: Evaluated conditional (network_state != {}): False 44071 1727204670.23125: when evaluation is False, skipping this task 44071 1727204670.23130: _execute() done 44071 1727204670.23135: dumping result to json 44071 1727204670.23138: done dumping result, returning 44071 1727204670.23144: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-c964-7471-000000001470] 44071 1727204670.23156: sending task result for task 127b8e07-fff9-c964-7471-000000001470 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204670.23322: no more pending results, returning what we have 44071 1727204670.23327: results queue empty 44071 1727204670.23328: checking for any_errors_fatal 44071 1727204670.23341: done checking for any_errors_fatal 44071 1727204670.23342: checking for max_fail_percentage 44071 1727204670.23344: done checking for max_fail_percentage 44071 1727204670.23345: checking to see if all hosts have failed and the running result is not ok 44071 1727204670.23346: done checking to see if all hosts have failed 44071 1727204670.23346: getting the remaining hosts for this loop 44071 1727204670.23348: done getting the remaining hosts for this loop 44071 1727204670.23353: getting the next task for host managed-node2 44071 1727204670.23361: done getting next task for host managed-node2 44071 1727204670.23367: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204670.23375: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204670.23400: getting variables 44071 1727204670.23401: in VariableManager get_vars() 44071 1727204670.23448: Calling all_inventory to load vars for managed-node2 44071 1727204670.23452: Calling groups_inventory to load vars for managed-node2 44071 1727204670.23454: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204670.23465: Calling all_plugins_play to load vars for managed-node2 44071 1727204670.23484: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204670.23490: done sending task result for task 127b8e07-fff9-c964-7471-000000001470 44071 1727204670.23493: WORKER PROCESS EXITING 44071 1727204670.23505: Calling groups_plugins_play to load vars for managed-node2 44071 1727204670.26314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204670.30411: done with get_vars() 44071 1727204670.30843: done getting variables 44071 1727204670.30973: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:04:30 -0400 (0:00:00.092) 0:01:22.626 ***** 44071 1727204670.31048: entering _queue_task() for managed-node2/debug 44071 1727204670.31838: worker is 1 (out of 1 available) 44071 1727204670.31853: exiting _queue_task() for managed-node2/debug 44071 1727204670.31871: done queuing things up, now waiting for results queue to drain 44071 1727204670.31873: waiting for pending results... 
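The skip of "Configure networking state" above follows directly from the role default: network_state stays an empty dict when the play only manages connection profiles, so the task's condition evaluates to False. A small sketch of that evaluation, with the value taken from the "from source: role '' defaults" record and the expression quoted from the log itself:

    # "Evaluated conditional (network_state != {}): False" -> task skipped
    network_state = {}                  # role default; the play set no state here
    assert (network_state != {}) is False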
44071 1727204670.32146: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204670.32319: in run() - task 127b8e07-fff9-c964-7471-000000001471 44071 1727204670.32368: variable 'ansible_search_path' from source: unknown 44071 1727204670.32379: variable 'ansible_search_path' from source: unknown 44071 1727204670.32425: calling self._execute() 44071 1727204670.32971: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204670.32975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204670.32977: variable 'omit' from source: magic vars 44071 1727204670.33632: variable 'ansible_distribution_major_version' from source: facts 44071 1727204670.33651: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204670.33658: variable 'omit' from source: magic vars 44071 1727204670.33783: variable 'omit' from source: magic vars 44071 1727204670.33825: variable 'omit' from source: magic vars 44071 1727204670.33926: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204670.33982: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204670.34005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204670.34026: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204670.34043: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204670.34089: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204670.34092: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204670.34096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204670.34267: Set connection var ansible_connection to ssh 44071 1727204670.34293: Set connection var ansible_timeout to 10 44071 1727204670.34313: Set connection var ansible_pipelining to False 44071 1727204670.34350: Set connection var ansible_shell_type to sh 44071 1727204670.34399: Set connection var ansible_shell_executable to /bin/sh 44071 1727204670.34421: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204670.34452: variable 'ansible_shell_executable' from source: unknown 44071 1727204670.34456: variable 'ansible_connection' from source: unknown 44071 1727204670.34459: variable 'ansible_module_compression' from source: unknown 44071 1727204670.34461: variable 'ansible_shell_type' from source: unknown 44071 1727204670.34464: variable 'ansible_shell_executable' from source: unknown 44071 1727204670.34468: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204670.34471: variable 'ansible_pipelining' from source: unknown 44071 1727204670.34473: variable 'ansible_timeout' from source: unknown 44071 1727204670.34477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204670.34606: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 
1727204670.34615: variable 'omit' from source: magic vars 44071 1727204670.34621: starting attempt loop 44071 1727204670.34624: running the handler 44071 1727204670.34814: variable '__network_connections_result' from source: set_fact 44071 1727204670.34901: handler run complete 44071 1727204670.34909: attempt loop complete, returning result 44071 1727204670.34911: _execute() done 44071 1727204670.34914: dumping result to json 44071 1727204670.34923: done dumping result, returning 44071 1727204670.34926: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-c964-7471-000000001471] 44071 1727204670.34982: sending task result for task 127b8e07-fff9-c964-7471-000000001471 44071 1727204670.35102: done sending task result for task 127b8e07-fff9-c964-7471-000000001471 44071 1727204670.35104: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 44071 1727204670.35205: no more pending results, returning what we have 44071 1727204670.35209: results queue empty 44071 1727204670.35209: checking for any_errors_fatal 44071 1727204670.35216: done checking for any_errors_fatal 44071 1727204670.35216: checking for max_fail_percentage 44071 1727204670.35218: done checking for max_fail_percentage 44071 1727204670.35219: checking to see if all hosts have failed and the running result is not ok 44071 1727204670.35220: done checking to see if all hosts have failed 44071 1727204670.35220: getting the remaining hosts for this loop 44071 1727204670.35222: done getting the remaining hosts for this loop 44071 1727204670.35226: getting the next task for host managed-node2 44071 1727204670.35236: done getting next task for host managed-node2 44071 1727204670.35240: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204670.35245: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204670.35256: getting variables 44071 1727204670.35258: in VariableManager get_vars() 44071 1727204670.35296: Calling all_inventory to load vars for managed-node2 44071 1727204670.35298: Calling groups_inventory to load vars for managed-node2 44071 1727204670.35300: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204670.35310: Calling all_plugins_play to load vars for managed-node2 44071 1727204670.35313: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204670.35320: Calling groups_plugins_play to load vars for managed-node2 44071 1727204670.36819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204670.40314: done with get_vars() 44071 1727204670.40362: done getting variables 44071 1727204670.40757: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:04:30 -0400 (0:00:00.097) 0:01:22.724 ***** 44071 1727204670.40808: entering _queue_task() for managed-node2/debug 44071 1727204670.41770: worker is 1 (out of 1 available) 44071 1727204670.41783: exiting _queue_task() for managed-node2/debug 44071 1727204670.41796: done queuing things up, now waiting for results queue to drain 44071 1727204670.41798: waiting for pending results... 44071 1727204670.42712: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204670.42718: in run() - task 127b8e07-fff9-c964-7471-000000001472 44071 1727204670.42721: variable 'ansible_search_path' from source: unknown 44071 1727204670.42724: variable 'ansible_search_path' from source: unknown 44071 1727204670.42727: calling self._execute() 44071 1727204670.42970: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204670.42976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204670.42980: variable 'omit' from source: magic vars 44071 1727204670.43796: variable 'ansible_distribution_major_version' from source: facts 44071 1727204670.43803: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204670.43810: variable 'omit' from source: magic vars 44071 1727204670.44091: variable 'omit' from source: magic vars 44071 1727204670.44136: variable 'omit' from source: magic vars 44071 1727204670.44262: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204670.44393: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204670.44397: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204670.44399: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204670.44498: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204670.44558: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204670.44586: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204670.44595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204670.44715: Set connection var ansible_connection to ssh 44071 1727204670.44977: Set connection var ansible_timeout to 10 44071 1727204670.44981: Set connection var ansible_pipelining to False 44071 1727204670.44983: Set connection var ansible_shell_type to sh 44071 1727204670.44985: Set connection var ansible_shell_executable to /bin/sh 44071 1727204670.44987: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204670.44989: variable 'ansible_shell_executable' from source: unknown 44071 1727204670.44992: variable 'ansible_connection' from source: unknown 44071 1727204670.44994: variable 'ansible_module_compression' from source: unknown 44071 1727204670.44996: variable 'ansible_shell_type' from source: unknown 44071 1727204670.44998: variable 'ansible_shell_executable' from source: unknown 44071 1727204670.45000: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204670.45003: variable 'ansible_pipelining' from source: unknown 44071 1727204670.45005: variable 'ansible_timeout' from source: unknown 44071 1727204670.45007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204670.45011: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204670.45033: variable 'omit' from source: magic vars 44071 1727204670.45044: starting attempt loop 44071 1727204670.45051: running the handler 44071 1727204670.45113: variable '__network_connections_result' from source: set_fact 44071 1727204670.45213: variable '__network_connections_result' from source: set_fact 44071 1727204670.45339: handler run complete 44071 1727204670.45376: attempt loop complete, returning result 44071 1727204670.45384: _execute() done 44071 1727204670.45393: dumping result to json 44071 1727204670.45403: done dumping result, returning 44071 1727204670.45417: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-c964-7471-000000001472] 44071 1727204670.45427: sending task result for task 127b8e07-fff9-c964-7471-000000001472 ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 44071 1727204670.45926: no more pending results, returning what we have 44071 1727204670.45930: results queue empty 44071 1727204670.45931: checking for any_errors_fatal 44071 1727204670.45937: done checking for any_errors_fatal 44071 1727204670.45938: checking for max_fail_percentage 44071 1727204670.45939: done checking for max_fail_percentage 44071 1727204670.45940: checking to see if all hosts have failed and the running result is not ok 44071 1727204670.45941: done checking to see if all hosts have failed 44071 
1727204670.45942: getting the remaining hosts for this loop 44071 1727204670.45943: done getting the remaining hosts for this loop 44071 1727204670.45947: getting the next task for host managed-node2 44071 1727204670.45955: done getting next task for host managed-node2 44071 1727204670.45961: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204670.45969: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204670.45979: done sending task result for task 127b8e07-fff9-c964-7471-000000001472 44071 1727204670.45982: WORKER PROCESS EXITING 44071 1727204670.45993: getting variables 44071 1727204670.45994: in VariableManager get_vars() 44071 1727204670.46038: Calling all_inventory to load vars for managed-node2 44071 1727204670.46041: Calling groups_inventory to load vars for managed-node2 44071 1727204670.46043: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204670.46054: Calling all_plugins_play to load vars for managed-node2 44071 1727204670.46057: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204670.46061: Calling groups_plugins_play to load vars for managed-node2 44071 1727204670.49954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204670.53708: done with get_vars() 44071 1727204670.53754: done getting variables 44071 1727204670.53834: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:04:30 -0400 (0:00:00.130) 0:01:22.855 ***** 44071 1727204670.53877: entering _queue_task() for managed-node2/debug 44071 1727204670.54327: worker is 1 (out of 1 available) 44071 1727204670.54461: exiting _queue_task() for managed-node2/debug 44071 1727204670.54477: done queuing things up, now waiting for results queue to drain 44071 1727204670.54479: waiting for pending results... 
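One detail worth noting in the result printed above: stderr is the single character "\n" while stderr_lines is [""]. That is simply how a lone newline splits into lines; the quick check below mirrors how the *_lines fields are conventionally derived from the raw string, though the exact call site is not shown in this log.

    stderr = "\n"
    print(stderr.splitlines())   # ['']  -> reported as stderr_lines: [""]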
44071 1727204670.54887: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204670.54933: in run() - task 127b8e07-fff9-c964-7471-000000001473 44071 1727204670.54954: variable 'ansible_search_path' from source: unknown 44071 1727204670.54958: variable 'ansible_search_path' from source: unknown 44071 1727204670.55005: calling self._execute() 44071 1727204670.55129: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204670.55139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204670.55150: variable 'omit' from source: magic vars 44071 1727204670.55773: variable 'ansible_distribution_major_version' from source: facts 44071 1727204670.55778: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204670.55801: variable 'network_state' from source: role '' defaults 44071 1727204670.55813: Evaluated conditional (network_state != {}): False 44071 1727204670.55816: when evaluation is False, skipping this task 44071 1727204670.55819: _execute() done 44071 1727204670.55821: dumping result to json 44071 1727204670.55824: done dumping result, returning 44071 1727204670.55836: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-c964-7471-000000001473] 44071 1727204670.55842: sending task result for task 127b8e07-fff9-c964-7471-000000001473 44071 1727204670.55963: done sending task result for task 127b8e07-fff9-c964-7471-000000001473 44071 1727204670.55968: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 44071 1727204670.56037: no more pending results, returning what we have 44071 1727204670.56042: results queue empty 44071 1727204670.56043: checking for any_errors_fatal 44071 1727204670.56056: done checking for any_errors_fatal 44071 1727204670.56057: checking for max_fail_percentage 44071 1727204670.56058: done checking for max_fail_percentage 44071 1727204670.56059: checking to see if all hosts have failed and the running result is not ok 44071 1727204670.56060: done checking to see if all hosts have failed 44071 1727204670.56061: getting the remaining hosts for this loop 44071 1727204670.56063: done getting the remaining hosts for this loop 44071 1727204670.56070: getting the next task for host managed-node2 44071 1727204670.56080: done getting next task for host managed-node2 44071 1727204670.56085: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204670.56092: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204670.56122: getting variables 44071 1727204670.56125: in VariableManager get_vars() 44071 1727204670.56295: Calling all_inventory to load vars for managed-node2 44071 1727204670.56298: Calling groups_inventory to load vars for managed-node2 44071 1727204670.56301: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204670.56315: Calling all_plugins_play to load vars for managed-node2 44071 1727204670.56318: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204670.56321: Calling groups_plugins_play to load vars for managed-node2 44071 1727204670.59436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204670.62978: done with get_vars() 44071 1727204670.63230: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:04:30 -0400 (0:00:00.095) 0:01:22.951 ***** 44071 1727204670.63474: entering _queue_task() for managed-node2/ping 44071 1727204670.64435: worker is 1 (out of 1 available) 44071 1727204670.64451: exiting _queue_task() for managed-node2/ping 44071 1727204670.64468: done queuing things up, now waiting for results queue to drain 44071 1727204670.64470: waiting for pending results... 44071 1727204670.65160: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204670.65674: in run() - task 127b8e07-fff9-c964-7471-000000001474 44071 1727204670.65679: variable 'ansible_search_path' from source: unknown 44071 1727204670.65682: variable 'ansible_search_path' from source: unknown 44071 1727204670.65685: calling self._execute() 44071 1727204670.65884: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204670.65900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204670.65917: variable 'omit' from source: magic vars 44071 1727204670.66764: variable 'ansible_distribution_major_version' from source: facts 44071 1727204670.66973: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204670.66977: variable 'omit' from source: magic vars 44071 1727204670.67079: variable 'omit' from source: magic vars 44071 1727204670.67125: variable 'omit' from source: magic vars 44071 1727204670.67271: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204670.67276: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204670.67279: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204670.67283: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204670.67305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204670.67343: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204670.67354: variable 'ansible_host' from source: host vars for 
'managed-node2' 44071 1727204670.67363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204670.67490: Set connection var ansible_connection to ssh 44071 1727204670.67508: Set connection var ansible_timeout to 10 44071 1727204670.67520: Set connection var ansible_pipelining to False 44071 1727204670.67530: Set connection var ansible_shell_type to sh 44071 1727204670.67540: Set connection var ansible_shell_executable to /bin/sh 44071 1727204670.67552: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204670.67586: variable 'ansible_shell_executable' from source: unknown 44071 1727204670.67595: variable 'ansible_connection' from source: unknown 44071 1727204670.67603: variable 'ansible_module_compression' from source: unknown 44071 1727204670.67612: variable 'ansible_shell_type' from source: unknown 44071 1727204670.67624: variable 'ansible_shell_executable' from source: unknown 44071 1727204670.67631: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204670.67639: variable 'ansible_pipelining' from source: unknown 44071 1727204670.67646: variable 'ansible_timeout' from source: unknown 44071 1727204670.67728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204670.67901: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204670.67923: variable 'omit' from source: magic vars 44071 1727204670.67934: starting attempt loop 44071 1727204670.67947: running the handler 44071 1727204670.67973: _low_level_execute_command(): starting 44071 1727204670.67988: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204670.68788: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204670.68808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204670.68869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204670.68884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204670.68973: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204670.68989: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204670.69170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204670.71070: stdout chunk (state=3): >>>/root <<< 44071 1727204670.71187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 44071 1727204670.71273: stderr chunk (state=3): >>><<< 44071 1727204670.71287: stdout chunk (state=3): >>><<< 44071 1727204670.71601: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204670.71605: _low_level_execute_command(): starting 44071 1727204670.71609: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204670.7149034-48595-164384705634964 `" && echo ansible-tmp-1727204670.7149034-48595-164384705634964="` echo /root/.ansible/tmp/ansible-tmp-1727204670.7149034-48595-164384705634964 `" ) && sleep 0' 44071 1727204670.72827: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204670.73311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204670.73388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204670.73522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204670.73533: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204670.73589: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204670.73818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204670.75813: stdout chunk (state=3): >>>ansible-tmp-1727204670.7149034-48595-164384705634964=/root/.ansible/tmp/ansible-tmp-1727204670.7149034-48595-164384705634964 <<< 44071 1727204670.75998: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 44071 1727204670.76126: stderr chunk (state=3): >>><<< 44071 1727204670.76134: stdout chunk (state=3): >>><<< 44071 1727204670.76158: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204670.7149034-48595-164384705634964=/root/.ansible/tmp/ansible-tmp-1727204670.7149034-48595-164384705634964 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204670.76229: variable 'ansible_module_compression' from source: unknown 44071 1727204670.76277: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 44071 1727204670.76327: variable 'ansible_facts' from source: unknown 44071 1727204670.76419: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204670.7149034-48595-164384705634964/AnsiballZ_ping.py 44071 1727204670.76644: Sending initial data 44071 1727204670.76648: Sent initial data (153 bytes) 44071 1727204670.77603: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204670.77638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204670.77641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204670.77704: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204670.77934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204670.79544: stderr chunk (state=3): >>>debug2: Remote version: 
3 <<< 44071 1727204670.79668: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204670.79698: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204670.79769: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmph4r23f1f /root/.ansible/tmp/ansible-tmp-1727204670.7149034-48595-164384705634964/AnsiballZ_ping.py <<< 44071 1727204670.79804: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204670.7149034-48595-164384705634964/AnsiballZ_ping.py" <<< 44071 1727204670.79868: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmph4r23f1f" to remote "/root/.ansible/tmp/ansible-tmp-1727204670.7149034-48595-164384705634964/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204670.7149034-48595-164384705634964/AnsiballZ_ping.py" <<< 44071 1727204670.80879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204670.80883: stderr chunk (state=3): >>><<< 44071 1727204670.80886: stdout chunk (state=3): >>><<< 44071 1727204670.80888: done transferring module to remote 44071 1727204670.80890: _low_level_execute_command(): starting 44071 1727204670.80893: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204670.7149034-48595-164384705634964/ /root/.ansible/tmp/ansible-tmp-1727204670.7149034-48595-164384705634964/AnsiballZ_ping.py && sleep 0' 44071 1727204670.81536: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204670.81540: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204670.81552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204670.81573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204670.81586: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204670.81592: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204670.81602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204670.81617: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204670.81626: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204670.81641: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44071 1727204670.81644: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204670.81654: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204670.81679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204670.81820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204670.81824: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204670.82054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204670.84043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204670.84047: stdout chunk (state=3): >>><<< 44071 1727204670.84050: stderr chunk (state=3): >>><<< 44071 1727204670.84169: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204670.84177: _low_level_execute_command(): starting 44071 1727204670.84179: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204670.7149034-48595-164384705634964/AnsiballZ_ping.py && sleep 0' 44071 1727204670.84788: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204670.84873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' 
<<< 44071 1727204670.84887: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204670.84890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204670.85008: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204671.01310: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 44071 1727204671.02818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204671.02823: stdout chunk (state=3): >>><<< 44071 1727204671.02825: stderr chunk (state=3): >>><<< 44071 1727204671.02918: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
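The exchange above is the remote execution of the ping module payload (AnsiballZ_ping.py) over the already-established multiplexed SSH connection; the stdout chunk carries the module result {"ping": "pong"} and the surrounding entries record module_args {"data": "pong"}. As a minimal sketch of the task driving this invocation (the role's actual "Re-test connectivity" task file is not reproduced in this log, so the exact YAML is an assumption), it is equivalent to:

    # Hypothetical sketch of the "Re-test connectivity" task seen in this log.
    # ansible.builtin.ping defaults to data: pong, which matches the
    # module_args recorded in the surrounding chunks ({"data": "pong"}).
    - name: Re-test connectivity
      ansible.builtin.ping: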
44071 1727204671.03176: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204670.7149034-48595-164384705634964/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204671.03180: _low_level_execute_command(): starting 44071 1727204671.03183: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204670.7149034-48595-164384705634964/ > /dev/null 2>&1 && sleep 0' 44071 1727204671.04192: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204671.04235: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204671.04269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204671.04327: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204671.04393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204671.04424: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204671.04452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204671.04560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204671.06851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204671.06858: stdout chunk (state=3): >>><<< 44071 1727204671.06861: stderr chunk (state=3): >>><<< 44071 1727204671.07076: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204671.07085: handler run complete 44071 1727204671.07088: attempt loop complete, returning result 44071 1727204671.07090: _execute() done 44071 1727204671.07092: dumping result to json 44071 1727204671.07095: done dumping result, returning 44071 1727204671.07097: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-c964-7471-000000001474] 44071 1727204671.07099: sending task result for task 127b8e07-fff9-c964-7471-000000001474 44071 1727204671.07293: done sending task result for task 127b8e07-fff9-c964-7471-000000001474 44071 1727204671.07298: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 44071 1727204671.07614: no more pending results, returning what we have 44071 1727204671.07618: results queue empty 44071 1727204671.07619: checking for any_errors_fatal 44071 1727204671.07626: done checking for any_errors_fatal 44071 1727204671.07627: checking for max_fail_percentage 44071 1727204671.07629: done checking for max_fail_percentage 44071 1727204671.07630: checking to see if all hosts have failed and the running result is not ok 44071 1727204671.07631: done checking to see if all hosts have failed 44071 1727204671.07634: getting the remaining hosts for this loop 44071 1727204671.07636: done getting the remaining hosts for this loop 44071 1727204671.07640: getting the next task for host managed-node2 44071 1727204671.07652: done getting next task for host managed-node2 44071 1727204671.07655: ^ task is: TASK: meta (role_complete) 44071 1727204671.07660: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204671.07675: getting variables 44071 1727204671.07677: in VariableManager get_vars() 44071 1727204671.07723: Calling all_inventory to load vars for managed-node2 44071 1727204671.07725: Calling groups_inventory to load vars for managed-node2 44071 1727204671.07727: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204671.07742: Calling all_plugins_play to load vars for managed-node2 44071 1727204671.07745: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204671.07748: Calling groups_plugins_play to load vars for managed-node2 44071 1727204671.11563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204671.16518: done with get_vars() 44071 1727204671.16564: done getting variables 44071 1727204671.16883: done queuing things up, now waiting for results queue to drain 44071 1727204671.16886: results queue empty 44071 1727204671.16887: checking for any_errors_fatal 44071 1727204671.16891: done checking for any_errors_fatal 44071 1727204671.16891: checking for max_fail_percentage 44071 1727204671.16893: done checking for max_fail_percentage 44071 1727204671.16893: checking to see if all hosts have failed and the running result is not ok 44071 1727204671.16894: done checking to see if all hosts have failed 44071 1727204671.16895: getting the remaining hosts for this loop 44071 1727204671.16896: done getting the remaining hosts for this loop 44071 1727204671.16899: getting the next task for host managed-node2 44071 1727204671.16905: done getting next task for host managed-node2 44071 1727204671.16908: ^ task is: TASK: Asserts 44071 1727204671.16910: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204671.16913: getting variables 44071 1727204671.16914: in VariableManager get_vars() 44071 1727204671.16932: Calling all_inventory to load vars for managed-node2 44071 1727204671.16935: Calling groups_inventory to load vars for managed-node2 44071 1727204671.16938: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204671.16944: Calling all_plugins_play to load vars for managed-node2 44071 1727204671.16946: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204671.16949: Calling groups_plugins_play to load vars for managed-node2 44071 1727204671.20988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204671.25364: done with get_vars() 44071 1727204671.25408: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Tuesday 24 September 2024 15:04:31 -0400 (0:00:00.620) 0:01:23.571 ***** 44071 1727204671.25496: entering _queue_task() for managed-node2/include_tasks 44071 1727204671.26283: worker is 1 (out of 1 available) 44071 1727204671.26296: exiting _queue_task() for managed-node2/include_tasks 44071 1727204671.26309: done queuing things up, now waiting for results queue to drain 44071 1727204671.26311: waiting for pending results... 44071 1727204671.26450: running TaskExecutor() for managed-node2/TASK: Asserts 44071 1727204671.26926: in run() - task 127b8e07-fff9-c964-7471-00000000100a 44071 1727204671.26930: variable 'ansible_search_path' from source: unknown 44071 1727204671.26936: variable 'ansible_search_path' from source: unknown 44071 1727204671.27073: variable 'lsr_assert' from source: include params 44071 1727204671.27357: variable 'lsr_assert' from source: include params 44071 1727204671.27456: variable 'omit' from source: magic vars 44071 1727204671.27654: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204671.27675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204671.27692: variable 'omit' from source: magic vars 44071 1727204671.28001: variable 'ansible_distribution_major_version' from source: facts 44071 1727204671.28021: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204671.28036: variable 'item' from source: unknown 44071 1727204671.28120: variable 'item' from source: unknown 44071 1727204671.28164: variable 'item' from source: unknown 44071 1727204671.28242: variable 'item' from source: unknown 44071 1727204671.28614: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204671.28618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204671.28620: variable 'omit' from source: magic vars 44071 1727204671.28833: variable 'ansible_distribution_major_version' from source: facts 44071 1727204671.28837: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204671.28843: variable 'item' from source: unknown 44071 1727204671.28894: variable 'item' from source: unknown 44071 1727204671.28940: variable 'item' from source: unknown 44071 1727204671.29012: variable 'item' from source: unknown 44071 1727204671.29277: dumping result to json 44071 1727204671.29281: done dumping result, returning 44071 1727204671.29283: done running TaskExecutor() for managed-node2/TASK: Asserts 
[127b8e07-fff9-c964-7471-00000000100a] 44071 1727204671.29286: sending task result for task 127b8e07-fff9-c964-7471-00000000100a 44071 1727204671.29339: done sending task result for task 127b8e07-fff9-c964-7471-00000000100a 44071 1727204671.29342: WORKER PROCESS EXITING 44071 1727204671.29408: no more pending results, returning what we have 44071 1727204671.29414: in VariableManager get_vars() 44071 1727204671.29471: Calling all_inventory to load vars for managed-node2 44071 1727204671.29475: Calling groups_inventory to load vars for managed-node2 44071 1727204671.29479: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204671.29495: Calling all_plugins_play to load vars for managed-node2 44071 1727204671.29499: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204671.29502: Calling groups_plugins_play to load vars for managed-node2 44071 1727204671.31916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204671.36546: done with get_vars() 44071 1727204671.36589: variable 'ansible_search_path' from source: unknown 44071 1727204671.36591: variable 'ansible_search_path' from source: unknown 44071 1727204671.36760: variable 'ansible_search_path' from source: unknown 44071 1727204671.36762: variable 'ansible_search_path' from source: unknown 44071 1727204671.36885: we have included files to process 44071 1727204671.36887: generating all_blocks data 44071 1727204671.36889: done generating all_blocks data 44071 1727204671.36895: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 44071 1727204671.36896: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 44071 1727204671.36900: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 44071 1727204671.37157: in VariableManager get_vars() 44071 1727204671.37295: done with get_vars() 44071 1727204671.37431: done processing included file 44071 1727204671.37434: iterating over new_blocks loaded from include file 44071 1727204671.37435: in VariableManager get_vars() 44071 1727204671.37454: done with get_vars() 44071 1727204671.37456: filtering new block on tags 44071 1727204671.37721: done filtering new block on tags 44071 1727204671.37725: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node2 => (item=tasks/assert_device_present.yml) 44071 1727204671.37731: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 44071 1727204671.37732: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 44071 1727204671.37736: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 44071 1727204671.38046: in VariableManager get_vars() 44071 1727204671.38073: done with get_vars() 44071 1727204671.38555: done processing included file 44071 1727204671.38560: iterating over new_blocks loaded from include file 44071 1727204671.38562: in VariableManager get_vars() 44071 
1727204671.38585: done with get_vars() 44071 1727204671.38587: filtering new block on tags 44071 1727204671.38625: done filtering new block on tags 44071 1727204671.38628: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node2 => (item=tasks/assert_profile_absent.yml) 44071 1727204671.38633: extending task lists for all hosts with included blocks 44071 1727204671.41273: done extending task lists 44071 1727204671.41275: done processing included files 44071 1727204671.41276: results queue empty 44071 1727204671.41276: checking for any_errors_fatal 44071 1727204671.41279: done checking for any_errors_fatal 44071 1727204671.41279: checking for max_fail_percentage 44071 1727204671.41281: done checking for max_fail_percentage 44071 1727204671.41281: checking to see if all hosts have failed and the running result is not ok 44071 1727204671.41282: done checking to see if all hosts have failed 44071 1727204671.41283: getting the remaining hosts for this loop 44071 1727204671.41285: done getting the remaining hosts for this loop 44071 1727204671.41288: getting the next task for host managed-node2 44071 1727204671.41293: done getting next task for host managed-node2 44071 1727204671.41295: ^ task is: TASK: Include the task 'get_interface_stat.yml' 44071 1727204671.41298: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204671.41307: getting variables 44071 1727204671.41309: in VariableManager get_vars() 44071 1727204671.41325: Calling all_inventory to load vars for managed-node2 44071 1727204671.41327: Calling groups_inventory to load vars for managed-node2 44071 1727204671.41330: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204671.41338: Calling all_plugins_play to load vars for managed-node2 44071 1727204671.41341: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204671.41344: Calling groups_plugins_play to load vars for managed-node2 44071 1727204671.44019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204671.46615: done with get_vars() 44071 1727204671.46669: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 15:04:31 -0400 (0:00:00.212) 0:01:23.784 ***** 44071 1727204671.46798: entering _queue_task() for managed-node2/include_tasks 44071 1727204671.47563: worker is 1 (out of 1 available) 44071 1727204671.47584: exiting _queue_task() for managed-node2/include_tasks 44071 1727204671.47598: done queuing things up, now waiting for results queue to drain 44071 1727204671.47600: waiting for pending results... 44071 1727204671.48152: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 44071 1727204671.48197: in run() - task 127b8e07-fff9-c964-7471-0000000015cf 44071 1727204671.48228: variable 'ansible_search_path' from source: unknown 44071 1727204671.48245: variable 'ansible_search_path' from source: unknown 44071 1727204671.48319: calling self._execute() 44071 1727204671.48472: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204671.48556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204671.48561: variable 'omit' from source: magic vars 44071 1727204671.49187: variable 'ansible_distribution_major_version' from source: facts 44071 1727204671.49223: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204671.49229: _execute() done 44071 1727204671.49340: dumping result to json 44071 1727204671.49344: done dumping result, returning 44071 1727204671.49347: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [127b8e07-fff9-c964-7471-0000000015cf] 44071 1727204671.49352: sending task result for task 127b8e07-fff9-c964-7471-0000000015cf 44071 1727204671.49599: no more pending results, returning what we have 44071 1727204671.49606: in VariableManager get_vars() 44071 1727204671.49659: Calling all_inventory to load vars for managed-node2 44071 1727204671.49666: Calling groups_inventory to load vars for managed-node2 44071 1727204671.49671: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204671.49691: Calling all_plugins_play to load vars for managed-node2 44071 1727204671.49696: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204671.49700: Calling groups_plugins_play to load vars for managed-node2 44071 1727204671.50372: done sending task result for task 127b8e07-fff9-c964-7471-0000000015cf 44071 1727204671.50377: WORKER PROCESS EXITING 44071 1727204671.61137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 44071 1727204671.64467: done with get_vars() 44071 1727204671.64508: variable 'ansible_search_path' from source: unknown 44071 1727204671.64509: variable 'ansible_search_path' from source: unknown 44071 1727204671.64522: variable 'item' from source: include params 44071 1727204671.64627: variable 'item' from source: include params 44071 1727204671.64668: we have included files to process 44071 1727204671.64670: generating all_blocks data 44071 1727204671.64671: done generating all_blocks data 44071 1727204671.64672: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44071 1727204671.64673: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44071 1727204671.64679: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44071 1727204671.64881: done processing included file 44071 1727204671.64884: iterating over new_blocks loaded from include file 44071 1727204671.64885: in VariableManager get_vars() 44071 1727204671.64909: done with get_vars() 44071 1727204671.64911: filtering new block on tags 44071 1727204671.64943: done filtering new block on tags 44071 1727204671.64948: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 44071 1727204671.64956: extending task lists for all hosts with included blocks 44071 1727204671.65645: done extending task lists 44071 1727204671.65647: done processing included files 44071 1727204671.65648: results queue empty 44071 1727204671.65648: checking for any_errors_fatal 44071 1727204671.65652: done checking for any_errors_fatal 44071 1727204671.65653: checking for max_fail_percentage 44071 1727204671.65654: done checking for max_fail_percentage 44071 1727204671.65655: checking to see if all hosts have failed and the running result is not ok 44071 1727204671.65655: done checking to see if all hosts have failed 44071 1727204671.65656: getting the remaining hosts for this loop 44071 1727204671.65658: done getting the remaining hosts for this loop 44071 1727204671.65660: getting the next task for host managed-node2 44071 1727204671.65869: done getting next task for host managed-node2 44071 1727204671.65873: ^ task is: TASK: Get stat for interface {{ interface }} 44071 1727204671.65876: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204671.65880: getting variables 44071 1727204671.65881: in VariableManager get_vars() 44071 1727204671.65896: Calling all_inventory to load vars for managed-node2 44071 1727204671.65899: Calling groups_inventory to load vars for managed-node2 44071 1727204671.65902: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204671.65909: Calling all_plugins_play to load vars for managed-node2 44071 1727204671.65912: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204671.65915: Calling groups_plugins_play to load vars for managed-node2 44071 1727204671.69529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204671.75384: done with get_vars() 44071 1727204671.75540: done getting variables 44071 1727204671.75808: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:04:31 -0400 (0:00:00.290) 0:01:24.074 ***** 44071 1727204671.75840: entering _queue_task() for managed-node2/stat 44071 1727204671.76676: worker is 1 (out of 1 available) 44071 1727204671.76691: exiting _queue_task() for managed-node2/stat 44071 1727204671.76706: done queuing things up, now waiting for results queue to drain 44071 1727204671.76707: waiting for pending results... 44071 1727204671.77392: running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr 44071 1727204671.77832: in run() - task 127b8e07-fff9-c964-7471-000000001647 44071 1727204671.77847: variable 'ansible_search_path' from source: unknown 44071 1727204671.77855: variable 'ansible_search_path' from source: unknown 44071 1727204671.77939: calling self._execute() 44071 1727204671.78199: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204671.78214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204671.78283: variable 'omit' from source: magic vars 44071 1727204671.79216: variable 'ansible_distribution_major_version' from source: facts 44071 1727204671.79312: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204671.79317: variable 'omit' from source: magic vars 44071 1727204671.79452: variable 'omit' from source: magic vars 44071 1727204671.79713: variable 'interface' from source: play vars 44071 1727204671.79771: variable 'omit' from source: magic vars 44071 1727204671.79909: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204671.80018: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204671.80050: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204671.80095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204671.80140: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204671.80217: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 
1727204671.80341: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204671.80345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204671.80537: Set connection var ansible_connection to ssh 44071 1727204671.80571: Set connection var ansible_timeout to 10 44071 1727204671.80666: Set connection var ansible_pipelining to False 44071 1727204671.80669: Set connection var ansible_shell_type to sh 44071 1727204671.80674: Set connection var ansible_shell_executable to /bin/sh 44071 1727204671.80676: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204671.80740: variable 'ansible_shell_executable' from source: unknown 44071 1727204671.80749: variable 'ansible_connection' from source: unknown 44071 1727204671.80757: variable 'ansible_module_compression' from source: unknown 44071 1727204671.80764: variable 'ansible_shell_type' from source: unknown 44071 1727204671.80778: variable 'ansible_shell_executable' from source: unknown 44071 1727204671.80785: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204671.80793: variable 'ansible_pipelining' from source: unknown 44071 1727204671.80883: variable 'ansible_timeout' from source: unknown 44071 1727204671.80887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204671.81577: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204671.81582: variable 'omit' from source: magic vars 44071 1727204671.81586: starting attempt loop 44071 1727204671.81588: running the handler 44071 1727204671.81591: _low_level_execute_command(): starting 44071 1727204671.81593: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204671.83069: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204671.83227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204671.83340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204671.83600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204671.83757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204671.85521: stdout chunk (state=3): >>>/root <<< 44071 1727204671.85723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204671.85778: stderr chunk 
(state=3): >>><<< 44071 1727204671.85895: stdout chunk (state=3): >>><<< 44071 1727204671.85989: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204671.85993: _low_level_execute_command(): starting 44071 1727204671.85996: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204671.8591642-48651-260348064371218 `" && echo ansible-tmp-1727204671.8591642-48651-260348064371218="` echo /root/.ansible/tmp/ansible-tmp-1727204671.8591642-48651-260348064371218 `" ) && sleep 0' 44071 1727204671.87278: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204671.87437: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204671.87513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204671.87654: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204671.87677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204671.87821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204671.90005: stdout chunk (state=3): >>>ansible-tmp-1727204671.8591642-48651-260348064371218=/root/.ansible/tmp/ansible-tmp-1727204671.8591642-48651-260348064371218 <<< 44071 1727204671.90092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204671.90363: stderr chunk (state=3): >>><<< 
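Every one of these low-level commands rides on the same persistent OpenSSH control socket, which is what the recurring "auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320'" and "mux_client_hello_exchange" lines indicate: the connection plugin attaches to an existing master instead of negotiating a new SSH session per command. This behaviour comes from the ssh connection plugin's default ssh_args (-C -o ControlMaster=auto -o ControlPersist=60s). A hedged sketch, assuming one wanted to pin the same options explicitly as an inventory variable:

    # Sketch only: these are the documented defaults of the ssh connection
    # plugin, not values captured from this run's configuration.
    ansible_ssh_common_args: "-C -o ControlMaster=auto -o ControlPersist=60s"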
44071 1727204671.90369: stdout chunk (state=3): >>><<< 44071 1727204671.90373: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204671.8591642-48651-260348064371218=/root/.ansible/tmp/ansible-tmp-1727204671.8591642-48651-260348064371218 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204671.90376: variable 'ansible_module_compression' from source: unknown 44071 1727204671.90521: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 44071 1727204671.90717: variable 'ansible_facts' from source: unknown 44071 1727204671.90802: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204671.8591642-48651-260348064371218/AnsiballZ_stat.py 44071 1727204671.91205: Sending initial data 44071 1727204671.91208: Sent initial data (153 bytes) 44071 1727204671.92079: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204671.92232: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204671.92421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204671.92515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204671.94141: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server 
supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204671.94275: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204671.94398: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpoc9yz4ig /root/.ansible/tmp/ansible-tmp-1727204671.8591642-48651-260348064371218/AnsiballZ_stat.py <<< 44071 1727204671.94402: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204671.8591642-48651-260348064371218/AnsiballZ_stat.py" <<< 44071 1727204671.94735: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpoc9yz4ig" to remote "/root/.ansible/tmp/ansible-tmp-1727204671.8591642-48651-260348064371218/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204671.8591642-48651-260348064371218/AnsiballZ_stat.py" <<< 44071 1727204671.96764: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204671.96872: stderr chunk (state=3): >>><<< 44071 1727204671.96876: stdout chunk (state=3): >>><<< 44071 1727204671.96878: done transferring module to remote 44071 1727204671.96881: _low_level_execute_command(): starting 44071 1727204671.96883: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204671.8591642-48651-260348064371218/ /root/.ansible/tmp/ansible-tmp-1727204671.8591642-48651-260348064371218/AnsiballZ_stat.py && sleep 0' 44071 1727204671.98475: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204671.98487: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204671.98621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204671.98686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204672.00577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204672.00974: stderr 
chunk (state=3): >>><<< 44071 1727204672.00978: stdout chunk (state=3): >>><<< 44071 1727204672.00981: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204672.00989: _low_level_execute_command(): starting 44071 1727204672.00991: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204671.8591642-48651-260348064371218/AnsiballZ_stat.py && sleep 0' 44071 1727204672.02540: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204672.02775: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204672.03142: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204672.03145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204672.19334: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 37593, "dev": 23, "nlink": 1, "atime": 1727204652.0156903, "mtime": 1727204652.0156903, "ctime": 1727204652.0156903, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 
0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 44071 1727204672.20706: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204672.20711: stdout chunk (state=3): >>><<< 44071 1727204672.20713: stderr chunk (state=3): >>><<< 44071 1727204672.20738: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 37593, "dev": 23, "nlink": 1, "atime": 1727204652.0156903, "mtime": 1727204652.0156903, "ctime": 1727204652.0156903, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
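The invocation block in the result above shows exactly how the device check is performed: a stat of /sys/class/net/statebr with attribute, checksum and MIME collection disabled, resolving the symlink to /sys/devices/virtual/net/statebr. A minimal sketch of the "Get stat for interface" task reconstructed from those module_args (the registered variable name interface_stat is a hypothetical placeholder; it does not appear in this log):

    # Sketch reconstructed from the module_args logged above; "interface" is
    # the play var that resolves to "statebr" in this run.
    - name: Get stat for interface {{ interface }}
      ansible.builtin.stat:
        path: "/sys/class/net/{{ interface }}"
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: interface_stat  # hypothetical name, reused by the later assert sketch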
44071 1727204672.20802: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204671.8591642-48651-260348064371218/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204672.20813: _low_level_execute_command(): starting 44071 1727204672.20820: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204671.8591642-48651-260348064371218/ > /dev/null 2>&1 && sleep 0' 44071 1727204672.21657: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204672.22073: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204672.22078: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204672.22102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204672.24017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204672.24177: stderr chunk (state=3): >>><<< 44071 1727204672.24181: stdout chunk (state=3): >>><<< 44071 1727204672.24184: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204672.24193: handler run complete 44071 1727204672.24200: attempt loop complete, returning result 44071 1727204672.24202: _execute() done 44071 1727204672.24208: dumping result to json 44071 1727204672.24214: done dumping result, returning 44071 1727204672.24223: done running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr [127b8e07-fff9-c964-7471-000000001647] 44071 1727204672.24228: sending task result for task 127b8e07-fff9-c964-7471-000000001647 44071 1727204672.24374: done sending task result for task 127b8e07-fff9-c964-7471-000000001647 44071 1727204672.24377: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204652.0156903, "block_size": 4096, "blocks": 0, "ctime": 1727204652.0156903, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 37593, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "mode": "0777", "mtime": 1727204652.0156903, "nlink": 1, "path": "/sys/class/net/statebr", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 44071 1727204672.24602: no more pending results, returning what we have 44071 1727204672.24607: results queue empty 44071 1727204672.24608: checking for any_errors_fatal 44071 1727204672.24610: done checking for any_errors_fatal 44071 1727204672.24610: checking for max_fail_percentage 44071 1727204672.24612: done checking for max_fail_percentage 44071 1727204672.24613: checking to see if all hosts have failed and the running result is not ok 44071 1727204672.24614: done checking to see if all hosts have failed 44071 1727204672.24614: getting the remaining hosts for this loop 44071 1727204672.24616: done getting the remaining hosts for this loop 44071 1727204672.24620: getting the next task for host managed-node2 44071 1727204672.24629: done getting next task for host managed-node2 44071 1727204672.24634: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 44071 1727204672.24637: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204672.24642: getting variables 44071 1727204672.24643: in VariableManager get_vars() 44071 1727204672.24684: Calling all_inventory to load vars for managed-node2 44071 1727204672.24687: Calling groups_inventory to load vars for managed-node2 44071 1727204672.24691: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204672.24702: Calling all_plugins_play to load vars for managed-node2 44071 1727204672.24704: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204672.24707: Calling groups_plugins_play to load vars for managed-node2 44071 1727204672.26660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204672.29548: done with get_vars() 44071 1727204672.29595: done getting variables 44071 1727204672.29667: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204672.29819: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'statebr'] ************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 15:04:32 -0400 (0:00:00.540) 0:01:24.615 ***** 44071 1727204672.29859: entering _queue_task() for managed-node2/assert 44071 1727204672.30304: worker is 1 (out of 1 available) 44071 1727204672.30317: exiting _queue_task() for managed-node2/assert 44071 1727204672.30447: done queuing things up, now waiting for results queue to drain 44071 1727204672.30449: waiting for pending results... 
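For reference, a minimal sketch of the two tasks in assert_device_present.yml that this part of the log exercises, inferred from the module arguments echoed above (path /sys/class/net/statebr with attributes/checksum/mime disabled) and from the "interface_stat.stat.exists" conditional evaluated in the next task. This is a reconstruction, not the verbatim collection file; the register name and the assert message wording are assumptions:

    # assert_device_present.yml -- sketch inferred from the log, not the verbatim file
    - name: Get stat for interface {{ interface }}
      stat:
        path: "/sys/class/net/{{ interface }}"    # the log shows path=/sys/class/net/statebr
        get_attributes: false                      # matches the logged module args
        get_checksum: false
        get_mime: false
      register: interface_stat                     # assumed; later reads show 'interface_stat' (source: set_fact)

    - name: Assert that the interface is present - '{{ interface }}'
      assert:
        that:
          - interface_stat.stat.exists             # the conditional the log evaluates to True
        msg: "Interface {{ interface }} is not present"   # assumed wording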
44071 1727204672.30789: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'statebr' 44071 1727204672.30916: in run() - task 127b8e07-fff9-c964-7471-0000000015d0 44071 1727204672.30973: variable 'ansible_search_path' from source: unknown 44071 1727204672.30977: variable 'ansible_search_path' from source: unknown 44071 1727204672.31006: calling self._execute() 44071 1727204672.31144: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204672.31210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204672.31216: variable 'omit' from source: magic vars 44071 1727204672.31654: variable 'ansible_distribution_major_version' from source: facts 44071 1727204672.31678: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204672.31693: variable 'omit' from source: magic vars 44071 1727204672.31758: variable 'omit' from source: magic vars 44071 1727204672.31891: variable 'interface' from source: play vars 44071 1727204672.31976: variable 'omit' from source: magic vars 44071 1727204672.31980: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204672.32017: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204672.32049: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204672.32076: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204672.32171: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204672.32174: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204672.32178: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204672.32181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204672.32280: Set connection var ansible_connection to ssh 44071 1727204672.32298: Set connection var ansible_timeout to 10 44071 1727204672.32313: Set connection var ansible_pipelining to False 44071 1727204672.32324: Set connection var ansible_shell_type to sh 44071 1727204672.32335: Set connection var ansible_shell_executable to /bin/sh 44071 1727204672.32348: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204672.32412: variable 'ansible_shell_executable' from source: unknown 44071 1727204672.32417: variable 'ansible_connection' from source: unknown 44071 1727204672.32419: variable 'ansible_module_compression' from source: unknown 44071 1727204672.32421: variable 'ansible_shell_type' from source: unknown 44071 1727204672.32423: variable 'ansible_shell_executable' from source: unknown 44071 1727204672.32426: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204672.32428: variable 'ansible_pipelining' from source: unknown 44071 1727204672.32430: variable 'ansible_timeout' from source: unknown 44071 1727204672.32436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204672.32631: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 44071 1727204672.32638: variable 'omit' from source: magic vars 44071 1727204672.32648: starting attempt loop 44071 1727204672.32672: running the handler 44071 1727204672.32833: variable 'interface_stat' from source: set_fact 44071 1727204672.32873: Evaluated conditional (interface_stat.stat.exists): True 44071 1727204672.32957: handler run complete 44071 1727204672.32961: attempt loop complete, returning result 44071 1727204672.32963: _execute() done 44071 1727204672.32969: dumping result to json 44071 1727204672.32972: done dumping result, returning 44071 1727204672.32974: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'statebr' [127b8e07-fff9-c964-7471-0000000015d0] 44071 1727204672.32976: sending task result for task 127b8e07-fff9-c964-7471-0000000015d0 44071 1727204672.33179: done sending task result for task 127b8e07-fff9-c964-7471-0000000015d0 44071 1727204672.33182: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 44071 1727204672.33237: no more pending results, returning what we have 44071 1727204672.33241: results queue empty 44071 1727204672.33242: checking for any_errors_fatal 44071 1727204672.33258: done checking for any_errors_fatal 44071 1727204672.33259: checking for max_fail_percentage 44071 1727204672.33261: done checking for max_fail_percentage 44071 1727204672.33262: checking to see if all hosts have failed and the running result is not ok 44071 1727204672.33262: done checking to see if all hosts have failed 44071 1727204672.33263: getting the remaining hosts for this loop 44071 1727204672.33267: done getting the remaining hosts for this loop 44071 1727204672.33273: getting the next task for host managed-node2 44071 1727204672.33284: done getting next task for host managed-node2 44071 1727204672.33289: ^ task is: TASK: Include the task 'get_profile_stat.yml' 44071 1727204672.33471: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204672.33477: getting variables 44071 1727204672.33478: in VariableManager get_vars() 44071 1727204672.33514: Calling all_inventory to load vars for managed-node2 44071 1727204672.33517: Calling groups_inventory to load vars for managed-node2 44071 1727204672.33521: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204672.33532: Calling all_plugins_play to load vars for managed-node2 44071 1727204672.33535: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204672.33538: Calling groups_plugins_play to load vars for managed-node2 44071 1727204672.35526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204672.37820: done with get_vars() 44071 1727204672.37862: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Tuesday 24 September 2024 15:04:32 -0400 (0:00:00.081) 0:01:24.696 ***** 44071 1727204672.37981: entering _queue_task() for managed-node2/include_tasks 44071 1727204672.38525: worker is 1 (out of 1 available) 44071 1727204672.38540: exiting _queue_task() for managed-node2/include_tasks 44071 1727204672.38554: done queuing things up, now waiting for results queue to drain 44071 1727204672.38556: waiting for pending results... 44071 1727204672.38868: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 44071 1727204672.38968: in run() - task 127b8e07-fff9-c964-7471-0000000015d4 44071 1727204672.38990: variable 'ansible_search_path' from source: unknown 44071 1727204672.39001: variable 'ansible_search_path' from source: unknown 44071 1727204672.39047: calling self._execute() 44071 1727204672.39193: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204672.39207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204672.39272: variable 'omit' from source: magic vars 44071 1727204672.39706: variable 'ansible_distribution_major_version' from source: facts 44071 1727204672.39736: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204672.39749: _execute() done 44071 1727204672.39759: dumping result to json 44071 1727204672.39769: done dumping result, returning 44071 1727204672.39781: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [127b8e07-fff9-c964-7471-0000000015d4] 44071 1727204672.39792: sending task result for task 127b8e07-fff9-c964-7471-0000000015d4 44071 1727204672.40029: done sending task result for task 127b8e07-fff9-c964-7471-0000000015d4 44071 1727204672.40033: WORKER PROCESS EXITING 44071 1727204672.40075: no more pending results, returning what we have 44071 1727204672.40081: in VariableManager get_vars() 44071 1727204672.40133: Calling all_inventory to load vars for managed-node2 44071 1727204672.40137: Calling groups_inventory to load vars for managed-node2 44071 1727204672.40141: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204672.40381: Calling all_plugins_play to load vars for managed-node2 44071 1727204672.40385: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204672.40388: Calling groups_plugins_play to load vars for managed-node2 44071 1727204672.44382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 44071 1727204672.49176: done with get_vars() 44071 1727204672.49223: variable 'ansible_search_path' from source: unknown 44071 1727204672.49225: variable 'ansible_search_path' from source: unknown 44071 1727204672.49237: variable 'item' from source: include params 44071 1727204672.49504: variable 'item' from source: include params 44071 1727204672.49551: we have included files to process 44071 1727204672.49552: generating all_blocks data 44071 1727204672.49554: done generating all_blocks data 44071 1727204672.49562: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 44071 1727204672.49564: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 44071 1727204672.49570: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 44071 1727204672.50717: done processing included file 44071 1727204672.50720: iterating over new_blocks loaded from include file 44071 1727204672.50721: in VariableManager get_vars() 44071 1727204672.50745: done with get_vars() 44071 1727204672.50747: filtering new block on tags 44071 1727204672.50836: done filtering new block on tags 44071 1727204672.50840: in VariableManager get_vars() 44071 1727204672.50859: done with get_vars() 44071 1727204672.50861: filtering new block on tags 44071 1727204672.50933: done filtering new block on tags 44071 1727204672.50936: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 44071 1727204672.50942: extending task lists for all hosts with included blocks 44071 1727204672.51263: done extending task lists 44071 1727204672.51265: done processing included files 44071 1727204672.51267: results queue empty 44071 1727204672.51268: checking for any_errors_fatal 44071 1727204672.51272: done checking for any_errors_fatal 44071 1727204672.51272: checking for max_fail_percentage 44071 1727204672.51273: done checking for max_fail_percentage 44071 1727204672.51274: checking to see if all hosts have failed and the running result is not ok 44071 1727204672.51275: done checking to see if all hosts have failed 44071 1727204672.51276: getting the remaining hosts for this loop 44071 1727204672.51277: done getting the remaining hosts for this loop 44071 1727204672.51280: getting the next task for host managed-node2 44071 1727204672.51285: done getting next task for host managed-node2 44071 1727204672.51287: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 44071 1727204672.51290: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204672.51293: getting variables 44071 1727204672.51294: in VariableManager get_vars() 44071 1727204672.51307: Calling all_inventory to load vars for managed-node2 44071 1727204672.51309: Calling groups_inventory to load vars for managed-node2 44071 1727204672.51311: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204672.51318: Calling all_plugins_play to load vars for managed-node2 44071 1727204672.51321: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204672.51324: Calling groups_plugins_play to load vars for managed-node2 44071 1727204672.53000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204672.55186: done with get_vars() 44071 1727204672.55228: done getting variables 44071 1727204672.55285: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 15:04:32 -0400 (0:00:00.173) 0:01:24.869 ***** 44071 1727204672.55319: entering _queue_task() for managed-node2/set_fact 44071 1727204672.55903: worker is 1 (out of 1 available) 44071 1727204672.55916: exiting _queue_task() for managed-node2/set_fact 44071 1727204672.55929: done queuing things up, now waiting for results queue to drain 44071 1727204672.55931: waiting for pending results... 
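For reference, a minimal sketch of the include at assert_profile_absent.yml:3 and of the initialization task at get_profile_stat.yml:3 that has just been queued, inferred from the log (the 'item' include param and the three lsr_net_profile_* facts reported in the result a few lines below). Passing the profile through vars is an assumption:

    # assert_profile_absent.yml:3 -- sketch; 'item' appears in the log as an include param
    - name: Include the task 'get_profile_stat.yml'
      include_tasks: get_profile_stat.yml
      vars:
        profile: "{{ item }}"                  # assumed mapping of the logged 'item'

    # get_profile_stat.yml:3 -- sketch matching the ansible_facts shown in the result below
    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false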
44071 1727204672.56285: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 44071 1727204672.56304: in run() - task 127b8e07-fff9-c964-7471-000000001665 44071 1727204672.56329: variable 'ansible_search_path' from source: unknown 44071 1727204672.56338: variable 'ansible_search_path' from source: unknown 44071 1727204672.56394: calling self._execute() 44071 1727204672.56519: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204672.56532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204672.56548: variable 'omit' from source: magic vars 44071 1727204672.57017: variable 'ansible_distribution_major_version' from source: facts 44071 1727204672.57047: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204672.57138: variable 'omit' from source: magic vars 44071 1727204672.57144: variable 'omit' from source: magic vars 44071 1727204672.57247: variable 'omit' from source: magic vars 44071 1727204672.57251: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204672.57293: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204672.57322: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204672.57352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204672.57379: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204672.57414: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204672.57423: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204672.57432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204672.57555: Set connection var ansible_connection to ssh 44071 1727204672.57667: Set connection var ansible_timeout to 10 44071 1727204672.57673: Set connection var ansible_pipelining to False 44071 1727204672.57678: Set connection var ansible_shell_type to sh 44071 1727204672.57680: Set connection var ansible_shell_executable to /bin/sh 44071 1727204672.57682: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204672.57685: variable 'ansible_shell_executable' from source: unknown 44071 1727204672.57687: variable 'ansible_connection' from source: unknown 44071 1727204672.57690: variable 'ansible_module_compression' from source: unknown 44071 1727204672.57692: variable 'ansible_shell_type' from source: unknown 44071 1727204672.57695: variable 'ansible_shell_executable' from source: unknown 44071 1727204672.57697: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204672.57699: variable 'ansible_pipelining' from source: unknown 44071 1727204672.57701: variable 'ansible_timeout' from source: unknown 44071 1727204672.57703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204672.57930: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204672.57934: variable 
'omit' from source: magic vars 44071 1727204672.57937: starting attempt loop 44071 1727204672.57939: running the handler 44071 1727204672.57941: handler run complete 44071 1727204672.57973: attempt loop complete, returning result 44071 1727204672.57976: _execute() done 44071 1727204672.57979: dumping result to json 44071 1727204672.57982: done dumping result, returning 44071 1727204672.58042: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [127b8e07-fff9-c964-7471-000000001665] 44071 1727204672.58046: sending task result for task 127b8e07-fff9-c964-7471-000000001665 ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 44071 1727204672.58207: no more pending results, returning what we have 44071 1727204672.58211: results queue empty 44071 1727204672.58213: checking for any_errors_fatal 44071 1727204672.58214: done checking for any_errors_fatal 44071 1727204672.58215: checking for max_fail_percentage 44071 1727204672.58217: done checking for max_fail_percentage 44071 1727204672.58218: checking to see if all hosts have failed and the running result is not ok 44071 1727204672.58219: done checking to see if all hosts have failed 44071 1727204672.58220: getting the remaining hosts for this loop 44071 1727204672.58221: done getting the remaining hosts for this loop 44071 1727204672.58227: getting the next task for host managed-node2 44071 1727204672.58239: done getting next task for host managed-node2 44071 1727204672.58242: ^ task is: TASK: Stat profile file 44071 1727204672.58250: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204672.58255: getting variables 44071 1727204672.58257: in VariableManager get_vars() 44071 1727204672.58304: Calling all_inventory to load vars for managed-node2 44071 1727204672.58307: Calling groups_inventory to load vars for managed-node2 44071 1727204672.58311: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204672.58325: Calling all_plugins_play to load vars for managed-node2 44071 1727204672.58329: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204672.58332: Calling groups_plugins_play to load vars for managed-node2 44071 1727204672.58983: done sending task result for task 127b8e07-fff9-c964-7471-000000001665 44071 1727204672.58987: WORKER PROCESS EXITING 44071 1727204672.60895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204672.64195: done with get_vars() 44071 1727204672.64240: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 15:04:32 -0400 (0:00:00.090) 0:01:24.960 ***** 44071 1727204672.64360: entering _queue_task() for managed-node2/stat 44071 1727204672.64877: worker is 1 (out of 1 available) 44071 1727204672.64891: exiting _queue_task() for managed-node2/stat 44071 1727204672.64907: done queuing things up, now waiting for results queue to drain 44071 1727204672.64908: waiting for pending results... 44071 1727204672.65431: running TaskExecutor() for managed-node2/TASK: Stat profile file 44071 1727204672.65770: in run() - task 127b8e07-fff9-c964-7471-000000001666 44071 1727204672.65786: variable 'ansible_search_path' from source: unknown 44071 1727204672.65795: variable 'ansible_search_path' from source: unknown 44071 1727204672.65843: calling self._execute() 44071 1727204672.66270: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204672.66594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204672.66598: variable 'omit' from source: magic vars 44071 1727204672.67579: variable 'ansible_distribution_major_version' from source: facts 44071 1727204672.67607: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204672.67623: variable 'omit' from source: magic vars 44071 1727204672.67873: variable 'omit' from source: magic vars 44071 1727204672.68005: variable 'profile' from source: play vars 44071 1727204672.68034: variable 'interface' from source: play vars 44071 1727204672.68213: variable 'interface' from source: play vars 44071 1727204672.68463: variable 'omit' from source: magic vars 44071 1727204672.68469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204672.68473: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204672.68585: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204672.68612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204672.68631: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204672.68673: variable 'inventory_hostname' from source: host vars for 
'managed-node2' 44071 1727204672.68688: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204672.68791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204672.69226: Set connection var ansible_connection to ssh 44071 1727204672.69229: Set connection var ansible_timeout to 10 44071 1727204672.69231: Set connection var ansible_pipelining to False 44071 1727204672.69233: Set connection var ansible_shell_type to sh 44071 1727204672.69235: Set connection var ansible_shell_executable to /bin/sh 44071 1727204672.69238: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204672.69239: variable 'ansible_shell_executable' from source: unknown 44071 1727204672.69241: variable 'ansible_connection' from source: unknown 44071 1727204672.69243: variable 'ansible_module_compression' from source: unknown 44071 1727204672.69245: variable 'ansible_shell_type' from source: unknown 44071 1727204672.69247: variable 'ansible_shell_executable' from source: unknown 44071 1727204672.69248: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204672.69250: variable 'ansible_pipelining' from source: unknown 44071 1727204672.69252: variable 'ansible_timeout' from source: unknown 44071 1727204672.69254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204672.69789: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204672.69811: variable 'omit' from source: magic vars 44071 1727204672.69822: starting attempt loop 44071 1727204672.69830: running the handler 44071 1727204672.69851: _low_level_execute_command(): starting 44071 1727204672.69883: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204672.71360: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204672.71384: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204672.71572: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204672.71642: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204672.71771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204672.73530: stdout chunk (state=3): >>>/root <<< 44071 1727204672.73734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 
1727204672.73749: stdout chunk (state=3): >>><<< 44071 1727204672.73767: stderr chunk (state=3): >>><<< 44071 1727204672.73793: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204672.73818: _low_level_execute_command(): starting 44071 1727204672.73830: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204672.7380152-48736-112476756411817 `" && echo ansible-tmp-1727204672.7380152-48736-112476756411817="` echo /root/.ansible/tmp/ansible-tmp-1727204672.7380152-48736-112476756411817 `" ) && sleep 0' 44071 1727204672.74480: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204672.74494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204672.74511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204672.74532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204672.74550: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204672.74628: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204672.74644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204672.74682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204672.74700: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204672.74726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204672.74841: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204672.76863: 
stdout chunk (state=3): >>>ansible-tmp-1727204672.7380152-48736-112476756411817=/root/.ansible/tmp/ansible-tmp-1727204672.7380152-48736-112476756411817 <<< 44071 1727204672.77401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204672.77404: stdout chunk (state=3): >>><<< 44071 1727204672.77407: stderr chunk (state=3): >>><<< 44071 1727204672.77410: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204672.7380152-48736-112476756411817=/root/.ansible/tmp/ansible-tmp-1727204672.7380152-48736-112476756411817 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204672.77413: variable 'ansible_module_compression' from source: unknown 44071 1727204672.77415: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 44071 1727204672.77418: variable 'ansible_facts' from source: unknown 44071 1727204672.77457: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204672.7380152-48736-112476756411817/AnsiballZ_stat.py 44071 1727204672.77614: Sending initial data 44071 1727204672.77618: Sent initial data (153 bytes) 44071 1727204672.78477: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204672.78506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204672.78626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 44071 1727204672.80302: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204672.80358: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204672.80455: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmptylnf4uc /root/.ansible/tmp/ansible-tmp-1727204672.7380152-48736-112476756411817/AnsiballZ_stat.py <<< 44071 1727204672.80464: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204672.7380152-48736-112476756411817/AnsiballZ_stat.py" <<< 44071 1727204672.80583: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmptylnf4uc" to remote "/root/.ansible/tmp/ansible-tmp-1727204672.7380152-48736-112476756411817/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204672.7380152-48736-112476756411817/AnsiballZ_stat.py" <<< 44071 1727204672.82348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204672.82386: stderr chunk (state=3): >>><<< 44071 1727204672.82390: stdout chunk (state=3): >>><<< 44071 1727204672.82425: done transferring module to remote 44071 1727204672.82534: _low_level_execute_command(): starting 44071 1727204672.82538: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204672.7380152-48736-112476756411817/ /root/.ansible/tmp/ansible-tmp-1727204672.7380152-48736-112476756411817/AnsiballZ_stat.py && sleep 0' 44071 1727204672.83774: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204672.83973: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204672.83982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 <<< 44071 1727204672.84045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204672.85902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204672.86089: stderr chunk (state=3): >>><<< 44071 1727204672.86093: stdout chunk (state=3): >>><<< 44071 1727204672.86122: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204672.86126: _low_level_execute_command(): starting 44071 1727204672.86128: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204672.7380152-48736-112476756411817/AnsiballZ_stat.py && sleep 0' 44071 1727204672.87596: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204672.87602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44071 1727204672.87618: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204672.87698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204672.87751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204672.87754: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204672.87875: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204672.87929: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204672.88072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204673.04426: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": 
false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 44071 1727204673.05761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204673.05770: stderr chunk (state=3): >>><<< 44071 1727204673.05772: stdout chunk (state=3): >>><<< 44071 1727204673.05973: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
44071 1727204673.05977: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204672.7380152-48736-112476756411817/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204673.05982: _low_level_execute_command(): starting 44071 1727204673.05984: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204672.7380152-48736-112476756411817/ > /dev/null 2>&1 && sleep 0' 44071 1727204673.06867: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204673.07041: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204673.07115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204673.09563: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204673.09595: stdout chunk (state=3): >>><<< 44071 1727204673.09598: stderr chunk (state=3): >>><<< 44071 1727204673.09601: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204673.09604: handler run complete 44071 1727204673.09606: attempt loop complete, returning result 44071 1727204673.09608: _execute() done 44071 1727204673.09610: dumping result to json 44071 1727204673.09613: done dumping result, returning 44071 1727204673.09618: done running TaskExecutor() for managed-node2/TASK: Stat profile file [127b8e07-fff9-c964-7471-000000001666] 44071 1727204673.09621: sending task result for task 127b8e07-fff9-c964-7471-000000001666 ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 44071 1727204673.09907: no more pending results, returning what we have 44071 1727204673.09912: results queue empty 44071 1727204673.09912: checking for any_errors_fatal 44071 1727204673.09922: done checking for any_errors_fatal 44071 1727204673.09923: checking for max_fail_percentage 44071 1727204673.09925: done checking for max_fail_percentage 44071 1727204673.09926: checking to see if all hosts have failed and the running result is not ok 44071 1727204673.09926: done checking to see if all hosts have failed 44071 1727204673.09927: getting the remaining hosts for this loop 44071 1727204673.09929: done getting the remaining hosts for this loop 44071 1727204673.09936: getting the next task for host managed-node2 44071 1727204673.09945: done getting next task for host managed-node2 44071 1727204673.09948: ^ task is: TASK: Set NM profile exist flag based on the profile files 44071 1727204673.09954: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204673.09958: getting variables 44071 1727204673.09960: in VariableManager get_vars() 44071 1727204673.10003: Calling all_inventory to load vars for managed-node2 44071 1727204673.10006: Calling groups_inventory to load vars for managed-node2 44071 1727204673.10009: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204673.10023: Calling all_plugins_play to load vars for managed-node2 44071 1727204673.10026: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204673.10029: Calling groups_plugins_play to load vars for managed-node2 44071 1727204673.10626: done sending task result for task 127b8e07-fff9-c964-7471-000000001666 44071 1727204673.10633: WORKER PROCESS EXITING 44071 1727204673.14882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204673.18103: done with get_vars() 44071 1727204673.18146: done getting variables 44071 1727204673.18224: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:04:33 -0400 (0:00:00.539) 0:01:25.499 ***** 44071 1727204673.18269: entering _queue_task() for managed-node2/set_fact 44071 1727204673.18711: worker is 1 (out of 1 available) 44071 1727204673.18725: exiting _queue_task() for managed-node2/set_fact 44071 1727204673.18740: done queuing things up, now waiting for results queue to drain 44071 1727204673.18743: waiting for pending results... 
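For reference, a minimal sketch of the "Set NM profile exist flag based on the profile files" task at get_profile_stat.yml:17 that has just been queued. The 'when' clause matches the false_condition the next result reports; the fact name and value are assumptions based on the initialization flags shown earlier:

    # get_profile_stat.yml:17 -- sketch; skipped below because profile_stat.stat.exists is false
    - name: Set NM profile exist flag based on the profile files
      set_fact:
        lsr_net_profile_exists: true           # assumed value
      when: profile_stat.stat.exists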
44071 1727204673.19093: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 44071 1727204673.19262: in run() - task 127b8e07-fff9-c964-7471-000000001667 44071 1727204673.19287: variable 'ansible_search_path' from source: unknown 44071 1727204673.19296: variable 'ansible_search_path' from source: unknown 44071 1727204673.19346: calling self._execute() 44071 1727204673.19460: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204673.19537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204673.19542: variable 'omit' from source: magic vars 44071 1727204673.19957: variable 'ansible_distribution_major_version' from source: facts 44071 1727204673.19983: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204673.20126: variable 'profile_stat' from source: set_fact 44071 1727204673.20148: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204673.20160: when evaluation is False, skipping this task 44071 1727204673.20200: _execute() done 44071 1727204673.20203: dumping result to json 44071 1727204673.20206: done dumping result, returning 44071 1727204673.20208: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [127b8e07-fff9-c964-7471-000000001667] 44071 1727204673.20211: sending task result for task 127b8e07-fff9-c964-7471-000000001667 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204673.20522: no more pending results, returning what we have 44071 1727204673.20528: results queue empty 44071 1727204673.20529: checking for any_errors_fatal 44071 1727204673.20544: done checking for any_errors_fatal 44071 1727204673.20545: checking for max_fail_percentage 44071 1727204673.20547: done checking for max_fail_percentage 44071 1727204673.20548: checking to see if all hosts have failed and the running result is not ok 44071 1727204673.20549: done checking to see if all hosts have failed 44071 1727204673.20549: getting the remaining hosts for this loop 44071 1727204673.20551: done getting the remaining hosts for this loop 44071 1727204673.20557: getting the next task for host managed-node2 44071 1727204673.20569: done getting next task for host managed-node2 44071 1727204673.20572: ^ task is: TASK: Get NM profile info 44071 1727204673.20580: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 44071 1727204673.20587: getting variables 44071 1727204673.20588: in VariableManager get_vars() 44071 1727204673.20632: Calling all_inventory to load vars for managed-node2 44071 1727204673.20635: Calling groups_inventory to load vars for managed-node2 44071 1727204673.20640: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204673.20658: Calling all_plugins_play to load vars for managed-node2 44071 1727204673.20662: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204673.20873: Calling groups_plugins_play to load vars for managed-node2 44071 1727204673.21585: done sending task result for task 127b8e07-fff9-c964-7471-000000001667 44071 1727204673.21589: WORKER PROCESS EXITING 44071 1727204673.22931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204673.26797: done with get_vars() 44071 1727204673.26853: done getting variables 44071 1727204673.27125: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:04:33 -0400 (0:00:00.088) 0:01:25.588 ***** 44071 1727204673.27165: entering _queue_task() for managed-node2/shell 44071 1727204673.27793: worker is 1 (out of 1 available) 44071 1727204673.27810: exiting _queue_task() for managed-node2/shell 44071 1727204673.27826: done queuing things up, now waiting for results queue to drain 44071 1727204673.27829: waiting for pending results... 
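The shell task queued here is tightly constrained by the trace that follows: the expanded command is nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc (statebr coming from the profile/interface play vars), the non-zero exit is reported as "...ignoring", and the next task's conditional reads nm_profile_exists.rc, so the output is evidently registered under that name. A sketch under those assumptions:

    - name: Get NM profile info
      shell: nmcli -f NAME,FILENAME connection show |grep {{ profile }} | grep /etc
      register: nm_profile_exists         # name inferred from the conditional used by the next task
      ignore_errors: true                 # the run below exits rc=1 and is reported as "...ignoring"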
44071 1727204673.28191: running TaskExecutor() for managed-node2/TASK: Get NM profile info 44071 1727204673.28359: in run() - task 127b8e07-fff9-c964-7471-000000001668 44071 1727204673.28385: variable 'ansible_search_path' from source: unknown 44071 1727204673.28395: variable 'ansible_search_path' from source: unknown 44071 1727204673.28445: calling self._execute() 44071 1727204673.28559: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204673.28573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204673.28586: variable 'omit' from source: magic vars 44071 1727204673.29050: variable 'ansible_distribution_major_version' from source: facts 44071 1727204673.29078: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204673.29090: variable 'omit' from source: magic vars 44071 1727204673.29150: variable 'omit' from source: magic vars 44071 1727204673.29274: variable 'profile' from source: play vars 44071 1727204673.29292: variable 'interface' from source: play vars 44071 1727204673.29381: variable 'interface' from source: play vars 44071 1727204673.29415: variable 'omit' from source: magic vars 44071 1727204673.29476: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204673.29537: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204673.29574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204673.29604: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204673.29629: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204673.29671: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204673.29681: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204673.29689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204673.29807: Set connection var ansible_connection to ssh 44071 1727204673.29821: Set connection var ansible_timeout to 10 44071 1727204673.29845: Set connection var ansible_pipelining to False 44071 1727204673.29870: Set connection var ansible_shell_type to sh 44071 1727204673.29873: Set connection var ansible_shell_executable to /bin/sh 44071 1727204673.29886: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204673.29941: variable 'ansible_shell_executable' from source: unknown 44071 1727204673.29945: variable 'ansible_connection' from source: unknown 44071 1727204673.29948: variable 'ansible_module_compression' from source: unknown 44071 1727204673.29950: variable 'ansible_shell_type' from source: unknown 44071 1727204673.29953: variable 'ansible_shell_executable' from source: unknown 44071 1727204673.29956: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204673.30049: variable 'ansible_pipelining' from source: unknown 44071 1727204673.30053: variable 'ansible_timeout' from source: unknown 44071 1727204673.30058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204673.30174: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204673.30196: variable 'omit' from source: magic vars 44071 1727204673.30211: starting attempt loop 44071 1727204673.30220: running the handler 44071 1727204673.30241: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204673.30282: _low_level_execute_command(): starting 44071 1727204673.30302: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204673.31161: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204673.31207: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204673.31251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204673.31474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204673.33125: stdout chunk (state=3): >>>/root <<< 44071 1727204673.33346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204673.33350: stdout chunk (state=3): >>><<< 44071 1727204673.33352: stderr chunk (state=3): >>><<< 44071 1727204673.33386: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204673.33501: _low_level_execute_command(): starting 44071 1727204673.33506: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204673.3339503-48787-88034262628992 `" && echo ansible-tmp-1727204673.3339503-48787-88034262628992="` echo /root/.ansible/tmp/ansible-tmp-1727204673.3339503-48787-88034262628992 `" ) && sleep 0' 44071 1727204673.34119: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204673.34133: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204673.34156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204673.34284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204673.34310: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204673.34418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204673.36628: stdout chunk (state=3): >>>ansible-tmp-1727204673.3339503-48787-88034262628992=/root/.ansible/tmp/ansible-tmp-1727204673.3339503-48787-88034262628992 <<< 44071 1727204673.36719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204673.36757: stderr chunk (state=3): >>><<< 44071 1727204673.36768: stdout chunk (state=3): >>><<< 44071 1727204673.36795: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204673.3339503-48787-88034262628992=/root/.ansible/tmp/ansible-tmp-1727204673.3339503-48787-88034262628992 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204673.36846: variable 'ansible_module_compression' from source: unknown 44071 1727204673.37029: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44071 1727204673.37173: variable 'ansible_facts' from source: unknown 44071 1727204673.37329: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204673.3339503-48787-88034262628992/AnsiballZ_command.py 44071 1727204673.37536: Sending initial data 44071 1727204673.37544: Sent initial data (155 bytes) 44071 1727204673.38263: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204673.38298: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204673.38348: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204673.38421: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204673.38462: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204673.38569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204673.40341: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204673.40377: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204673.40458: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmplx8bu46h /root/.ansible/tmp/ansible-tmp-1727204673.3339503-48787-88034262628992/AnsiballZ_command.py <<< 44071 1727204673.40462: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204673.3339503-48787-88034262628992/AnsiballZ_command.py" <<< 44071 1727204673.40581: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmplx8bu46h" to remote "/root/.ansible/tmp/ansible-tmp-1727204673.3339503-48787-88034262628992/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204673.3339503-48787-88034262628992/AnsiballZ_command.py" <<< 44071 1727204673.41432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204673.41519: stderr chunk (state=3): >>><<< 44071 1727204673.41533: stdout chunk (state=3): >>><<< 44071 1727204673.41570: done transferring module to remote 44071 1727204673.41588: _low_level_execute_command(): starting 44071 1727204673.41599: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204673.3339503-48787-88034262628992/ /root/.ansible/tmp/ansible-tmp-1727204673.3339503-48787-88034262628992/AnsiballZ_command.py && sleep 0' 44071 1727204673.42403: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204673.42454: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204673.42473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204673.42628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204673.42827: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204673.43108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204673.44812: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204673.44818: stdout chunk (state=3): >>><<< 44071 1727204673.44824: stderr chunk (state=3): >>><<< 44071 1727204673.44853: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204673.44858: _low_level_execute_command(): starting 44071 1727204673.44861: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204673.3339503-48787-88034262628992/AnsiballZ_command.py && sleep 0' 44071 1727204673.45589: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204673.45598: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204673.45610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204673.45672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204673.45676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204673.45679: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204673.45681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204673.45684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204673.45750: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204673.45783: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204673.45843: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204673.45850: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204673.45916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204673.64431: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-24 15:04:33.623382", "end": "2024-09-24 15:04:33.641639", "delta": "0:00:00.018257", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": 
null}}} <<< 44071 1727204673.66117: stderr chunk (state=3): >>>debug2: Received exit status from master 1 <<< 44071 1727204673.66259: stderr chunk (state=3): >>>Shared connection to 10.31.47.73 closed. <<< 44071 1727204673.66360: stderr chunk (state=3): >>><<< 44071 1727204673.66455: stdout chunk (state=3): >>><<< 44071 1727204673.66600: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-24 15:04:33.623382", "end": "2024-09-24 15:04:33.641639", "delta": "0:00:00.018257", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.47.73 closed. 
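The module result captured above is what lands in the registered variable (assumed to be nm_profile_exists, as noted earlier). Rendered as YAML, the fields that matter downstream look roughly like this; rc, cmd, stdout, stderr, failed and msg are taken straight from the trace, everything else is trimmed:

    nm_profile_exists:
      changed: true
      cmd: nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc
      rc: 1
      stdout: ""
      stderr: ""
      failed: true                        # recorded even though the failure is ignored
      msg: non-zero return code

Because rc is 1, the nm_profile_exists.rc == 0 check in the following task evaluates to False and that set_fact is skipped.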
44071 1727204673.66604: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204673.3339503-48787-88034262628992/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204673.66608: _low_level_execute_command(): starting 44071 1727204673.66658: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204673.3339503-48787-88034262628992/ > /dev/null 2>&1 && sleep 0' 44071 1727204673.68309: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204673.68355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204673.68410: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204673.68436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204673.68541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204673.70615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204673.70620: stdout chunk (state=3): >>><<< 44071 1727204673.70623: stderr chunk (state=3): >>><<< 44071 1727204673.70743: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204673.70749: handler run complete 44071 1727204673.70752: Evaluated conditional (False): False 44071 1727204673.70756: attempt loop complete, returning result 44071 1727204673.70759: _execute() done 44071 1727204673.70761: dumping result to json 44071 1727204673.70764: done dumping result, returning 44071 1727204673.70769: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [127b8e07-fff9-c964-7471-000000001668] 44071 1727204673.70772: sending task result for task 127b8e07-fff9-c964-7471-000000001668 44071 1727204673.70941: done sending task result for task 127b8e07-fff9-c964-7471-000000001668 44071 1727204673.71168: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.018257", "end": "2024-09-24 15:04:33.641639", "rc": 1, "start": "2024-09-24 15:04:33.623382" } MSG: non-zero return code ...ignoring 44071 1727204673.71262: no more pending results, returning what we have 44071 1727204673.71268: results queue empty 44071 1727204673.71269: checking for any_errors_fatal 44071 1727204673.71275: done checking for any_errors_fatal 44071 1727204673.71276: checking for max_fail_percentage 44071 1727204673.71278: done checking for max_fail_percentage 44071 1727204673.71279: checking to see if all hosts have failed and the running result is not ok 44071 1727204673.71279: done checking to see if all hosts have failed 44071 1727204673.71280: getting the remaining hosts for this loop 44071 1727204673.71282: done getting the remaining hosts for this loop 44071 1727204673.71286: getting the next task for host managed-node2 44071 1727204673.71295: done getting next task for host managed-node2 44071 1727204673.71298: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 44071 1727204673.71305: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204673.71312: getting variables 44071 1727204673.71314: in VariableManager get_vars() 44071 1727204673.71352: Calling all_inventory to load vars for managed-node2 44071 1727204673.71356: Calling groups_inventory to load vars for managed-node2 44071 1727204673.71359: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204673.71464: Calling all_plugins_play to load vars for managed-node2 44071 1727204673.71472: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204673.71484: Calling groups_plugins_play to load vars for managed-node2 44071 1727204673.73941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204673.77557: done with get_vars() 44071 1727204673.77722: done getting variables 44071 1727204673.77918: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 15:04:33 -0400 (0:00:00.507) 0:01:26.096 ***** 44071 1727204673.77955: entering _queue_task() for managed-node2/set_fact 44071 1727204673.78801: worker is 1 (out of 1 available) 44071 1727204673.78818: exiting _queue_task() for managed-node2/set_fact 44071 1727204673.79468: done queuing things up, now waiting for results queue to drain 44071 1727204673.79471: waiting for pending results... 
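The trace below skips this set_fact because nm_profile_exists.rc == 0 is False. A sketch of the task; the two flag names are assumptions, only the action and the condition are confirmed by the trace:

    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      set_fact:
        lsr_net_profile_exists: true              # assumed variable names
        lsr_net_profile_ansible_managed: true
      when: nm_profile_exists.rc == 0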
44071 1727204673.79812: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 44071 1727204673.80239: in run() - task 127b8e07-fff9-c964-7471-000000001669 44071 1727204673.80252: variable 'ansible_search_path' from source: unknown 44071 1727204673.80257: variable 'ansible_search_path' from source: unknown 44071 1727204673.80410: calling self._execute() 44071 1727204673.80625: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204673.80630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204673.80636: variable 'omit' from source: magic vars 44071 1727204673.81067: variable 'ansible_distribution_major_version' from source: facts 44071 1727204673.81089: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204673.81248: variable 'nm_profile_exists' from source: set_fact 44071 1727204673.81275: Evaluated conditional (nm_profile_exists.rc == 0): False 44071 1727204673.81284: when evaluation is False, skipping this task 44071 1727204673.81292: _execute() done 44071 1727204673.81300: dumping result to json 44071 1727204673.81307: done dumping result, returning 44071 1727204673.81319: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [127b8e07-fff9-c964-7471-000000001669] 44071 1727204673.81330: sending task result for task 127b8e07-fff9-c964-7471-000000001669 skipping: [managed-node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 44071 1727204673.81519: no more pending results, returning what we have 44071 1727204673.81523: results queue empty 44071 1727204673.81524: checking for any_errors_fatal 44071 1727204673.81537: done checking for any_errors_fatal 44071 1727204673.81538: checking for max_fail_percentage 44071 1727204673.81540: done checking for max_fail_percentage 44071 1727204673.81541: checking to see if all hosts have failed and the running result is not ok 44071 1727204673.81542: done checking to see if all hosts have failed 44071 1727204673.81542: getting the remaining hosts for this loop 44071 1727204673.81544: done getting the remaining hosts for this loop 44071 1727204673.81549: getting the next task for host managed-node2 44071 1727204673.81563: done getting next task for host managed-node2 44071 1727204673.81567: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 44071 1727204673.81574: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204673.81578: getting variables 44071 1727204673.81580: in VariableManager get_vars() 44071 1727204673.81623: Calling all_inventory to load vars for managed-node2 44071 1727204673.81626: Calling groups_inventory to load vars for managed-node2 44071 1727204673.81630: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204673.81644: Calling all_plugins_play to load vars for managed-node2 44071 1727204673.81647: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204673.81650: Calling groups_plugins_play to load vars for managed-node2 44071 1727204673.82447: done sending task result for task 127b8e07-fff9-c964-7471-000000001669 44071 1727204673.82452: WORKER PROCESS EXITING 44071 1727204673.85476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204673.89844: done with get_vars() 44071 1727204673.89894: done getting variables 44071 1727204673.89971: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204673.90107: variable 'profile' from source: play vars 44071 1727204673.90111: variable 'interface' from source: play vars 44071 1727204673.90182: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 15:04:33 -0400 (0:00:00.122) 0:01:26.218 ***** 44071 1727204673.90217: entering _queue_task() for managed-node2/command 44071 1727204673.90755: worker is 1 (out of 1 available) 44071 1727204673.90774: exiting _queue_task() for managed-node2/command 44071 1727204673.90791: done queuing things up, now waiting for results queue to drain 44071 1727204673.90793: waiting for pending results... 
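The command task queued here, 'Get the ansible_managed comment in ifcfg-{{ profile }}', is skipped below because profile_stat.stat.exists is False. A plausible sketch, assuming the profile file lives under /etc/sysconfig/network-scripts and the marker text is '# Ansible managed' (neither the path nor the marker appears in the trace):

    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      command: grep "^# Ansible managed" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
      register: ansible_managed_comment   # assumed register name
      when: profile_stat.stat.exists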
44071 1727204673.91122: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-statebr 44071 1727204673.91285: in run() - task 127b8e07-fff9-c964-7471-00000000166b 44071 1727204673.91303: variable 'ansible_search_path' from source: unknown 44071 1727204673.91307: variable 'ansible_search_path' from source: unknown 44071 1727204673.91352: calling self._execute() 44071 1727204673.91463: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204673.91470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204673.91485: variable 'omit' from source: magic vars 44071 1727204673.91945: variable 'ansible_distribution_major_version' from source: facts 44071 1727204673.91978: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204673.92105: variable 'profile_stat' from source: set_fact 44071 1727204673.92139: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204673.92143: when evaluation is False, skipping this task 44071 1727204673.92147: _execute() done 44071 1727204673.92151: dumping result to json 44071 1727204673.92154: done dumping result, returning 44071 1727204673.92190: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-statebr [127b8e07-fff9-c964-7471-00000000166b] 44071 1727204673.92194: sending task result for task 127b8e07-fff9-c964-7471-00000000166b 44071 1727204673.92417: done sending task result for task 127b8e07-fff9-c964-7471-00000000166b 44071 1727204673.92421: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204673.92482: no more pending results, returning what we have 44071 1727204673.92486: results queue empty 44071 1727204673.92487: checking for any_errors_fatal 44071 1727204673.92495: done checking for any_errors_fatal 44071 1727204673.92496: checking for max_fail_percentage 44071 1727204673.92498: done checking for max_fail_percentage 44071 1727204673.92498: checking to see if all hosts have failed and the running result is not ok 44071 1727204673.92499: done checking to see if all hosts have failed 44071 1727204673.92500: getting the remaining hosts for this loop 44071 1727204673.92502: done getting the remaining hosts for this loop 44071 1727204673.92507: getting the next task for host managed-node2 44071 1727204673.92516: done getting next task for host managed-node2 44071 1727204673.92519: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 44071 1727204673.92525: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204673.92530: getting variables 44071 1727204673.92534: in VariableManager get_vars() 44071 1727204673.92577: Calling all_inventory to load vars for managed-node2 44071 1727204673.92581: Calling groups_inventory to load vars for managed-node2 44071 1727204673.92585: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204673.92599: Calling all_plugins_play to load vars for managed-node2 44071 1727204673.92603: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204673.92606: Calling groups_plugins_play to load vars for managed-node2 44071 1727204673.96185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204674.00834: done with get_vars() 44071 1727204674.00884: done getting variables 44071 1727204674.00959: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204674.01095: variable 'profile' from source: play vars 44071 1727204674.01099: variable 'interface' from source: play vars 44071 1727204674.01203: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 15:04:34 -0400 (0:00:00.110) 0:01:26.328 ***** 44071 1727204674.01267: entering _queue_task() for managed-node2/set_fact 44071 1727204674.01893: worker is 1 (out of 1 available) 44071 1727204674.01909: exiting _queue_task() for managed-node2/set_fact 44071 1727204674.01923: done queuing things up, now waiting for results queue to drain 44071 1727204674.01925: waiting for pending results... 
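Its companion set_fact, 'Verify the ansible_managed comment in ifcfg-{{ profile }}', is skipped for the same reason. A minimal sketch under the same assumptions as the previous block:

    - name: Verify the ansible_managed comment in ifcfg-{{ profile }}
      set_fact:
        lsr_net_profile_ansible_managed: "{{ ansible_managed_comment.rc == 0 }}"   # assumed names
      when: profile_stat.stat.exists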
44071 1727204674.02485: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-statebr 44071 1727204674.03114: in run() - task 127b8e07-fff9-c964-7471-00000000166c 44071 1727204674.03120: variable 'ansible_search_path' from source: unknown 44071 1727204674.03124: variable 'ansible_search_path' from source: unknown 44071 1727204674.03127: calling self._execute() 44071 1727204674.03223: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204674.03226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204674.03228: variable 'omit' from source: magic vars 44071 1727204674.03890: variable 'ansible_distribution_major_version' from source: facts 44071 1727204674.03895: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204674.04054: variable 'profile_stat' from source: set_fact 44071 1727204674.04073: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204674.04077: when evaluation is False, skipping this task 44071 1727204674.04109: _execute() done 44071 1727204674.04113: dumping result to json 44071 1727204674.04116: done dumping result, returning 44071 1727204674.04119: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [127b8e07-fff9-c964-7471-00000000166c] 44071 1727204674.04121: sending task result for task 127b8e07-fff9-c964-7471-00000000166c skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204674.04560: no more pending results, returning what we have 44071 1727204674.04564: results queue empty 44071 1727204674.04568: checking for any_errors_fatal 44071 1727204674.04575: done checking for any_errors_fatal 44071 1727204674.04576: checking for max_fail_percentage 44071 1727204674.04577: done checking for max_fail_percentage 44071 1727204674.04578: checking to see if all hosts have failed and the running result is not ok 44071 1727204674.04579: done checking to see if all hosts have failed 44071 1727204674.04580: getting the remaining hosts for this loop 44071 1727204674.04581: done getting the remaining hosts for this loop 44071 1727204674.04585: getting the next task for host managed-node2 44071 1727204674.04594: done getting next task for host managed-node2 44071 1727204674.04597: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 44071 1727204674.04603: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204674.04607: getting variables 44071 1727204674.04608: in VariableManager get_vars() 44071 1727204674.04644: Calling all_inventory to load vars for managed-node2 44071 1727204674.04647: Calling groups_inventory to load vars for managed-node2 44071 1727204674.04650: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204674.04663: Calling all_plugins_play to load vars for managed-node2 44071 1727204674.04711: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204674.04718: done sending task result for task 127b8e07-fff9-c964-7471-00000000166c 44071 1727204674.04722: WORKER PROCESS EXITING 44071 1727204674.04742: Calling groups_plugins_play to load vars for managed-node2 44071 1727204674.06739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204674.09321: done with get_vars() 44071 1727204674.09369: done getting variables 44071 1727204674.09448: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204674.09584: variable 'profile' from source: play vars 44071 1727204674.09589: variable 'interface' from source: play vars 44071 1727204674.09661: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 15:04:34 -0400 (0:00:00.084) 0:01:26.413 ***** 44071 1727204674.09698: entering _queue_task() for managed-node2/command 44071 1727204674.10318: worker is 1 (out of 1 available) 44071 1727204674.10332: exiting _queue_task() for managed-node2/command 44071 1727204674.10345: done queuing things up, now waiting for results queue to drain 44071 1727204674.10348: waiting for pending results... 
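The fingerprint checks mirror the ansible_managed pair. The command task queued here is again skipped on profile_stat.stat.exists; a sketch, with the marker string and file path assumed rather than taken from the trace:

    - name: Get the fingerprint comment in ifcfg-{{ profile }}
      command: grep "^# system_role:network" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
      register: fingerprint_comment       # assumed register name and marker text
      when: profile_stat.stat.exists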
44071 1727204674.10625: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-statebr 44071 1727204674.10792: in run() - task 127b8e07-fff9-c964-7471-00000000166d 44071 1727204674.10824: variable 'ansible_search_path' from source: unknown 44071 1727204674.10827: variable 'ansible_search_path' from source: unknown 44071 1727204674.10851: calling self._execute() 44071 1727204674.10971: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204674.11008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204674.11012: variable 'omit' from source: magic vars 44071 1727204674.11463: variable 'ansible_distribution_major_version' from source: facts 44071 1727204674.11584: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204674.11635: variable 'profile_stat' from source: set_fact 44071 1727204674.11653: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204674.11661: when evaluation is False, skipping this task 44071 1727204674.11670: _execute() done 44071 1727204674.11679: dumping result to json 44071 1727204674.11692: done dumping result, returning 44071 1727204674.11707: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-statebr [127b8e07-fff9-c964-7471-00000000166d] 44071 1727204674.11717: sending task result for task 127b8e07-fff9-c964-7471-00000000166d skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204674.11969: no more pending results, returning what we have 44071 1727204674.11974: results queue empty 44071 1727204674.11976: checking for any_errors_fatal 44071 1727204674.11985: done checking for any_errors_fatal 44071 1727204674.11986: checking for max_fail_percentage 44071 1727204674.11988: done checking for max_fail_percentage 44071 1727204674.11989: checking to see if all hosts have failed and the running result is not ok 44071 1727204674.11990: done checking to see if all hosts have failed 44071 1727204674.11991: getting the remaining hosts for this loop 44071 1727204674.11993: done getting the remaining hosts for this loop 44071 1727204674.11998: getting the next task for host managed-node2 44071 1727204674.12009: done getting next task for host managed-node2 44071 1727204674.12071: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 44071 1727204674.12078: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204674.12083: getting variables 44071 1727204674.12085: in VariableManager get_vars() 44071 1727204674.12247: Calling all_inventory to load vars for managed-node2 44071 1727204674.12251: Calling groups_inventory to load vars for managed-node2 44071 1727204674.12255: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204674.12273: Calling all_plugins_play to load vars for managed-node2 44071 1727204674.12277: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204674.12280: Calling groups_plugins_play to load vars for managed-node2 44071 1727204674.13259: done sending task result for task 127b8e07-fff9-c964-7471-00000000166d 44071 1727204674.13264: WORKER PROCESS EXITING 44071 1727204674.16649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204674.20295: done with get_vars() 44071 1727204674.20339: done getting variables 44071 1727204674.20415: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204674.20548: variable 'profile' from source: play vars 44071 1727204674.20553: variable 'interface' from source: play vars 44071 1727204674.20621: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 15:04:34 -0400 (0:00:00.109) 0:01:26.523 ***** 44071 1727204674.20655: entering _queue_task() for managed-node2/set_fact 44071 1727204674.21079: worker is 1 (out of 1 available) 44071 1727204674.21096: exiting _queue_task() for managed-node2/set_fact 44071 1727204674.21113: done queuing things up, now waiting for results queue to drain 44071 1727204674.21115: waiting for pending results... 
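And the matching verification step, also skipped here; a sketch under the same assumptions:

    - name: Verify the fingerprint comment in ifcfg-{{ profile }}
      set_fact:
        lsr_net_profile_fingerprint: "{{ fingerprint_comment.rc == 0 }}"   # assumed name
      when: profile_stat.stat.exists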
44071 1727204674.21579: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-statebr 44071 1727204674.21819: in run() - task 127b8e07-fff9-c964-7471-00000000166e 44071 1727204674.21858: variable 'ansible_search_path' from source: unknown 44071 1727204674.21863: variable 'ansible_search_path' from source: unknown 44071 1727204674.21939: calling self._execute() 44071 1727204674.22082: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204674.22088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204674.22098: variable 'omit' from source: magic vars 44071 1727204674.22746: variable 'ansible_distribution_major_version' from source: facts 44071 1727204674.22809: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204674.22989: variable 'profile_stat' from source: set_fact 44071 1727204674.23038: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204674.23044: when evaluation is False, skipping this task 44071 1727204674.23048: _execute() done 44071 1727204674.23052: dumping result to json 44071 1727204674.23054: done dumping result, returning 44071 1727204674.23058: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-statebr [127b8e07-fff9-c964-7471-00000000166e] 44071 1727204674.23060: sending task result for task 127b8e07-fff9-c964-7471-00000000166e skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204674.23401: no more pending results, returning what we have 44071 1727204674.23407: results queue empty 44071 1727204674.23409: checking for any_errors_fatal 44071 1727204674.23413: done checking for any_errors_fatal 44071 1727204674.23414: checking for max_fail_percentage 44071 1727204674.23416: done checking for max_fail_percentage 44071 1727204674.23417: checking to see if all hosts have failed and the running result is not ok 44071 1727204674.23417: done checking to see if all hosts have failed 44071 1727204674.23418: getting the remaining hosts for this loop 44071 1727204674.23419: done getting the remaining hosts for this loop 44071 1727204674.23423: getting the next task for host managed-node2 44071 1727204674.23432: done getting next task for host managed-node2 44071 1727204674.23435: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 44071 1727204674.23439: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204674.23444: getting variables 44071 1727204674.23571: in VariableManager get_vars() 44071 1727204674.23612: Calling all_inventory to load vars for managed-node2 44071 1727204674.23616: Calling groups_inventory to load vars for managed-node2 44071 1727204674.23620: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204674.23642: done sending task result for task 127b8e07-fff9-c964-7471-00000000166e 44071 1727204674.23645: WORKER PROCESS EXITING 44071 1727204674.23657: Calling all_plugins_play to load vars for managed-node2 44071 1727204674.23660: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204674.23664: Calling groups_plugins_play to load vars for managed-node2 44071 1727204674.26101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204674.28919: done with get_vars() 44071 1727204674.29025: done getting variables 44071 1727204674.29125: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204674.29283: variable 'profile' from source: play vars 44071 1727204674.29320: variable 'interface' from source: play vars 44071 1727204674.29450: variable 'interface' from source: play vars TASK [Assert that the profile is absent - 'statebr'] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 15:04:34 -0400 (0:00:00.088) 0:01:26.611 ***** 44071 1727204674.29500: entering _queue_task() for managed-node2/assert 44071 1727204674.30061: worker is 1 (out of 1 available) 44071 1727204674.30080: exiting _queue_task() for managed-node2/assert 44071 1727204674.30097: done queuing things up, now waiting for results queue to drain 44071 1727204674.30099: waiting for pending results... 
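The assert queued here (assert_profile_absent.yml:5) evaluates not lsr_net_profile_exists, a fact set earlier from the profile lookup, and passes. A sketch of such an assert task, with the failure message wording assumed:

  - name: "Assert that the profile is absent - '{{ profile }}'"
    assert:
      that:
        - not lsr_net_profile_exists
      msg: "Profile '{{ profile }}' is unexpectedly present"   # assumed wording
    when: ansible_distribution_major_version != '6'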
44071 1727204674.30547: running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'statebr' 44071 1727204674.30727: in run() - task 127b8e07-fff9-c964-7471-0000000015d5 44071 1727204674.30796: variable 'ansible_search_path' from source: unknown 44071 1727204674.30802: variable 'ansible_search_path' from source: unknown 44071 1727204674.30806: calling self._execute() 44071 1727204674.30929: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204674.30948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204674.30963: variable 'omit' from source: magic vars 44071 1727204674.31433: variable 'ansible_distribution_major_version' from source: facts 44071 1727204674.31460: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204674.31477: variable 'omit' from source: magic vars 44071 1727204674.31544: variable 'omit' from source: magic vars 44071 1727204674.31682: variable 'profile' from source: play vars 44071 1727204674.31693: variable 'interface' from source: play vars 44071 1727204674.31776: variable 'interface' from source: play vars 44071 1727204674.31868: variable 'omit' from source: magic vars 44071 1727204674.31873: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204674.31902: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204674.31931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204674.31954: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204674.31977: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204674.32020: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204674.32031: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204674.32039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204674.32193: Set connection var ansible_connection to ssh 44071 1727204674.32196: Set connection var ansible_timeout to 10 44071 1727204674.32206: Set connection var ansible_pipelining to False 44071 1727204674.32219: Set connection var ansible_shell_type to sh 44071 1727204674.32270: Set connection var ansible_shell_executable to /bin/sh 44071 1727204674.32274: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204674.32279: variable 'ansible_shell_executable' from source: unknown 44071 1727204674.32288: variable 'ansible_connection' from source: unknown 44071 1727204674.32304: variable 'ansible_module_compression' from source: unknown 44071 1727204674.32320: variable 'ansible_shell_type' from source: unknown 44071 1727204674.32328: variable 'ansible_shell_executable' from source: unknown 44071 1727204674.32336: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204674.32345: variable 'ansible_pipelining' from source: unknown 44071 1727204674.32424: variable 'ansible_timeout' from source: unknown 44071 1727204674.32427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204674.32546: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204674.32570: variable 'omit' from source: magic vars 44071 1727204674.32586: starting attempt loop 44071 1727204674.32611: running the handler 44071 1727204674.32806: variable 'lsr_net_profile_exists' from source: set_fact 44071 1727204674.32859: Evaluated conditional (not lsr_net_profile_exists): True 44071 1727204674.32866: handler run complete 44071 1727204674.32873: attempt loop complete, returning result 44071 1727204674.32877: _execute() done 44071 1727204674.32883: dumping result to json 44071 1727204674.32896: done dumping result, returning 44071 1727204674.32909: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'statebr' [127b8e07-fff9-c964-7471-0000000015d5] 44071 1727204674.32918: sending task result for task 127b8e07-fff9-c964-7471-0000000015d5 44071 1727204674.33247: done sending task result for task 127b8e07-fff9-c964-7471-0000000015d5 44071 1727204674.33250: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 44071 1727204674.33317: no more pending results, returning what we have 44071 1727204674.33323: results queue empty 44071 1727204674.33324: checking for any_errors_fatal 44071 1727204674.33334: done checking for any_errors_fatal 44071 1727204674.33335: checking for max_fail_percentage 44071 1727204674.33337: done checking for max_fail_percentage 44071 1727204674.33338: checking to see if all hosts have failed and the running result is not ok 44071 1727204674.33339: done checking to see if all hosts have failed 44071 1727204674.33340: getting the remaining hosts for this loop 44071 1727204674.33342: done getting the remaining hosts for this loop 44071 1727204674.33348: getting the next task for host managed-node2 44071 1727204674.33380: done getting next task for host managed-node2 44071 1727204674.33387: ^ task is: TASK: Conditional asserts 44071 1727204674.33391: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204674.33397: getting variables 44071 1727204674.33399: in VariableManager get_vars() 44071 1727204674.33448: Calling all_inventory to load vars for managed-node2 44071 1727204674.33452: Calling groups_inventory to load vars for managed-node2 44071 1727204674.33459: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204674.33593: Calling all_plugins_play to load vars for managed-node2 44071 1727204674.33597: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204674.33602: Calling groups_plugins_play to load vars for managed-node2 44071 1727204674.35895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204674.39420: done with get_vars() 44071 1727204674.39575: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Tuesday 24 September 2024 15:04:34 -0400 (0:00:00.105) 0:01:26.717 ***** 44071 1727204674.40082: entering _queue_task() for managed-node2/include_tasks 44071 1727204674.41196: worker is 1 (out of 1 available) 44071 1727204674.41212: exiting _queue_task() for managed-node2/include_tasks 44071 1727204674.41228: done queuing things up, now waiting for results queue to drain 44071 1727204674.41229: waiting for pending results... 44071 1727204674.41784: running TaskExecutor() for managed-node2/TASK: Conditional asserts 44071 1727204674.42388: in run() - task 127b8e07-fff9-c964-7471-00000000100b 44071 1727204674.42394: variable 'ansible_search_path' from source: unknown 44071 1727204674.42398: variable 'ansible_search_path' from source: unknown 44071 1727204674.43050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204674.62805: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204674.63155: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204674.63199: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204674.63233: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204674.63263: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204674.63345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204674.63378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204674.63406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204674.63452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204674.63469: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204674.63599: dumping result to json 44071 1727204674.63604: done dumping result, returning 44071 1727204674.63610: done running TaskExecutor() for managed-node2/TASK: Conditional asserts [127b8e07-fff9-c964-7471-00000000100b] 44071 1727204674.63612: sending task result for task 127b8e07-fff9-c964-7471-00000000100b 44071 1727204674.63992: done sending task result for task 127b8e07-fff9-c964-7471-00000000100b 44071 1727204674.63996: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } 44071 1727204674.64213: no more pending results, returning what we have 44071 1727204674.64217: results queue empty 44071 1727204674.64218: checking for any_errors_fatal 44071 1727204674.64224: done checking for any_errors_fatal 44071 1727204674.64225: checking for max_fail_percentage 44071 1727204674.64226: done checking for max_fail_percentage 44071 1727204674.64227: checking to see if all hosts have failed and the running result is not ok 44071 1727204674.64228: done checking to see if all hosts have failed 44071 1727204674.64229: getting the remaining hosts for this loop 44071 1727204674.64230: done getting the remaining hosts for this loop 44071 1727204674.64234: getting the next task for host managed-node2 44071 1727204674.64241: done getting next task for host managed-node2 44071 1727204674.64243: ^ task is: TASK: Success in test '{{ lsr_description }}' 44071 1727204674.64246: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204674.64249: getting variables 44071 1727204674.64251: in VariableManager get_vars() 44071 1727204674.64289: Calling all_inventory to load vars for managed-node2 44071 1727204674.64292: Calling groups_inventory to load vars for managed-node2 44071 1727204674.64295: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204674.64305: Calling all_plugins_play to load vars for managed-node2 44071 1727204674.64308: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204674.64311: Calling groups_plugins_play to load vars for managed-node2 44071 1727204674.76750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204674.79452: done with get_vars() 44071 1727204674.79502: done getting variables 44071 1727204674.79578: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204674.79713: variable 'lsr_description' from source: include params TASK [Success in test 'I can remove an existing profile without taking it down'] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Tuesday 24 September 2024 15:04:34 -0400 (0:00:00.396) 0:01:27.113 ***** 44071 1727204674.79743: entering _queue_task() for managed-node2/debug 44071 1727204674.80388: worker is 1 (out of 1 available) 44071 1727204674.80402: exiting _queue_task() for managed-node2/debug 44071 1727204674.80416: done queuing things up, now waiting for results queue to drain 44071 1727204674.80418: waiting for pending results... 
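The 'Conditional asserts' include was skipped because its item list was empty, and the success banner that runs next (run_test.yml:47) is a plain debug over lsr_description. Reconstructed from the task name and the message printed below, it would look roughly like:

  - name: "Success in test '{{ lsr_description }}'"
    debug:
      msg: "+++++ Success in test '{{ lsr_description }}' +++++"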
44071 1727204674.80661: running TaskExecutor() for managed-node2/TASK: Success in test 'I can remove an existing profile without taking it down' 44071 1727204674.80738: in run() - task 127b8e07-fff9-c964-7471-00000000100c 44071 1727204674.80772: variable 'ansible_search_path' from source: unknown 44071 1727204674.80782: variable 'ansible_search_path' from source: unknown 44071 1727204674.80864: calling self._execute() 44071 1727204674.80951: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204674.80975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204674.80994: variable 'omit' from source: magic vars 44071 1727204674.81489: variable 'ansible_distribution_major_version' from source: facts 44071 1727204674.81573: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204674.81578: variable 'omit' from source: magic vars 44071 1727204674.81589: variable 'omit' from source: magic vars 44071 1727204674.81733: variable 'lsr_description' from source: include params 44071 1727204674.81772: variable 'omit' from source: magic vars 44071 1727204674.81829: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204674.81885: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204674.81911: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204674.81961: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204674.81964: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204674.81996: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204674.82007: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204674.82016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204674.82137: Set connection var ansible_connection to ssh 44071 1727204674.82172: Set connection var ansible_timeout to 10 44071 1727204674.82179: Set connection var ansible_pipelining to False 44071 1727204674.82186: Set connection var ansible_shell_type to sh 44071 1727204674.82270: Set connection var ansible_shell_executable to /bin/sh 44071 1727204674.82274: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204674.82278: variable 'ansible_shell_executable' from source: unknown 44071 1727204674.82281: variable 'ansible_connection' from source: unknown 44071 1727204674.82283: variable 'ansible_module_compression' from source: unknown 44071 1727204674.82285: variable 'ansible_shell_type' from source: unknown 44071 1727204674.82287: variable 'ansible_shell_executable' from source: unknown 44071 1727204674.82293: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204674.82296: variable 'ansible_pipelining' from source: unknown 44071 1727204674.82298: variable 'ansible_timeout' from source: unknown 44071 1727204674.82300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204674.82471: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204674.82475: variable 'omit' from source: magic vars 44071 1727204674.82521: starting attempt loop 44071 1727204674.82524: running the handler 44071 1727204674.82556: handler run complete 44071 1727204674.82580: attempt loop complete, returning result 44071 1727204674.82587: _execute() done 44071 1727204674.82594: dumping result to json 44071 1727204674.82601: done dumping result, returning 44071 1727204674.82613: done running TaskExecutor() for managed-node2/TASK: Success in test 'I can remove an existing profile without taking it down' [127b8e07-fff9-c964-7471-00000000100c] 44071 1727204674.82628: sending task result for task 127b8e07-fff9-c964-7471-00000000100c ok: [managed-node2] => {} MSG: +++++ Success in test 'I can remove an existing profile without taking it down' +++++ 44071 1727204674.82899: no more pending results, returning what we have 44071 1727204674.82904: results queue empty 44071 1727204674.82905: checking for any_errors_fatal 44071 1727204674.82920: done checking for any_errors_fatal 44071 1727204674.82921: checking for max_fail_percentage 44071 1727204674.82922: done checking for max_fail_percentage 44071 1727204674.82923: checking to see if all hosts have failed and the running result is not ok 44071 1727204674.82924: done checking to see if all hosts have failed 44071 1727204674.82925: getting the remaining hosts for this loop 44071 1727204674.82927: done getting the remaining hosts for this loop 44071 1727204674.82933: getting the next task for host managed-node2 44071 1727204674.82942: done getting next task for host managed-node2 44071 1727204674.82947: ^ task is: TASK: Cleanup 44071 1727204674.82950: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204674.82957: getting variables 44071 1727204674.82959: in VariableManager get_vars() 44071 1727204674.83207: Calling all_inventory to load vars for managed-node2 44071 1727204674.83210: Calling groups_inventory to load vars for managed-node2 44071 1727204674.83217: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204674.83229: Calling all_plugins_play to load vars for managed-node2 44071 1727204674.83232: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204674.83236: Calling groups_plugins_play to load vars for managed-node2 44071 1727204674.83784: done sending task result for task 127b8e07-fff9-c964-7471-00000000100c 44071 1727204674.83789: WORKER PROCESS EXITING 44071 1727204674.85380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204674.87805: done with get_vars() 44071 1727204674.87858: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Tuesday 24 September 2024 15:04:34 -0400 (0:00:00.082) 0:01:27.196 ***** 44071 1727204674.87984: entering _queue_task() for managed-node2/include_tasks 44071 1727204674.88538: worker is 1 (out of 1 available) 44071 1727204674.88554: exiting _queue_task() for managed-node2/include_tasks 44071 1727204674.88570: done queuing things up, now waiting for results queue to drain 44071 1727204674.88572: waiting for pending results... 44071 1727204674.88854: running TaskExecutor() for managed-node2/TASK: Cleanup 44071 1727204674.89015: in run() - task 127b8e07-fff9-c964-7471-000000001010 44071 1727204674.89047: variable 'ansible_search_path' from source: unknown 44071 1727204674.89059: variable 'ansible_search_path' from source: unknown 44071 1727204674.89124: variable 'lsr_cleanup' from source: include params 44071 1727204674.89417: variable 'lsr_cleanup' from source: include params 44071 1727204674.89541: variable 'omit' from source: magic vars 44071 1727204674.89747: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204674.89769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204674.89791: variable 'omit' from source: magic vars 44071 1727204674.90129: variable 'ansible_distribution_major_version' from source: facts 44071 1727204674.90152: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204674.90168: variable 'item' from source: unknown 44071 1727204674.90258: variable 'item' from source: unknown 44071 1727204674.90306: variable 'item' from source: unknown 44071 1727204674.90384: variable 'item' from source: unknown 44071 1727204674.90696: dumping result to json 44071 1727204674.90700: done dumping result, returning 44071 1727204674.90703: done running TaskExecutor() for managed-node2/TASK: Cleanup [127b8e07-fff9-c964-7471-000000001010] 44071 1727204674.90707: sending task result for task 127b8e07-fff9-c964-7471-000000001010 44071 1727204674.90760: done sending task result for task 127b8e07-fff9-c964-7471-000000001010 44071 1727204674.90764: WORKER PROCESS EXITING 44071 1727204674.90900: no more pending results, returning what we have 44071 1727204674.90905: in VariableManager get_vars() 44071 1727204674.90955: Calling all_inventory to load vars for managed-node2 44071 1727204674.90958: Calling groups_inventory to load vars for managed-node2 44071 
1727204674.90962: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204674.90978: Calling all_plugins_play to load vars for managed-node2 44071 1727204674.90981: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204674.90984: Calling groups_plugins_play to load vars for managed-node2 44071 1727204674.93826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204674.97992: done with get_vars() 44071 1727204674.98038: variable 'ansible_search_path' from source: unknown 44071 1727204674.98040: variable 'ansible_search_path' from source: unknown 44071 1727204674.98093: we have included files to process 44071 1727204674.98094: generating all_blocks data 44071 1727204674.98098: done generating all_blocks data 44071 1727204674.98104: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 44071 1727204674.98106: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 44071 1727204674.98109: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 44071 1727204674.98345: done processing included file 44071 1727204674.98348: iterating over new_blocks loaded from include file 44071 1727204674.98350: in VariableManager get_vars() 44071 1727204674.98375: done with get_vars() 44071 1727204674.98377: filtering new block on tags 44071 1727204674.98436: done filtering new block on tags 44071 1727204674.98439: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed-node2 => (item=tasks/cleanup_profile+device.yml) 44071 1727204674.98444: extending task lists for all hosts with included blocks 44071 1727204675.00003: done extending task lists 44071 1727204675.00005: done processing included files 44071 1727204675.00005: results queue empty 44071 1727204675.00006: checking for any_errors_fatal 44071 1727204675.00013: done checking for any_errors_fatal 44071 1727204675.00015: checking for max_fail_percentage 44071 1727204675.00016: done checking for max_fail_percentage 44071 1727204675.00017: checking to see if all hosts have failed and the running result is not ok 44071 1727204675.00018: done checking to see if all hosts have failed 44071 1727204675.00019: getting the remaining hosts for this loop 44071 1727204675.00020: done getting the remaining hosts for this loop 44071 1727204675.00023: getting the next task for host managed-node2 44071 1727204675.00029: done getting next task for host managed-node2 44071 1727204675.00031: ^ task is: TASK: Cleanup profile and device 44071 1727204675.00035: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204675.00038: getting variables 44071 1727204675.00039: in VariableManager get_vars() 44071 1727204675.00055: Calling all_inventory to load vars for managed-node2 44071 1727204675.00057: Calling groups_inventory to load vars for managed-node2 44071 1727204675.00060: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204675.00069: Calling all_plugins_play to load vars for managed-node2 44071 1727204675.00071: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204675.00074: Calling groups_plugins_play to load vars for managed-node2 44071 1727204675.02444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204675.05101: done with get_vars() 44071 1727204675.05154: done getting variables 44071 1727204675.05233: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Tuesday 24 September 2024 15:04:35 -0400 (0:00:00.172) 0:01:27.369 ***** 44071 1727204675.05274: entering _queue_task() for managed-node2/shell 44071 1727204675.05849: worker is 1 (out of 1 available) 44071 1727204675.05865: exiting _queue_task() for managed-node2/shell 44071 1727204675.06019: done queuing things up, now waiting for results queue to drain 44071 1727204675.06021: waiting for pending results... 
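The Cleanup step (run_test.yml:66) is an include_tasks loop over the lsr_cleanup list passed in as an include parameter; here it resolves to the single item tasks/cleanup_profile+device.yml. A sketch of that pattern, with the loop keyword assumed:

  - name: Cleanup
    include_tasks: "{{ item }}"
    loop: "{{ lsr_cleanup }}"
    when: ansible_distribution_major_version != '6'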
44071 1727204675.06279: running TaskExecutor() for managed-node2/TASK: Cleanup profile and device 44071 1727204675.06441: in run() - task 127b8e07-fff9-c964-7471-0000000016ad 44071 1727204675.06474: variable 'ansible_search_path' from source: unknown 44071 1727204675.06538: variable 'ansible_search_path' from source: unknown 44071 1727204675.06542: calling self._execute() 44071 1727204675.06651: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.06667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204675.06687: variable 'omit' from source: magic vars 44071 1727204675.07152: variable 'ansible_distribution_major_version' from source: facts 44071 1727204675.07176: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204675.07207: variable 'omit' from source: magic vars 44071 1727204675.07267: variable 'omit' from source: magic vars 44071 1727204675.07482: variable 'interface' from source: play vars 44071 1727204675.07530: variable 'omit' from source: magic vars 44071 1727204675.07589: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204675.07675: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204675.07680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204675.07707: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204675.07725: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204675.07773: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204675.07847: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.07852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204675.07955: Set connection var ansible_connection to ssh 44071 1727204675.07959: Set connection var ansible_timeout to 10 44071 1727204675.07965: Set connection var ansible_pipelining to False 44071 1727204675.07968: Set connection var ansible_shell_type to sh 44071 1727204675.07971: Set connection var ansible_shell_executable to /bin/sh 44071 1727204675.07984: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204675.08017: variable 'ansible_shell_executable' from source: unknown 44071 1727204675.08064: variable 'ansible_connection' from source: unknown 44071 1727204675.08073: variable 'ansible_module_compression' from source: unknown 44071 1727204675.08077: variable 'ansible_shell_type' from source: unknown 44071 1727204675.08080: variable 'ansible_shell_executable' from source: unknown 44071 1727204675.08082: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.08084: variable 'ansible_pipelining' from source: unknown 44071 1727204675.08086: variable 'ansible_timeout' from source: unknown 44071 1727204675.08089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204675.08235: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 
1727204675.08255: variable 'omit' from source: magic vars 44071 1727204675.08281: starting attempt loop 44071 1727204675.08284: running the handler 44071 1727204675.08290: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204675.08371: _low_level_execute_command(): starting 44071 1727204675.08374: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204675.09134: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204675.09284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204675.09290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204675.09305: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204675.09331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204675.09448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204675.11629: stdout chunk (state=3): >>>/root <<< 44071 1727204675.11634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204675.11637: stdout chunk (state=3): >>><<< 44071 1727204675.11639: stderr chunk (state=3): >>><<< 44071 1727204675.11642: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 44071 1727204675.11646: _low_level_execute_command(): starting 44071 1727204675.11649: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204675.1152346-49026-139627643788490 `" && echo ansible-tmp-1727204675.1152346-49026-139627643788490="` echo /root/.ansible/tmp/ansible-tmp-1727204675.1152346-49026-139627643788490 `" ) && sleep 0' 44071 1727204675.12392: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204675.12397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204675.12407: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204675.12410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204675.12413: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204675.12482: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204675.12497: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204675.12599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204675.14573: stdout chunk (state=3): >>>ansible-tmp-1727204675.1152346-49026-139627643788490=/root/.ansible/tmp/ansible-tmp-1727204675.1152346-49026-139627643788490 <<< 44071 1727204675.14797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204675.14880: stderr chunk (state=3): >>><<< 44071 1727204675.14900: stdout chunk (state=3): >>><<< 44071 1727204675.15036: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204675.1152346-49026-139627643788490=/root/.ansible/tmp/ansible-tmp-1727204675.1152346-49026-139627643788490 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204675.15083: variable 'ansible_module_compression' from source: unknown 44071 1727204675.15152: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44071 1727204675.15210: variable 'ansible_facts' from source: unknown 44071 1727204675.15313: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204675.1152346-49026-139627643788490/AnsiballZ_command.py 44071 1727204675.15718: Sending initial data 44071 1727204675.15721: Sent initial data (156 bytes) 44071 1727204675.16819: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204675.16828: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204675.16972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204675.17032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204675.17070: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204675.17288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204675.17383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204675.19136: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204675.19373: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204675.19420: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp87lw6suy /root/.ansible/tmp/ansible-tmp-1727204675.1152346-49026-139627643788490/AnsiballZ_command.py <<< 44071 1727204675.19431: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204675.1152346-49026-139627643788490/AnsiballZ_command.py" <<< 44071 1727204675.19495: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp87lw6suy" to remote "/root/.ansible/tmp/ansible-tmp-1727204675.1152346-49026-139627643788490/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204675.1152346-49026-139627643788490/AnsiballZ_command.py" <<< 44071 1727204675.20843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204675.20880: stderr chunk (state=3): >>><<< 44071 1727204675.20884: stdout chunk (state=3): >>><<< 44071 1727204675.20909: done transferring module to remote 44071 1727204675.20922: _low_level_execute_command(): starting 44071 1727204675.20928: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204675.1152346-49026-139627643788490/ /root/.ansible/tmp/ansible-tmp-1727204675.1152346-49026-139627643788490/AnsiballZ_command.py && sleep 0' 44071 1727204675.21579: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204675.21588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204675.21599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204675.21615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204675.21626: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204675.21633: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204675.21651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204675.21659: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204675.21669: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204675.21716: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44071 1727204675.21720: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204675.21722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204675.21725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204675.21727: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204675.21729: stderr chunk (state=3): >>>debug2: match found <<< 44071 1727204675.21731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204675.21796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204675.21817: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204675.21820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 
1727204675.21922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204675.23953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204675.23957: stdout chunk (state=3): >>><<< 44071 1727204675.23960: stderr chunk (state=3): >>><<< 44071 1727204675.24086: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204675.24090: _low_level_execute_command(): starting 44071 1727204675.24093: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204675.1152346-49026-139627643788490/AnsiballZ_command.py && sleep 0' 44071 1727204675.26186: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204675.26390: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204675.26520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204675.26577: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204675.26689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204675.26908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204675.49139: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (beccd2e1-72f3-4d73-aac6-77978c2859f8) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'", "rc": 0, "cmd": "nmcli con 
delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-24 15:04:35.432426", "end": "2024-09-24 15:04:35.488647", "delta": "0:00:00.056221", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44071 1727204675.51960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204675.52039: stderr chunk (state=3): >>><<< 44071 1727204675.52042: stdout chunk (state=3): >>><<< 44071 1727204675.52057: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Connection 'statebr' (beccd2e1-72f3-4d73-aac6-77978c2859f8) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'", "rc": 0, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-24 15:04:35.432426", "end": "2024-09-24 15:04:35.488647", "delta": "0:00:00.056221", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
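The module result above corresponds to a shell task in cleanup_profile+device.yml:3 that deletes the connection, removes the ifcfg file, and drops the link. The command text is taken from the result, parameterized with the interface variable the log shows being templated; the changed_when setting is an assumption suggested by the final "ok" / changed=false task status.

  - name: Cleanup profile and device
    shell: |
      nmcli con delete {{ interface }}
      nmcli con load /etc/sysconfig/network-scripts/ifcfg-{{ interface }}
      rm -f /etc/sysconfig/network-scripts/ifcfg-{{ interface }}
      ip link del {{ interface }}
    changed_when: false     # assumed; matches the ok/changed=false result printed below

Note that the "Could not load file" message is only stderr from the nmcli con load step; the overall rc is 0 because the shell continues and the last command succeeds.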
44071 1727204675.52100: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204675.1152346-49026-139627643788490/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204675.52141: _low_level_execute_command(): starting 44071 1727204675.52145: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204675.1152346-49026-139627643788490/ > /dev/null 2>&1 && sleep 0' 44071 1727204675.52901: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204675.52920: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204675.52952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204675.52999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204675.53096: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204675.53120: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204675.53148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204675.53171: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204675.53286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204675.55309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204675.55375: stderr chunk (state=3): >>><<< 44071 1727204675.55379: stdout chunk (state=3): >>><<< 44071 1727204675.55394: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204675.55401: handler run complete 44071 1727204675.55420: Evaluated conditional (False): False 44071 1727204675.55429: attempt loop complete, returning result 44071 1727204675.55434: _execute() done 44071 1727204675.55437: dumping result to json 44071 1727204675.55441: done dumping result, returning 44071 1727204675.55454: done running TaskExecutor() for managed-node2/TASK: Cleanup profile and device [127b8e07-fff9-c964-7471-0000000016ad] 44071 1727204675.55457: sending task result for task 127b8e07-fff9-c964-7471-0000000016ad 44071 1727204675.55571: done sending task result for task 127b8e07-fff9-c964-7471-0000000016ad 44071 1727204675.55574: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.056221", "end": "2024-09-24 15:04:35.488647", "rc": 0, "start": "2024-09-24 15:04:35.432426" } STDOUT: Connection 'statebr' (beccd2e1-72f3-4d73-aac6-77978c2859f8) successfully deleted. STDERR: Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' 44071 1727204675.55647: no more pending results, returning what we have 44071 1727204675.55651: results queue empty 44071 1727204675.55651: checking for any_errors_fatal 44071 1727204675.55653: done checking for any_errors_fatal 44071 1727204675.55654: checking for max_fail_percentage 44071 1727204675.55655: done checking for max_fail_percentage 44071 1727204675.55656: checking to see if all hosts have failed and the running result is not ok 44071 1727204675.55657: done checking to see if all hosts have failed 44071 1727204675.55658: getting the remaining hosts for this loop 44071 1727204675.55659: done getting the remaining hosts for this loop 44071 1727204675.55664: getting the next task for host managed-node2 44071 1727204675.55684: done getting next task for host managed-node2 44071 1727204675.55687: ^ task is: TASK: Include the task 'run_test.yml' 44071 1727204675.55689: ^ state is: HOST STATE: block=7, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204675.55694: getting variables 44071 1727204675.55695: in VariableManager get_vars() 44071 1727204675.55748: Calling all_inventory to load vars for managed-node2 44071 1727204675.55751: Calling groups_inventory to load vars for managed-node2 44071 1727204675.55755: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204675.55809: Calling all_plugins_play to load vars for managed-node2 44071 1727204675.55851: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204675.55856: Calling groups_plugins_play to load vars for managed-node2 44071 1727204675.57852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204675.60075: done with get_vars() 44071 1727204675.60115: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:102 Tuesday 24 September 2024 15:04:35 -0400 (0:00:00.549) 0:01:27.918 ***** 44071 1727204675.60228: entering _queue_task() for managed-node2/include_tasks 44071 1727204675.60641: worker is 1 (out of 1 available) 44071 1727204675.60655: exiting _queue_task() for managed-node2/include_tasks 44071 1727204675.60670: done queuing things up, now waiting for results queue to drain 44071 1727204675.60672: waiting for pending results... 44071 1727204675.61018: running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' 44071 1727204675.61108: in run() - task 127b8e07-fff9-c964-7471-000000000015 44071 1727204675.61117: variable 'ansible_search_path' from source: unknown 44071 1727204675.61153: calling self._execute() 44071 1727204675.61242: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.61247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204675.61256: variable 'omit' from source: magic vars 44071 1727204675.61588: variable 'ansible_distribution_major_version' from source: facts 44071 1727204675.61599: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204675.61607: _execute() done 44071 1727204675.61611: dumping result to json 44071 1727204675.61613: done dumping result, returning 44071 1727204675.61619: done running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' [127b8e07-fff9-c964-7471-000000000015] 44071 1727204675.61624: sending task result for task 127b8e07-fff9-c964-7471-000000000015 44071 1727204675.61745: done sending task result for task 127b8e07-fff9-c964-7471-000000000015 44071 1727204675.61748: WORKER PROCESS EXITING 44071 1727204675.61779: no more pending results, returning what we have 44071 1727204675.61785: in VariableManager get_vars() 44071 1727204675.61835: Calling all_inventory to load vars for managed-node2 44071 1727204675.61839: Calling groups_inventory to load vars for managed-node2 44071 1727204675.61842: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204675.61858: Calling all_plugins_play to load vars for managed-node2 44071 1727204675.61861: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204675.61864: Calling groups_plugins_play to load vars for managed-node2 44071 1727204675.62952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204675.64439: done with get_vars() 44071 1727204675.64468: variable 
'ansible_search_path' from source: unknown 44071 1727204675.64484: we have included files to process 44071 1727204675.64487: generating all_blocks data 44071 1727204675.64490: done generating all_blocks data 44071 1727204675.64497: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 44071 1727204675.64499: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 44071 1727204675.64503: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 44071 1727204675.64953: in VariableManager get_vars() 44071 1727204675.64980: done with get_vars() 44071 1727204675.65031: in VariableManager get_vars() 44071 1727204675.65054: done with get_vars() 44071 1727204675.65106: in VariableManager get_vars() 44071 1727204675.65126: done with get_vars() 44071 1727204675.65180: in VariableManager get_vars() 44071 1727204675.65201: done with get_vars() 44071 1727204675.65238: in VariableManager get_vars() 44071 1727204675.65250: done with get_vars() 44071 1727204675.65685: in VariableManager get_vars() 44071 1727204675.65697: done with get_vars() 44071 1727204675.65706: done processing included file 44071 1727204675.65707: iterating over new_blocks loaded from include file 44071 1727204675.65708: in VariableManager get_vars() 44071 1727204675.65715: done with get_vars() 44071 1727204675.65716: filtering new block on tags 44071 1727204675.65821: done filtering new block on tags 44071 1727204675.65824: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed-node2 44071 1727204675.65831: extending task lists for all hosts with included blocks 44071 1727204675.65868: done extending task lists 44071 1727204675.65870: done processing included files 44071 1727204675.65870: results queue empty 44071 1727204675.65871: checking for any_errors_fatal 44071 1727204675.65878: done checking for any_errors_fatal 44071 1727204675.65879: checking for max_fail_percentage 44071 1727204675.65880: done checking for max_fail_percentage 44071 1727204675.65881: checking to see if all hosts have failed and the running result is not ok 44071 1727204675.65882: done checking to see if all hosts have failed 44071 1727204675.65882: getting the remaining hosts for this loop 44071 1727204675.65884: done getting the remaining hosts for this loop 44071 1727204675.65887: getting the next task for host managed-node2 44071 1727204675.65893: done getting next task for host managed-node2 44071 1727204675.65895: ^ task is: TASK: TEST: {{ lsr_description }} 44071 1727204675.65900: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204675.65902: getting variables 44071 1727204675.65903: in VariableManager get_vars() 44071 1727204675.65915: Calling all_inventory to load vars for managed-node2 44071 1727204675.65917: Calling groups_inventory to load vars for managed-node2 44071 1727204675.65921: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204675.65930: Calling all_plugins_play to load vars for managed-node2 44071 1727204675.65933: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204675.65937: Calling groups_plugins_play to load vars for managed-node2 44071 1727204675.67691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204675.70249: done with get_vars() 44071 1727204675.70390: done getting variables 44071 1727204675.70450: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204675.70595: variable 'lsr_description' from source: include params TASK [TEST: I can take a profile down that is absent] ************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Tuesday 24 September 2024 15:04:35 -0400 (0:00:00.103) 0:01:28.022 ***** 44071 1727204675.70634: entering _queue_task() for managed-node2/debug 44071 1727204675.71191: worker is 1 (out of 1 available) 44071 1727204675.71205: exiting _queue_task() for managed-node2/debug 44071 1727204675.71219: done queuing things up, now waiting for results queue to drain 44071 1727204675.71221: waiting for pending results... 
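Between the banner task at run_test.yml:5 shown above and the per-item dumps that follow, the head of run_test.yml evidently prints the test description and then loops over the lsr_* parameters. A sketch of those two debug tasks, inferred from the logged task names, item order, and output shape rather than copied from the file (the exact msg formatting and the use of debug's var option are assumptions):

- name: "TEST: {{ lsr_description }}"
  ansible.builtin.debug:
    msg: "##########\n{{ lsr_description }}\n##########"

- name: Show item
  ansible.builtin.debug:
    var: "{{ item }}"    # prints the variable named by item, matching the (item=lsr_...) results logged below
  loop:
    - lsr_description
    - lsr_setup
    - lsr_test
    - lsr_assert
    - lsr_assert_when
    - lsr_fail_debug
    - lsr_cleanup
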
44071 1727204675.71599: running TaskExecutor() for managed-node2/TASK: TEST: I can take a profile down that is absent 44071 1727204675.71686: in run() - task 127b8e07-fff9-c964-7471-000000001744 44071 1727204675.71690: variable 'ansible_search_path' from source: unknown 44071 1727204675.71696: variable 'ansible_search_path' from source: unknown 44071 1727204675.71699: calling self._execute() 44071 1727204675.71858: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.71899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204675.71905: variable 'omit' from source: magic vars 44071 1727204675.72419: variable 'ansible_distribution_major_version' from source: facts 44071 1727204675.72438: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204675.72442: variable 'omit' from source: magic vars 44071 1727204675.72495: variable 'omit' from source: magic vars 44071 1727204675.72623: variable 'lsr_description' from source: include params 44071 1727204675.72711: variable 'omit' from source: magic vars 44071 1727204675.72715: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204675.72742: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204675.72774: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204675.72805: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204675.72810: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204675.72918: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204675.72922: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.72931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204675.73201: Set connection var ansible_connection to ssh 44071 1727204675.73204: Set connection var ansible_timeout to 10 44071 1727204675.73206: Set connection var ansible_pipelining to False 44071 1727204675.73208: Set connection var ansible_shell_type to sh 44071 1727204675.73210: Set connection var ansible_shell_executable to /bin/sh 44071 1727204675.73238: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204675.73335: variable 'ansible_shell_executable' from source: unknown 44071 1727204675.73345: variable 'ansible_connection' from source: unknown 44071 1727204675.73409: variable 'ansible_module_compression' from source: unknown 44071 1727204675.73413: variable 'ansible_shell_type' from source: unknown 44071 1727204675.73416: variable 'ansible_shell_executable' from source: unknown 44071 1727204675.73418: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.73420: variable 'ansible_pipelining' from source: unknown 44071 1727204675.73422: variable 'ansible_timeout' from source: unknown 44071 1727204675.73425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204675.73768: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 44071 1727204675.73779: variable 'omit' from source: magic vars 44071 1727204675.73836: starting attempt loop 44071 1727204675.73839: running the handler 44071 1727204675.73859: handler run complete 44071 1727204675.73884: attempt loop complete, returning result 44071 1727204675.73892: _execute() done 44071 1727204675.73900: dumping result to json 44071 1727204675.73907: done dumping result, returning 44071 1727204675.73919: done running TaskExecutor() for managed-node2/TASK: TEST: I can take a profile down that is absent [127b8e07-fff9-c964-7471-000000001744] 44071 1727204675.73929: sending task result for task 127b8e07-fff9-c964-7471-000000001744 44071 1727204675.74210: done sending task result for task 127b8e07-fff9-c964-7471-000000001744 44071 1727204675.74215: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: ########## I can take a profile down that is absent ########## 44071 1727204675.74283: no more pending results, returning what we have 44071 1727204675.74288: results queue empty 44071 1727204675.74288: checking for any_errors_fatal 44071 1727204675.74291: done checking for any_errors_fatal 44071 1727204675.74291: checking for max_fail_percentage 44071 1727204675.74293: done checking for max_fail_percentage 44071 1727204675.74294: checking to see if all hosts have failed and the running result is not ok 44071 1727204675.74294: done checking to see if all hosts have failed 44071 1727204675.74295: getting the remaining hosts for this loop 44071 1727204675.74297: done getting the remaining hosts for this loop 44071 1727204675.74302: getting the next task for host managed-node2 44071 1727204675.74311: done getting next task for host managed-node2 44071 1727204675.74314: ^ task is: TASK: Show item 44071 1727204675.74318: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204675.74322: getting variables 44071 1727204675.74323: in VariableManager get_vars() 44071 1727204675.74471: Calling all_inventory to load vars for managed-node2 44071 1727204675.74475: Calling groups_inventory to load vars for managed-node2 44071 1727204675.74480: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204675.74502: Calling all_plugins_play to load vars for managed-node2 44071 1727204675.74507: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204675.74512: Calling groups_plugins_play to load vars for managed-node2 44071 1727204675.78908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204675.84160: done with get_vars() 44071 1727204675.84363: done getting variables 44071 1727204675.84547: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Tuesday 24 September 2024 15:04:35 -0400 (0:00:00.139) 0:01:28.162 ***** 44071 1727204675.84593: entering _queue_task() for managed-node2/debug 44071 1727204675.85136: worker is 1 (out of 1 available) 44071 1727204675.85152: exiting _queue_task() for managed-node2/debug 44071 1727204675.85170: done queuing things up, now waiting for results queue to drain 44071 1727204675.85171: waiting for pending results... 44071 1727204675.85586: running TaskExecutor() for managed-node2/TASK: Show item 44071 1727204675.85595: in run() - task 127b8e07-fff9-c964-7471-000000001745 44071 1727204675.85633: variable 'ansible_search_path' from source: unknown 44071 1727204675.85645: variable 'ansible_search_path' from source: unknown 44071 1727204675.85742: variable 'omit' from source: magic vars 44071 1727204675.85926: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.85947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204675.86068: variable 'omit' from source: magic vars 44071 1727204675.86449: variable 'ansible_distribution_major_version' from source: facts 44071 1727204675.86467: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204675.86477: variable 'omit' from source: magic vars 44071 1727204675.86524: variable 'omit' from source: magic vars 44071 1727204675.86610: variable 'item' from source: unknown 44071 1727204675.86672: variable 'item' from source: unknown 44071 1727204675.86693: variable 'omit' from source: magic vars 44071 1727204675.86991: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204675.86995: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204675.86998: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204675.87014: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204675.87042: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 44071 1727204675.87104: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204675.87121: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.87129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204675.87269: Set connection var ansible_connection to ssh 44071 1727204675.87293: Set connection var ansible_timeout to 10 44071 1727204675.87305: Set connection var ansible_pipelining to False 44071 1727204675.87320: Set connection var ansible_shell_type to sh 44071 1727204675.87331: Set connection var ansible_shell_executable to /bin/sh 44071 1727204675.87347: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204675.87381: variable 'ansible_shell_executable' from source: unknown 44071 1727204675.87411: variable 'ansible_connection' from source: unknown 44071 1727204675.87430: variable 'ansible_module_compression' from source: unknown 44071 1727204675.87498: variable 'ansible_shell_type' from source: unknown 44071 1727204675.87501: variable 'ansible_shell_executable' from source: unknown 44071 1727204675.87504: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.87506: variable 'ansible_pipelining' from source: unknown 44071 1727204675.87508: variable 'ansible_timeout' from source: unknown 44071 1727204675.87511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204675.87656: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204675.87675: variable 'omit' from source: magic vars 44071 1727204675.87685: starting attempt loop 44071 1727204675.87692: running the handler 44071 1727204675.87967: variable 'lsr_description' from source: include params 44071 1727204675.87973: variable 'lsr_description' from source: include params 44071 1727204675.88052: handler run complete 44071 1727204675.88084: attempt loop complete, returning result 44071 1727204675.88106: variable 'item' from source: unknown 44071 1727204675.88479: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can take a profile down that is absent" } 44071 1727204675.88807: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.88810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204675.88813: variable 'omit' from source: magic vars 44071 1727204675.89188: variable 'ansible_distribution_major_version' from source: facts 44071 1727204675.89260: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204675.89292: variable 'omit' from source: magic vars 44071 1727204675.89418: variable 'omit' from source: magic vars 44071 1727204675.89481: variable 'item' from source: unknown 44071 1727204675.89605: variable 'item' from source: unknown 44071 1727204675.89669: variable 'omit' from source: magic vars 44071 1727204675.89708: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204675.89725: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204675.89742: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204675.89767: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204675.89784: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.89788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204675.90002: Set connection var ansible_connection to ssh 44071 1727204675.90006: Set connection var ansible_timeout to 10 44071 1727204675.90013: Set connection var ansible_pipelining to False 44071 1727204675.90016: Set connection var ansible_shell_type to sh 44071 1727204675.90019: Set connection var ansible_shell_executable to /bin/sh 44071 1727204675.90021: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204675.90023: variable 'ansible_shell_executable' from source: unknown 44071 1727204675.90025: variable 'ansible_connection' from source: unknown 44071 1727204675.90029: variable 'ansible_module_compression' from source: unknown 44071 1727204675.90034: variable 'ansible_shell_type' from source: unknown 44071 1727204675.90036: variable 'ansible_shell_executable' from source: unknown 44071 1727204675.90039: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.90041: variable 'ansible_pipelining' from source: unknown 44071 1727204675.90043: variable 'ansible_timeout' from source: unknown 44071 1727204675.90052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204675.90176: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204675.90192: variable 'omit' from source: magic vars 44071 1727204675.90202: starting attempt loop 44071 1727204675.90210: running the handler 44071 1727204675.90252: variable 'lsr_setup' from source: include params 44071 1727204675.90359: variable 'lsr_setup' from source: include params 44071 1727204675.90418: handler run complete 44071 1727204675.90549: attempt loop complete, returning result 44071 1727204675.90552: variable 'item' from source: unknown 44071 1727204675.90562: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ "tasks/create_bridge_profile.yml", "tasks/activate_profile.yml", "tasks/remove_profile.yml" ] } 44071 1727204675.90873: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.90876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204675.90883: variable 'omit' from source: magic vars 44071 1727204675.91010: variable 'ansible_distribution_major_version' from source: facts 44071 1727204675.91021: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204675.91030: variable 'omit' from source: magic vars 44071 1727204675.91055: variable 'omit' from source: magic vars 44071 1727204675.91114: variable 'item' from source: unknown 44071 1727204675.91197: variable 'item' from source: unknown 44071 1727204675.91228: variable 'omit' from source: magic vars 44071 
1727204675.91260: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204675.91323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204675.91327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204675.91329: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204675.91334: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.91345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204675.91458: Set connection var ansible_connection to ssh 44071 1727204675.91472: Set connection var ansible_timeout to 10 44071 1727204675.91483: Set connection var ansible_pipelining to False 44071 1727204675.91492: Set connection var ansible_shell_type to sh 44071 1727204675.91543: Set connection var ansible_shell_executable to /bin/sh 44071 1727204675.91546: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204675.91549: variable 'ansible_shell_executable' from source: unknown 44071 1727204675.91551: variable 'ansible_connection' from source: unknown 44071 1727204675.91558: variable 'ansible_module_compression' from source: unknown 44071 1727204675.91569: variable 'ansible_shell_type' from source: unknown 44071 1727204675.91578: variable 'ansible_shell_executable' from source: unknown 44071 1727204675.91589: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.91607: variable 'ansible_pipelining' from source: unknown 44071 1727204675.91651: variable 'ansible_timeout' from source: unknown 44071 1727204675.91654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204675.91870: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204675.91875: variable 'omit' from source: magic vars 44071 1727204675.91881: starting attempt loop 44071 1727204675.91883: running the handler 44071 1727204675.91886: variable 'lsr_test' from source: include params 44071 1727204675.91950: variable 'lsr_test' from source: include params 44071 1727204675.91985: handler run complete 44071 1727204675.92006: attempt loop complete, returning result 44071 1727204675.92030: variable 'item' from source: unknown 44071 1727204675.92114: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/remove+down_profile.yml" ] } 44071 1727204675.92410: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.92415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204675.92418: variable 'omit' from source: magic vars 44071 1727204675.92655: variable 'ansible_distribution_major_version' from source: facts 44071 1727204675.92659: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204675.92661: variable 'omit' from source: magic vars 44071 1727204675.92668: variable 'omit' from source: magic vars 44071 1727204675.92671: variable 'item' 
from source: unknown 44071 1727204675.92734: variable 'item' from source: unknown 44071 1727204675.92991: variable 'omit' from source: magic vars 44071 1727204675.92995: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204675.92998: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204675.93000: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204675.93002: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204675.93004: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.93006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204675.93107: Set connection var ansible_connection to ssh 44071 1727204675.93122: Set connection var ansible_timeout to 10 44071 1727204675.93135: Set connection var ansible_pipelining to False 44071 1727204675.93146: Set connection var ansible_shell_type to sh 44071 1727204675.93175: Set connection var ansible_shell_executable to /bin/sh 44071 1727204675.93193: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204675.93226: variable 'ansible_shell_executable' from source: unknown 44071 1727204675.93237: variable 'ansible_connection' from source: unknown 44071 1727204675.93245: variable 'ansible_module_compression' from source: unknown 44071 1727204675.93252: variable 'ansible_shell_type' from source: unknown 44071 1727204675.93259: variable 'ansible_shell_executable' from source: unknown 44071 1727204675.93275: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.93285: variable 'ansible_pipelining' from source: unknown 44071 1727204675.93295: variable 'ansible_timeout' from source: unknown 44071 1727204675.93312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204675.93445: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204675.93521: variable 'omit' from source: magic vars 44071 1727204675.93529: starting attempt loop 44071 1727204675.93535: running the handler 44071 1727204675.93537: variable 'lsr_assert' from source: include params 44071 1727204675.93596: variable 'lsr_assert' from source: include params 44071 1727204675.93624: handler run complete 44071 1727204675.93655: attempt loop complete, returning result 44071 1727204675.93680: variable 'item' from source: unknown 44071 1727204675.93764: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_profile_absent.yml" ] } 44071 1727204675.94007: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.94011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204675.94013: variable 'omit' from source: magic vars 44071 1727204675.94250: variable 'ansible_distribution_major_version' from source: facts 44071 1727204675.94262: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 
1727204675.94279: variable 'omit' from source: magic vars 44071 1727204675.94315: variable 'omit' from source: magic vars 44071 1727204675.94409: variable 'item' from source: unknown 44071 1727204675.94474: variable 'item' from source: unknown 44071 1727204675.94495: variable 'omit' from source: magic vars 44071 1727204675.94537: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204675.94626: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204675.94629: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204675.94634: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204675.94637: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.94640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204675.94714: Set connection var ansible_connection to ssh 44071 1727204675.94726: Set connection var ansible_timeout to 10 44071 1727204675.94774: Set connection var ansible_pipelining to False 44071 1727204675.94786: Set connection var ansible_shell_type to sh 44071 1727204675.94803: Set connection var ansible_shell_executable to /bin/sh 44071 1727204675.94817: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204675.94853: variable 'ansible_shell_executable' from source: unknown 44071 1727204675.94861: variable 'ansible_connection' from source: unknown 44071 1727204675.94872: variable 'ansible_module_compression' from source: unknown 44071 1727204675.94880: variable 'ansible_shell_type' from source: unknown 44071 1727204675.94951: variable 'ansible_shell_executable' from source: unknown 44071 1727204675.94954: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.94957: variable 'ansible_pipelining' from source: unknown 44071 1727204675.94959: variable 'ansible_timeout' from source: unknown 44071 1727204675.94961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204675.95044: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204675.95063: variable 'omit' from source: magic vars 44071 1727204675.95273: starting attempt loop 44071 1727204675.95276: running the handler 44071 1727204675.95280: variable 'lsr_assert_when' from source: include params 44071 1727204675.95384: variable 'lsr_assert_when' from source: include params 44071 1727204675.95619: variable 'network_provider' from source: set_fact 44071 1727204675.95725: handler run complete 44071 1727204675.95754: attempt loop complete, returning result 44071 1727204675.95943: variable 'item' from source: unknown 44071 1727204675.95947: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": [ { "condition": true, "what": "tasks/assert_device_absent.yml" } ] } 44071 1727204675.96400: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.96403: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node2' 44071 1727204675.96407: variable 'omit' from source: magic vars 44071 1727204675.96857: variable 'ansible_distribution_major_version' from source: facts 44071 1727204675.96887: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204675.96897: variable 'omit' from source: magic vars 44071 1727204675.96918: variable 'omit' from source: magic vars 44071 1727204675.97045: variable 'item' from source: unknown 44071 1727204675.97376: variable 'item' from source: unknown 44071 1727204675.97486: variable 'omit' from source: magic vars 44071 1727204675.97489: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204675.97492: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204675.97495: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204675.97497: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204675.97499: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.97501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204675.97705: Set connection var ansible_connection to ssh 44071 1727204675.97719: Set connection var ansible_timeout to 10 44071 1727204675.97729: Set connection var ansible_pipelining to False 44071 1727204675.97742: Set connection var ansible_shell_type to sh 44071 1727204675.97753: Set connection var ansible_shell_executable to /bin/sh 44071 1727204675.97769: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204675.97798: variable 'ansible_shell_executable' from source: unknown 44071 1727204675.97877: variable 'ansible_connection' from source: unknown 44071 1727204675.97885: variable 'ansible_module_compression' from source: unknown 44071 1727204675.97893: variable 'ansible_shell_type' from source: unknown 44071 1727204675.97900: variable 'ansible_shell_executable' from source: unknown 44071 1727204675.97908: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.97924: variable 'ansible_pipelining' from source: unknown 44071 1727204675.97935: variable 'ansible_timeout' from source: unknown 44071 1727204675.97945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204675.98179: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204675.98196: variable 'omit' from source: magic vars 44071 1727204675.98256: starting attempt loop 44071 1727204675.98264: running the handler 44071 1727204675.98293: variable 'lsr_fail_debug' from source: play vars 44071 1727204675.98446: variable 'lsr_fail_debug' from source: play vars 44071 1727204675.98686: handler run complete 44071 1727204675.98689: attempt loop complete, returning result 44071 1727204675.98692: variable 'item' from source: unknown 44071 1727204675.98835: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ 
"__network_connections_result" ] } 44071 1727204675.99136: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204675.99139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204675.99141: variable 'omit' from source: magic vars 44071 1727204675.99772: variable 'ansible_distribution_major_version' from source: facts 44071 1727204675.99776: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204675.99778: variable 'omit' from source: magic vars 44071 1727204675.99780: variable 'omit' from source: magic vars 44071 1727204675.99782: variable 'item' from source: unknown 44071 1727204675.99971: variable 'item' from source: unknown 44071 1727204675.99974: variable 'omit' from source: magic vars 44071 1727204675.99977: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204675.99986: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204675.99994: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204676.00009: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204676.00012: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204676.00015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204676.00170: Set connection var ansible_connection to ssh 44071 1727204676.00174: Set connection var ansible_timeout to 10 44071 1727204676.00177: Set connection var ansible_pipelining to False 44071 1727204676.00179: Set connection var ansible_shell_type to sh 44071 1727204676.00182: Set connection var ansible_shell_executable to /bin/sh 44071 1727204676.00184: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204676.00186: variable 'ansible_shell_executable' from source: unknown 44071 1727204676.00187: variable 'ansible_connection' from source: unknown 44071 1727204676.00189: variable 'ansible_module_compression' from source: unknown 44071 1727204676.00191: variable 'ansible_shell_type' from source: unknown 44071 1727204676.00193: variable 'ansible_shell_executable' from source: unknown 44071 1727204676.00195: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204676.00197: variable 'ansible_pipelining' from source: unknown 44071 1727204676.00203: variable 'ansible_timeout' from source: unknown 44071 1727204676.00205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204676.00294: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204676.00302: variable 'omit' from source: magic vars 44071 1727204676.00306: starting attempt loop 44071 1727204676.00315: running the handler 44071 1727204676.00338: variable 'lsr_cleanup' from source: include params 44071 1727204676.00472: variable 'lsr_cleanup' from source: include params 44071 1727204676.00475: handler run complete 44071 1727204676.00478: attempt loop complete, returning result 44071 1727204676.00480: variable 'item' from source: unknown 44071 
1727204676.00553: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 44071 1727204676.00662: dumping result to json 44071 1727204676.00768: done dumping result, returning 44071 1727204676.00773: done running TaskExecutor() for managed-node2/TASK: Show item [127b8e07-fff9-c964-7471-000000001745] 44071 1727204676.00777: sending task result for task 127b8e07-fff9-c964-7471-000000001745 44071 1727204676.00833: done sending task result for task 127b8e07-fff9-c964-7471-000000001745 44071 1727204676.00836: WORKER PROCESS EXITING 44071 1727204676.00905: no more pending results, returning what we have 44071 1727204676.00909: results queue empty 44071 1727204676.00910: checking for any_errors_fatal 44071 1727204676.00917: done checking for any_errors_fatal 44071 1727204676.00918: checking for max_fail_percentage 44071 1727204676.00920: done checking for max_fail_percentage 44071 1727204676.00921: checking to see if all hosts have failed and the running result is not ok 44071 1727204676.00922: done checking to see if all hosts have failed 44071 1727204676.00922: getting the remaining hosts for this loop 44071 1727204676.00924: done getting the remaining hosts for this loop 44071 1727204676.00929: getting the next task for host managed-node2 44071 1727204676.00940: done getting next task for host managed-node2 44071 1727204676.00944: ^ task is: TASK: Include the task 'show_interfaces.yml' 44071 1727204676.00947: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204676.00951: getting variables 44071 1727204676.00953: in VariableManager get_vars() 44071 1727204676.00999: Calling all_inventory to load vars for managed-node2 44071 1727204676.01003: Calling groups_inventory to load vars for managed-node2 44071 1727204676.01007: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204676.01022: Calling all_plugins_play to load vars for managed-node2 44071 1727204676.01025: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204676.01028: Calling groups_plugins_play to load vars for managed-node2 44071 1727204676.06349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204676.08871: done with get_vars() 44071 1727204676.08921: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Tuesday 24 September 2024 15:04:36 -0400 (0:00:00.244) 0:01:28.406 ***** 44071 1727204676.09042: entering _queue_task() for managed-node2/include_tasks 44071 1727204676.09542: worker is 1 (out of 1 available) 44071 1727204676.09562: exiting _queue_task() for managed-node2/include_tasks 44071 1727204676.09735: done queuing things up, now waiting for results queue to drain 44071 1727204676.09737: waiting for pending results... 44071 1727204676.10834: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 44071 1727204676.11274: in run() - task 127b8e07-fff9-c964-7471-000000001746 44071 1727204676.11279: variable 'ansible_search_path' from source: unknown 44071 1727204676.11283: variable 'ansible_search_path' from source: unknown 44071 1727204676.11286: calling self._execute() 44071 1727204676.11364: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204676.11419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204676.11433: variable 'omit' from source: magic vars 44071 1727204676.12357: variable 'ansible_distribution_major_version' from source: facts 44071 1727204676.12409: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204676.12418: _execute() done 44071 1727204676.12421: dumping result to json 44071 1727204676.12424: done dumping result, returning 44071 1727204676.12434: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [127b8e07-fff9-c964-7471-000000001746] 44071 1727204676.12442: sending task result for task 127b8e07-fff9-c964-7471-000000001746 44071 1727204676.12559: done sending task result for task 127b8e07-fff9-c964-7471-000000001746 44071 1727204676.12563: WORKER PROCESS EXITING 44071 1727204676.12604: no more pending results, returning what we have 44071 1727204676.12610: in VariableManager get_vars() 44071 1727204676.12668: Calling all_inventory to load vars for managed-node2 44071 1727204676.12672: Calling groups_inventory to load vars for managed-node2 44071 1727204676.12676: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204676.12696: Calling all_plugins_play to load vars for managed-node2 44071 1727204676.12700: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204676.12703: Calling groups_plugins_play to load vars for managed-node2 44071 1727204676.15521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 44071 1727204676.21116: done with get_vars() 44071 1727204676.21317: variable 'ansible_search_path' from source: unknown 44071 1727204676.21320: variable 'ansible_search_path' from source: unknown 44071 1727204676.21545: we have included files to process 44071 1727204676.21547: generating all_blocks data 44071 1727204676.21549: done generating all_blocks data 44071 1727204676.21555: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44071 1727204676.21557: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44071 1727204676.21561: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44071 1727204676.21850: in VariableManager get_vars() 44071 1727204676.21881: done with get_vars() 44071 1727204676.22130: done processing included file 44071 1727204676.22135: iterating over new_blocks loaded from include file 44071 1727204676.22136: in VariableManager get_vars() 44071 1727204676.22155: done with get_vars() 44071 1727204676.22157: filtering new block on tags 44071 1727204676.22212: done filtering new block on tags 44071 1727204676.22215: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 44071 1727204676.22221: extending task lists for all hosts with included blocks 44071 1727204676.23038: done extending task lists 44071 1727204676.23040: done processing included files 44071 1727204676.23041: results queue empty 44071 1727204676.23042: checking for any_errors_fatal 44071 1727204676.23050: done checking for any_errors_fatal 44071 1727204676.23051: checking for max_fail_percentage 44071 1727204676.23053: done checking for max_fail_percentage 44071 1727204676.23053: checking to see if all hosts have failed and the running result is not ok 44071 1727204676.23054: done checking to see if all hosts have failed 44071 1727204676.23055: getting the remaining hosts for this loop 44071 1727204676.23056: done getting the remaining hosts for this loop 44071 1727204676.23272: getting the next task for host managed-node2 44071 1727204676.23280: done getting next task for host managed-node2 44071 1727204676.23283: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 44071 1727204676.23287: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204676.23290: getting variables 44071 1727204676.23291: in VariableManager get_vars() 44071 1727204676.23310: Calling all_inventory to load vars for managed-node2 44071 1727204676.23313: Calling groups_inventory to load vars for managed-node2 44071 1727204676.23316: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204676.23323: Calling all_plugins_play to load vars for managed-node2 44071 1727204676.23326: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204676.23329: Calling groups_plugins_play to load vars for managed-node2 44071 1727204676.26756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204676.31250: done with get_vars() 44071 1727204676.31312: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:04:36 -0400 (0:00:00.223) 0:01:28.630 ***** 44071 1727204676.31430: entering _queue_task() for managed-node2/include_tasks 44071 1727204676.32189: worker is 1 (out of 1 available) 44071 1727204676.32201: exiting _queue_task() for managed-node2/include_tasks 44071 1727204676.32215: done queuing things up, now waiting for results queue to drain 44071 1727204676.32217: waiting for pending results... 44071 1727204676.32903: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 44071 1727204676.32930: in run() - task 127b8e07-fff9-c964-7471-00000000176d 44071 1727204676.32935: variable 'ansible_search_path' from source: unknown 44071 1727204676.32939: variable 'ansible_search_path' from source: unknown 44071 1727204676.32947: calling self._execute() 44071 1727204676.32951: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204676.32954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204676.32959: variable 'omit' from source: magic vars 44071 1727204676.33772: variable 'ansible_distribution_major_version' from source: facts 44071 1727204676.33776: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204676.33780: _execute() done 44071 1727204676.33783: dumping result to json 44071 1727204676.33785: done dumping result, returning 44071 1727204676.33787: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [127b8e07-fff9-c964-7471-00000000176d] 44071 1727204676.33789: sending task result for task 127b8e07-fff9-c964-7471-00000000176d 44071 1727204676.33880: done sending task result for task 127b8e07-fff9-c964-7471-00000000176d 44071 1727204676.33884: WORKER PROCESS EXITING 44071 1727204676.33917: no more pending results, returning what we have 44071 1727204676.34040: in VariableManager get_vars() 44071 1727204676.34098: Calling all_inventory to load vars for managed-node2 44071 1727204676.34102: Calling groups_inventory to load vars for managed-node2 44071 1727204676.34107: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204676.34125: Calling all_plugins_play to load vars for managed-node2 44071 1727204676.34129: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204676.34136: Calling groups_plugins_play to load vars for managed-node2 44071 1727204676.39170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 44071 1727204676.43593: done with get_vars() 44071 1727204676.43639: variable 'ansible_search_path' from source: unknown 44071 1727204676.43641: variable 'ansible_search_path' from source: unknown 44071 1727204676.43836: we have included files to process 44071 1727204676.43838: generating all_blocks data 44071 1727204676.43840: done generating all_blocks data 44071 1727204676.43842: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44071 1727204676.43843: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44071 1727204676.43846: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44071 1727204676.44715: done processing included file 44071 1727204676.44718: iterating over new_blocks loaded from include file 44071 1727204676.44720: in VariableManager get_vars() 44071 1727204676.44745: done with get_vars() 44071 1727204676.44747: filtering new block on tags 44071 1727204676.44868: done filtering new block on tags 44071 1727204676.44872: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node2 44071 1727204676.44879: extending task lists for all hosts with included blocks 44071 1727204676.45196: done extending task lists 44071 1727204676.45198: done processing included files 44071 1727204676.45198: results queue empty 44071 1727204676.45199: checking for any_errors_fatal 44071 1727204676.45203: done checking for any_errors_fatal 44071 1727204676.45204: checking for max_fail_percentage 44071 1727204676.45205: done checking for max_fail_percentage 44071 1727204676.45206: checking to see if all hosts have failed and the running result is not ok 44071 1727204676.45207: done checking to see if all hosts have failed 44071 1727204676.45208: getting the remaining hosts for this loop 44071 1727204676.45209: done getting the remaining hosts for this loop 44071 1727204676.45212: getting the next task for host managed-node2 44071 1727204676.45218: done getting next task for host managed-node2 44071 1727204676.45220: ^ task is: TASK: Gather current interface info 44071 1727204676.45225: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 44071 1727204676.45227: getting variables 44071 1727204676.45229: in VariableManager get_vars() 44071 1727204676.45248: Calling all_inventory to load vars for managed-node2 44071 1727204676.45251: Calling groups_inventory to load vars for managed-node2 44071 1727204676.45253: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204676.45261: Calling all_plugins_play to load vars for managed-node2 44071 1727204676.45264: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204676.45269: Calling groups_plugins_play to load vars for managed-node2 44071 1727204676.49349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204676.54626: done with get_vars() 44071 1727204676.54824: done getting variables 44071 1727204676.54882: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:04:36 -0400 (0:00:00.237) 0:01:28.868 ***** 44071 1727204676.55181: entering _queue_task() for managed-node2/command 44071 1727204676.56216: worker is 1 (out of 1 available) 44071 1727204676.56229: exiting _queue_task() for managed-node2/command 44071 1727204676.56572: done queuing things up, now waiting for results queue to drain 44071 1727204676.56575: waiting for pending results... 
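For reference, the task driving this step (get_current_interfaces.yml:3) plausibly amounts to the minimal sketch below, reconstructed from the module arguments and the registered variable name that appear later in this trace; the exact contents of the collection's file are an assumption:

- name: Gather current interface info
  # runs `ls -1` inside /sys/class/net, matching the module_args logged further down
  command: ls -1
  args:
    chdir: /sys/class/net
  # register name inferred from the later "Set current_interfaces" step
  register: _current_interfaces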
44071 1727204676.56987: running TaskExecutor() for managed-node2/TASK: Gather current interface info 44071 1727204676.57132: in run() - task 127b8e07-fff9-c964-7471-0000000017a8 44071 1727204676.57472: variable 'ansible_search_path' from source: unknown 44071 1727204676.57476: variable 'ansible_search_path' from source: unknown 44071 1727204676.57479: calling self._execute() 44071 1727204676.57848: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204676.57863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204676.57883: variable 'omit' from source: magic vars 44071 1727204676.58738: variable 'ansible_distribution_major_version' from source: facts 44071 1727204676.58761: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204676.58850: variable 'omit' from source: magic vars 44071 1727204676.59172: variable 'omit' from source: magic vars 44071 1727204676.59176: variable 'omit' from source: magic vars 44071 1727204676.59219: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204676.59303: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204676.59333: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204676.59571: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204676.59575: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204676.59578: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204676.59580: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204676.59583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204676.59795: Set connection var ansible_connection to ssh 44071 1727204676.59810: Set connection var ansible_timeout to 10 44071 1727204676.59882: Set connection var ansible_pipelining to False 44071 1727204676.59894: Set connection var ansible_shell_type to sh 44071 1727204676.59905: Set connection var ansible_shell_executable to /bin/sh 44071 1727204676.59919: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204676.59953: variable 'ansible_shell_executable' from source: unknown 44071 1727204676.60179: variable 'ansible_connection' from source: unknown 44071 1727204676.60183: variable 'ansible_module_compression' from source: unknown 44071 1727204676.60186: variable 'ansible_shell_type' from source: unknown 44071 1727204676.60189: variable 'ansible_shell_executable' from source: unknown 44071 1727204676.60191: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204676.60193: variable 'ansible_pipelining' from source: unknown 44071 1727204676.60195: variable 'ansible_timeout' from source: unknown 44071 1727204676.60197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204676.60361: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204676.60492: variable 'omit' from source: magic vars 44071 
1727204676.60504: starting attempt loop 44071 1727204676.60512: running the handler 44071 1727204676.60536: _low_level_execute_command(): starting 44071 1727204676.60550: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204676.62504: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204676.62605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204676.62618: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204676.63035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204676.63116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204676.64881: stdout chunk (state=3): >>>/root <<< 44071 1727204676.65016: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204676.65153: stderr chunk (state=3): >>><<< 44071 1727204676.65379: stdout chunk (state=3): >>><<< 44071 1727204676.65605: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204676.65609: _low_level_execute_command(): starting 44071 1727204676.65612: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204676.655017-49221-117277932685530 `" && echo ansible-tmp-1727204676.655017-49221-117277932685530="` echo 
/root/.ansible/tmp/ansible-tmp-1727204676.655017-49221-117277932685530 `" ) && sleep 0' 44071 1727204676.67587: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204676.67592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204676.67621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204676.67647: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204676.67871: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204676.67922: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204676.67937: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204676.68050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204676.70248: stdout chunk (state=3): >>>ansible-tmp-1727204676.655017-49221-117277932685530=/root/.ansible/tmp/ansible-tmp-1727204676.655017-49221-117277932685530 <<< 44071 1727204676.70271: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204676.70423: stderr chunk (state=3): >>><<< 44071 1727204676.70427: stdout chunk (state=3): >>><<< 44071 1727204676.70436: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204676.655017-49221-117277932685530=/root/.ansible/tmp/ansible-tmp-1727204676.655017-49221-117277932685530 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204676.70671: variable 'ansible_module_compression' from source: unknown 44071 1727204676.70675: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44071 1727204676.70723: variable 'ansible_facts' from source: unknown 44071 1727204676.71193: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204676.655017-49221-117277932685530/AnsiballZ_command.py 44071 1727204676.71280: Sending initial data 44071 1727204676.71482: Sent initial data (155 bytes) 44071 1727204676.72891: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204676.73019: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204676.73064: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204676.73128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204676.74735: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204676.74845: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204676.74971: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpcp0g2ds6 /root/.ansible/tmp/ansible-tmp-1727204676.655017-49221-117277932685530/AnsiballZ_command.py <<< 44071 1727204676.75080: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204676.655017-49221-117277932685530/AnsiballZ_command.py" <<< 44071 1727204676.75100: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 44071 1727204676.75132: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpcp0g2ds6" to remote "/root/.ansible/tmp/ansible-tmp-1727204676.655017-49221-117277932685530/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204676.655017-49221-117277932685530/AnsiballZ_command.py" <<< 44071 1727204676.77192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204676.77269: stderr chunk (state=3): >>><<< 44071 1727204676.77292: stdout chunk (state=3): >>><<< 44071 1727204676.77323: done transferring module to remote 44071 1727204676.77462: _low_level_execute_command(): starting 44071 1727204676.77572: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204676.655017-49221-117277932685530/ /root/.ansible/tmp/ansible-tmp-1727204676.655017-49221-117277932685530/AnsiballZ_command.py && sleep 0' 44071 1727204676.79002: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204676.79018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204676.79153: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204676.79171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204676.79219: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204676.79286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204676.84512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204676.84652: stderr chunk (state=3): >>><<< 44071 1727204676.84681: stdout chunk (state=3): >>><<< 44071 1727204676.84886: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204676.84890: _low_level_execute_command(): starting 44071 1727204676.84893: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204676.655017-49221-117277932685530/AnsiballZ_command.py && sleep 0' 44071 1727204676.86189: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204676.86206: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204676.86334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204676.86609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204676.86695: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204677.03619: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:04:37.030554", "end": "2024-09-24 15:04:37.034411", "delta": "0:00:00.003857", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44071 1727204677.05211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204677.05228: stdout chunk (state=3): >>><<< 44071 1727204677.05242: stderr chunk (state=3): >>><<< 44071 1727204677.05293: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:04:37.030554", "end": "2024-09-24 15:04:37.034411", "delta": "0:00:00.003857", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
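For orientation before the temp-dir cleanup that follows, the wrapper show_interfaces.yml referenced above (tasks at its lines 3 and 5) plausibly reduces to the two tasks sketched here; this is reconstructed from the task names, paths, and the debug message that appears later in the trace, so the exact wording of the file is an assumption:

- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml

- name: Show current_interfaces
  # produces the "current_interfaces: ['bonding_masters', 'eth0', 'lo']" message seen later in this run
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"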
44071 1727204677.05611: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204676.655017-49221-117277932685530/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204677.05616: _low_level_execute_command(): starting 44071 1727204677.05619: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204676.655017-49221-117277932685530/ > /dev/null 2>&1 && sleep 0' 44071 1727204677.06526: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204677.06544: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204677.06583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204677.06683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204677.06704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204677.06720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204677.06748: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204677.06841: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204677.09273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204677.09277: stdout chunk (state=3): >>><<< 44071 1727204677.09280: stderr chunk (state=3): >>><<< 44071 1727204677.09283: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204677.09285: handler run complete 44071 1727204677.09287: Evaluated conditional (False): False 44071 1727204677.09289: attempt loop complete, returning result 44071 1727204677.09291: _execute() done 44071 1727204677.09294: dumping result to json 44071 1727204677.09296: done dumping result, returning 44071 1727204677.09298: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [127b8e07-fff9-c964-7471-0000000017a8] 44071 1727204677.09300: sending task result for task 127b8e07-fff9-c964-7471-0000000017a8 44071 1727204677.09391: done sending task result for task 127b8e07-fff9-c964-7471-0000000017a8 ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003857", "end": "2024-09-24 15:04:37.034411", "rc": 0, "start": "2024-09-24 15:04:37.030554" } STDOUT: bonding_masters eth0 lo 44071 1727204677.09488: no more pending results, returning what we have 44071 1727204677.09492: results queue empty 44071 1727204677.09493: checking for any_errors_fatal 44071 1727204677.09495: done checking for any_errors_fatal 44071 1727204677.09496: checking for max_fail_percentage 44071 1727204677.09498: done checking for max_fail_percentage 44071 1727204677.09499: checking to see if all hosts have failed and the running result is not ok 44071 1727204677.09500: done checking to see if all hosts have failed 44071 1727204677.09501: getting the remaining hosts for this loop 44071 1727204677.09502: done getting the remaining hosts for this loop 44071 1727204677.09508: getting the next task for host managed-node2 44071 1727204677.09520: done getting next task for host managed-node2 44071 1727204677.09523: ^ task is: TASK: Set current_interfaces 44071 1727204677.09530: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204677.09538: getting variables 44071 1727204677.09540: in VariableManager get_vars() 44071 1727204677.09807: Calling all_inventory to load vars for managed-node2 44071 1727204677.09811: Calling groups_inventory to load vars for managed-node2 44071 1727204677.09815: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204677.09828: Calling all_plugins_play to load vars for managed-node2 44071 1727204677.09834: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204677.09838: Calling groups_plugins_play to load vars for managed-node2 44071 1727204677.10459: WORKER PROCESS EXITING 44071 1727204677.12185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204677.15445: done with get_vars() 44071 1727204677.15579: done getting variables 44071 1727204677.15660: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:04:37 -0400 (0:00:00.605) 0:01:29.473 ***** 44071 1727204677.15746: entering _queue_task() for managed-node2/set_fact 44071 1727204677.16375: worker is 1 (out of 1 available) 44071 1727204677.16392: exiting _queue_task() for managed-node2/set_fact 44071 1727204677.16409: done queuing things up, now waiting for results queue to drain 44071 1727204677.16411: waiting for pending results... 
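The 'Set current_interfaces' task queued here (get_current_interfaces.yml:9) plausibly looks like the sketch below; the stdout_lines expression is an assumption, chosen because the fact shown in the next entries matches the stdout of the earlier `ls -1` command line for line:

- name: Set current_interfaces
  set_fact:
    # assumed expression: the registered command output split into lines
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"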
44071 1727204677.17189: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 44071 1727204677.17575: in run() - task 127b8e07-fff9-c964-7471-0000000017a9 44071 1727204677.17581: variable 'ansible_search_path' from source: unknown 44071 1727204677.17586: variable 'ansible_search_path' from source: unknown 44071 1727204677.17593: calling self._execute() 44071 1727204677.17841: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204677.17858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204677.18072: variable 'omit' from source: magic vars 44071 1727204677.18750: variable 'ansible_distribution_major_version' from source: facts 44071 1727204677.19072: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204677.19077: variable 'omit' from source: magic vars 44071 1727204677.19080: variable 'omit' from source: magic vars 44071 1727204677.19220: variable '_current_interfaces' from source: set_fact 44071 1727204677.19446: variable 'omit' from source: magic vars 44071 1727204677.19531: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204677.19715: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204677.19893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204677.19971: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204677.19975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204677.19978: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204677.20079: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204677.20089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204677.20371: Set connection var ansible_connection to ssh 44071 1727204677.20375: Set connection var ansible_timeout to 10 44071 1727204677.20377: Set connection var ansible_pipelining to False 44071 1727204677.20379: Set connection var ansible_shell_type to sh 44071 1727204677.20381: Set connection var ansible_shell_executable to /bin/sh 44071 1727204677.20383: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204677.20592: variable 'ansible_shell_executable' from source: unknown 44071 1727204677.20603: variable 'ansible_connection' from source: unknown 44071 1727204677.20610: variable 'ansible_module_compression' from source: unknown 44071 1727204677.20616: variable 'ansible_shell_type' from source: unknown 44071 1727204677.20623: variable 'ansible_shell_executable' from source: unknown 44071 1727204677.20629: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204677.20636: variable 'ansible_pipelining' from source: unknown 44071 1727204677.20642: variable 'ansible_timeout' from source: unknown 44071 1727204677.20650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204677.21145: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 44071 1727204677.21149: variable 'omit' from source: magic vars 44071 1727204677.21152: starting attempt loop 44071 1727204677.21155: running the handler 44071 1727204677.21157: handler run complete 44071 1727204677.21160: attempt loop complete, returning result 44071 1727204677.21162: _execute() done 44071 1727204677.21164: dumping result to json 44071 1727204677.21170: done dumping result, returning 44071 1727204677.21173: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [127b8e07-fff9-c964-7471-0000000017a9] 44071 1727204677.21175: sending task result for task 127b8e07-fff9-c964-7471-0000000017a9 44071 1727204677.21255: done sending task result for task 127b8e07-fff9-c964-7471-0000000017a9 44071 1727204677.21260: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 44071 1727204677.21346: no more pending results, returning what we have 44071 1727204677.21350: results queue empty 44071 1727204677.21351: checking for any_errors_fatal 44071 1727204677.21363: done checking for any_errors_fatal 44071 1727204677.21364: checking for max_fail_percentage 44071 1727204677.21367: done checking for max_fail_percentage 44071 1727204677.21368: checking to see if all hosts have failed and the running result is not ok 44071 1727204677.21369: done checking to see if all hosts have failed 44071 1727204677.21370: getting the remaining hosts for this loop 44071 1727204677.21372: done getting the remaining hosts for this loop 44071 1727204677.21378: getting the next task for host managed-node2 44071 1727204677.21389: done getting next task for host managed-node2 44071 1727204677.21392: ^ task is: TASK: Show current_interfaces 44071 1727204677.21397: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204677.21402: getting variables 44071 1727204677.21404: in VariableManager get_vars() 44071 1727204677.21648: Calling all_inventory to load vars for managed-node2 44071 1727204677.21652: Calling groups_inventory to load vars for managed-node2 44071 1727204677.21656: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204677.21704: Calling all_plugins_play to load vars for managed-node2 44071 1727204677.21708: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204677.21713: Calling groups_plugins_play to load vars for managed-node2 44071 1727204677.26450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204677.33215: done with get_vars() 44071 1727204677.33353: done getting variables 44071 1727204677.33586: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:04:37 -0400 (0:00:00.178) 0:01:29.652 ***** 44071 1727204677.33785: entering _queue_task() for managed-node2/debug 44071 1727204677.34602: worker is 1 (out of 1 available) 44071 1727204677.34618: exiting _queue_task() for managed-node2/debug 44071 1727204677.34636: done queuing things up, now waiting for results queue to drain 44071 1727204677.34638: waiting for pending results... 44071 1727204677.35286: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 44071 1727204677.35398: in run() - task 127b8e07-fff9-c964-7471-00000000176e 44071 1727204677.35673: variable 'ansible_search_path' from source: unknown 44071 1727204677.35677: variable 'ansible_search_path' from source: unknown 44071 1727204677.35680: calling self._execute() 44071 1727204677.36073: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204677.36079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204677.36083: variable 'omit' from source: magic vars 44071 1727204677.36993: variable 'ansible_distribution_major_version' from source: facts 44071 1727204677.37024: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204677.37040: variable 'omit' from source: magic vars 44071 1727204677.37497: variable 'omit' from source: magic vars 44071 1727204677.37504: variable 'current_interfaces' from source: set_fact 44071 1727204677.38072: variable 'omit' from source: magic vars 44071 1727204677.38077: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204677.38080: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204677.38082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204677.38085: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204677.38087: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204677.38187: 
variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204677.38289: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204677.38299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204677.38443: Set connection var ansible_connection to ssh 44071 1727204677.38583: Set connection var ansible_timeout to 10 44071 1727204677.38595: Set connection var ansible_pipelining to False 44071 1727204677.38656: Set connection var ansible_shell_type to sh 44071 1727204677.38669: Set connection var ansible_shell_executable to /bin/sh 44071 1727204677.38684: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204677.38806: variable 'ansible_shell_executable' from source: unknown 44071 1727204677.38821: variable 'ansible_connection' from source: unknown 44071 1727204677.38831: variable 'ansible_module_compression' from source: unknown 44071 1727204677.38839: variable 'ansible_shell_type' from source: unknown 44071 1727204677.38847: variable 'ansible_shell_executable' from source: unknown 44071 1727204677.38854: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204677.38861: variable 'ansible_pipelining' from source: unknown 44071 1727204677.38906: variable 'ansible_timeout' from source: unknown 44071 1727204677.38917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204677.39453: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204677.39458: variable 'omit' from source: magic vars 44071 1727204677.39461: starting attempt loop 44071 1727204677.39464: running the handler 44071 1727204677.39572: handler run complete 44071 1727204677.39603: attempt loop complete, returning result 44071 1727204677.39656: _execute() done 44071 1727204677.39664: dumping result to json 44071 1727204677.39674: done dumping result, returning 44071 1727204677.39686: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [127b8e07-fff9-c964-7471-00000000176e] 44071 1727204677.39695: sending task result for task 127b8e07-fff9-c964-7471-00000000176e 44071 1727204677.40110: done sending task result for task 127b8e07-fff9-c964-7471-00000000176e 44071 1727204677.40117: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 44071 1727204677.40177: no more pending results, returning what we have 44071 1727204677.40180: results queue empty 44071 1727204677.40181: checking for any_errors_fatal 44071 1727204677.40190: done checking for any_errors_fatal 44071 1727204677.40191: checking for max_fail_percentage 44071 1727204677.40194: done checking for max_fail_percentage 44071 1727204677.40195: checking to see if all hosts have failed and the running result is not ok 44071 1727204677.40195: done checking to see if all hosts have failed 44071 1727204677.40196: getting the remaining hosts for this loop 44071 1727204677.40198: done getting the remaining hosts for this loop 44071 1727204677.40202: getting the next task for host managed-node2 44071 1727204677.40212: done getting next task for host managed-node2 44071 1727204677.40216: ^ task is: TASK: Setup 44071 1727204677.40221: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204677.40225: getting variables 44071 1727204677.40226: in VariableManager get_vars() 44071 1727204677.40273: Calling all_inventory to load vars for managed-node2 44071 1727204677.40276: Calling groups_inventory to load vars for managed-node2 44071 1727204677.40280: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204677.40293: Calling all_plugins_play to load vars for managed-node2 44071 1727204677.40296: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204677.40298: Calling groups_plugins_play to load vars for managed-node2 44071 1727204677.63759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204677.66262: done with get_vars() 44071 1727204677.66313: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Tuesday 24 September 2024 15:04:37 -0400 (0:00:00.332) 0:01:29.985 ***** 44071 1727204677.66869: entering _queue_task() for managed-node2/include_tasks 44071 1727204677.67305: worker is 1 (out of 1 available) 44071 1727204677.67320: exiting _queue_task() for managed-node2/include_tasks 44071 1727204677.67583: done queuing things up, now waiting for results queue to drain 44071 1727204677.67585: waiting for pending results... 
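The 'Setup' task queued here (run_test.yml:24) behaves as an include_tasks loop over lsr_setup, which the following entries resolve to tasks/create_bridge_profile.yml, tasks/activate_profile.yml and tasks/remove_profile.yml. A minimal sketch follows; the loop keyword itself is an assumption (the file could equally use with_items):

- name: Setup
  include_tasks: "{{ item }}"
  # lsr_setup comes from include params, per the variable-source entries above
  loop: "{{ lsr_setup }}"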
44071 1727204677.67693: running TaskExecutor() for managed-node2/TASK: Setup 44071 1727204677.67962: in run() - task 127b8e07-fff9-c964-7471-000000001747 44071 1727204677.68011: variable 'ansible_search_path' from source: unknown 44071 1727204677.68140: variable 'ansible_search_path' from source: unknown 44071 1727204677.68144: variable 'lsr_setup' from source: include params 44071 1727204677.68705: variable 'lsr_setup' from source: include params 44071 1727204677.68925: variable 'omit' from source: magic vars 44071 1727204677.69340: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204677.69345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204677.69413: variable 'omit' from source: magic vars 44071 1727204677.69925: variable 'ansible_distribution_major_version' from source: facts 44071 1727204677.69945: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204677.69967: variable 'item' from source: unknown 44071 1727204677.70055: variable 'item' from source: unknown 44071 1727204677.70120: variable 'item' from source: unknown 44071 1727204677.70216: variable 'item' from source: unknown 44071 1727204677.70626: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204677.70629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204677.70632: variable 'omit' from source: magic vars 44071 1727204677.70784: variable 'ansible_distribution_major_version' from source: facts 44071 1727204677.70836: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204677.70844: variable 'item' from source: unknown 44071 1727204677.70903: variable 'item' from source: unknown 44071 1727204677.70970: variable 'item' from source: unknown 44071 1727204677.71074: variable 'item' from source: unknown 44071 1727204677.71340: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204677.71344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204677.71347: variable 'omit' from source: magic vars 44071 1727204677.71484: variable 'ansible_distribution_major_version' from source: facts 44071 1727204677.71560: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204677.71564: variable 'item' from source: unknown 44071 1727204677.71593: variable 'item' from source: unknown 44071 1727204677.71640: variable 'item' from source: unknown 44071 1727204677.71722: variable 'item' from source: unknown 44071 1727204677.71928: dumping result to json 44071 1727204677.71935: done dumping result, returning 44071 1727204677.71938: done running TaskExecutor() for managed-node2/TASK: Setup [127b8e07-fff9-c964-7471-000000001747] 44071 1727204677.71942: sending task result for task 127b8e07-fff9-c964-7471-000000001747 44071 1727204677.71995: done sending task result for task 127b8e07-fff9-c964-7471-000000001747 44071 1727204677.71998: WORKER PROCESS EXITING 44071 1727204677.72081: no more pending results, returning what we have 44071 1727204677.72087: in VariableManager get_vars() 44071 1727204677.72269: Calling all_inventory to load vars for managed-node2 44071 1727204677.72273: Calling groups_inventory to load vars for managed-node2 44071 1727204677.72277: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204677.72293: Calling all_plugins_play to load vars for managed-node2 44071 1727204677.72296: Calling groups_plugins_inventory to load vars for 
managed-node2 44071 1727204677.72300: Calling groups_plugins_play to load vars for managed-node2 44071 1727204677.75439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204677.82735: done with get_vars() 44071 1727204677.82780: variable 'ansible_search_path' from source: unknown 44071 1727204677.82781: variable 'ansible_search_path' from source: unknown 44071 1727204677.82957: variable 'ansible_search_path' from source: unknown 44071 1727204677.82958: variable 'ansible_search_path' from source: unknown 44071 1727204677.82996: variable 'ansible_search_path' from source: unknown 44071 1727204677.82998: variable 'ansible_search_path' from source: unknown 44071 1727204677.83153: we have included files to process 44071 1727204677.83155: generating all_blocks data 44071 1727204677.83157: done generating all_blocks data 44071 1727204677.83162: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 44071 1727204677.83164: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 44071 1727204677.83169: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 44071 1727204677.83819: done processing included file 44071 1727204677.83822: iterating over new_blocks loaded from include file 44071 1727204677.83823: in VariableManager get_vars() 44071 1727204677.83844: done with get_vars() 44071 1727204677.83846: filtering new block on tags 44071 1727204677.84010: done filtering new block on tags 44071 1727204677.84019: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed-node2 => (item=tasks/create_bridge_profile.yml) 44071 1727204677.84029: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 44071 1727204677.84031: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 44071 1727204677.84034: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 44071 1727204677.84348: done processing included file 44071 1727204677.84354: iterating over new_blocks loaded from include file 44071 1727204677.84356: in VariableManager get_vars() 44071 1727204677.84380: done with get_vars() 44071 1727204677.84383: filtering new block on tags 44071 1727204677.84408: done filtering new block on tags 44071 1727204677.84410: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed-node2 => (item=tasks/activate_profile.yml) 44071 1727204677.84415: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 44071 1727204677.84416: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 44071 1727204677.84420: Loading data from 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 44071 1727204677.84773: done processing included file 44071 1727204677.84779: iterating over new_blocks loaded from include file 44071 1727204677.84781: in VariableManager get_vars() 44071 1727204677.84805: done with get_vars() 44071 1727204677.84807: filtering new block on tags 44071 1727204677.84833: done filtering new block on tags 44071 1727204677.84835: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml for managed-node2 => (item=tasks/remove_profile.yml) 44071 1727204677.84840: extending task lists for all hosts with included blocks 44071 1727204677.86146: done extending task lists 44071 1727204677.86154: done processing included files 44071 1727204677.86155: results queue empty 44071 1727204677.86156: checking for any_errors_fatal 44071 1727204677.86161: done checking for any_errors_fatal 44071 1727204677.86162: checking for max_fail_percentage 44071 1727204677.86168: done checking for max_fail_percentage 44071 1727204677.86169: checking to see if all hosts have failed and the running result is not ok 44071 1727204677.86170: done checking to see if all hosts have failed 44071 1727204677.86171: getting the remaining hosts for this loop 44071 1727204677.86172: done getting the remaining hosts for this loop 44071 1727204677.86177: getting the next task for host managed-node2 44071 1727204677.86182: done getting next task for host managed-node2 44071 1727204677.86185: ^ task is: TASK: Include network role 44071 1727204677.86188: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204677.86190: getting variables 44071 1727204677.86192: in VariableManager get_vars() 44071 1727204677.86211: Calling all_inventory to load vars for managed-node2 44071 1727204677.86214: Calling groups_inventory to load vars for managed-node2 44071 1727204677.86217: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204677.86224: Calling all_plugins_play to load vars for managed-node2 44071 1727204677.86231: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204677.86236: Calling groups_plugins_play to load vars for managed-node2 44071 1727204677.88925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204677.92331: done with get_vars() 44071 1727204677.92377: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Tuesday 24 September 2024 15:04:37 -0400 (0:00:00.256) 0:01:30.241 ***** 44071 1727204677.92483: entering _queue_task() for managed-node2/include_role 44071 1727204677.93084: worker is 1 (out of 1 available) 44071 1727204677.93098: exiting _queue_task() for managed-node2/include_role 44071 1727204677.93111: done queuing things up, now waiting for results queue to drain 44071 1727204677.93113: waiting for pending results... 44071 1727204677.93522: running TaskExecutor() for managed-node2/TASK: Include network role 44071 1727204677.93658: in run() - task 127b8e07-fff9-c964-7471-0000000017d0 44071 1727204677.93708: variable 'ansible_search_path' from source: unknown 44071 1727204677.93713: variable 'ansible_search_path' from source: unknown 44071 1727204677.93808: calling self._execute() 44071 1727204677.93900: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204677.93919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204677.93941: variable 'omit' from source: magic vars 44071 1727204677.94808: variable 'ansible_distribution_major_version' from source: facts 44071 1727204677.94813: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204677.94816: _execute() done 44071 1727204677.94820: dumping result to json 44071 1727204677.94823: done dumping result, returning 44071 1727204677.94826: done running TaskExecutor() for managed-node2/TASK: Include network role [127b8e07-fff9-c964-7471-0000000017d0] 44071 1727204677.94828: sending task result for task 127b8e07-fff9-c964-7471-0000000017d0 44071 1727204677.94932: done sending task result for task 127b8e07-fff9-c964-7471-0000000017d0 44071 1727204677.94936: WORKER PROCESS EXITING 44071 1727204677.94978: no more pending results, returning what we have 44071 1727204677.94985: in VariableManager get_vars() 44071 1727204677.95037: Calling all_inventory to load vars for managed-node2 44071 1727204677.95040: Calling groups_inventory to load vars for managed-node2 44071 1727204677.95161: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204677.95183: Calling all_plugins_play to load vars for managed-node2 44071 1727204677.95186: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204677.95190: Calling groups_plugins_play to load vars for managed-node2 44071 1727204677.99092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 
1727204678.01578: done with get_vars() 44071 1727204678.01616: variable 'ansible_search_path' from source: unknown 44071 1727204678.01617: variable 'ansible_search_path' from source: unknown 44071 1727204678.01862: variable 'omit' from source: magic vars 44071 1727204678.01920: variable 'omit' from source: magic vars 44071 1727204678.01937: variable 'omit' from source: magic vars 44071 1727204678.01942: we have included files to process 44071 1727204678.01943: generating all_blocks data 44071 1727204678.01945: done generating all_blocks data 44071 1727204678.01947: processing included file: fedora.linux_system_roles.network 44071 1727204678.01975: in VariableManager get_vars() 44071 1727204678.02016: done with get_vars() 44071 1727204678.02048: in VariableManager get_vars() 44071 1727204678.02070: done with get_vars() 44071 1727204678.02158: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 44071 1727204678.02323: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 44071 1727204678.02431: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 44071 1727204678.03863: in VariableManager get_vars() 44071 1727204678.03895: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204678.07357: iterating over new_blocks loaded from include file 44071 1727204678.07361: in VariableManager get_vars() 44071 1727204678.07390: done with get_vars() 44071 1727204678.07393: filtering new block on tags 44071 1727204678.07729: done filtering new block on tags 44071 1727204678.07734: in VariableManager get_vars() 44071 1727204678.07754: done with get_vars() 44071 1727204678.07756: filtering new block on tags 44071 1727204678.07787: done filtering new block on tags 44071 1727204678.07790: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 44071 1727204678.07796: extending task lists for all hosts with included blocks 44071 1727204678.08088: done extending task lists 44071 1727204678.08090: done processing included files 44071 1727204678.08091: results queue empty 44071 1727204678.08092: checking for any_errors_fatal 44071 1727204678.08096: done checking for any_errors_fatal 44071 1727204678.08137: checking for max_fail_percentage 44071 1727204678.08139: done checking for max_fail_percentage 44071 1727204678.08140: checking to see if all hosts have failed and the running result is not ok 44071 1727204678.08141: done checking to see if all hosts have failed 44071 1727204678.08142: getting the remaining hosts for this loop 44071 1727204678.08143: done getting the remaining hosts for this loop 44071 1727204678.08147: getting the next task for host managed-node2 44071 1727204678.08152: done getting next task for host managed-node2 44071 1727204678.08158: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204678.08162: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204678.08178: getting variables 44071 1727204678.08179: in VariableManager get_vars() 44071 1727204678.08202: Calling all_inventory to load vars for managed-node2 44071 1727204678.08205: Calling groups_inventory to load vars for managed-node2 44071 1727204678.08235: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204678.08244: Calling all_plugins_play to load vars for managed-node2 44071 1727204678.08247: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204678.08250: Calling groups_plugins_play to load vars for managed-node2 44071 1727204678.10205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204678.14218: done with get_vars() 44071 1727204678.14275: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:04:38 -0400 (0:00:00.219) 0:01:30.460 ***** 44071 1727204678.14396: entering _queue_task() for managed-node2/include_tasks 44071 1727204678.14923: worker is 1 (out of 1 available) 44071 1727204678.14938: exiting _queue_task() for managed-node2/include_tasks 44071 1727204678.14953: done queuing things up, now waiting for results queue to drain 44071 1727204678.14955: waiting for pending results... 
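The Include network role step at create_bridge_profile.yml:3 (shown above) is a plain include_role of fedora.linux_system_roles.network; the preceding entries show the role's defaults/main.yml, meta/main.yml and tasks/main.yml being loaded, along with the ansible.builtin.yum to ansible.builtin.dnf action redirect. A minimal sketch of such an include, under the assumption that the bridge profile itself is described by network_connections variables supplied alongside the include; those values are not visible in this part of the log, so they are omitted:

    - name: Include network role
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.network
      # network_connections (the bridge profile definition) would be passed in
      # the real test file; its contents do not appear in this excerpt.
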
44071 1727204678.15469: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204678.15476: in run() - task 127b8e07-fff9-c964-7471-00000000183a 44071 1727204678.15480: variable 'ansible_search_path' from source: unknown 44071 1727204678.15482: variable 'ansible_search_path' from source: unknown 44071 1727204678.15563: calling self._execute() 44071 1727204678.15645: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204678.15660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204678.15685: variable 'omit' from source: magic vars 44071 1727204678.16511: variable 'ansible_distribution_major_version' from source: facts 44071 1727204678.16514: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204678.16518: _execute() done 44071 1727204678.16521: dumping result to json 44071 1727204678.16523: done dumping result, returning 44071 1727204678.16527: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-c964-7471-00000000183a] 44071 1727204678.16529: sending task result for task 127b8e07-fff9-c964-7471-00000000183a 44071 1727204678.16737: no more pending results, returning what we have 44071 1727204678.16744: in VariableManager get_vars() 44071 1727204678.16803: Calling all_inventory to load vars for managed-node2 44071 1727204678.16807: Calling groups_inventory to load vars for managed-node2 44071 1727204678.16809: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204678.16826: Calling all_plugins_play to load vars for managed-node2 44071 1727204678.16829: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204678.16833: Calling groups_plugins_play to load vars for managed-node2 44071 1727204678.17439: done sending task result for task 127b8e07-fff9-c964-7471-00000000183a 44071 1727204678.17443: WORKER PROCESS EXITING 44071 1727204678.19350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204678.22524: done with get_vars() 44071 1727204678.22650: variable 'ansible_search_path' from source: unknown 44071 1727204678.22652: variable 'ansible_search_path' from source: unknown 44071 1727204678.22726: we have included files to process 44071 1727204678.22728: generating all_blocks data 44071 1727204678.22731: done generating all_blocks data 44071 1727204678.22768: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204678.22770: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204678.22774: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204678.24156: done processing included file 44071 1727204678.24159: iterating over new_blocks loaded from include file 44071 1727204678.24161: in VariableManager get_vars() 44071 1727204678.24257: done with get_vars() 44071 1727204678.24260: filtering new block on tags 44071 1727204678.24299: done filtering new block on tags 44071 1727204678.24303: in VariableManager get_vars() 44071 1727204678.24335: done with get_vars() 44071 1727204678.24338: filtering new block on tags 44071 1727204678.24397: done filtering new block on tags 44071 1727204678.24401: in 
VariableManager get_vars() 44071 1727204678.24441: done with get_vars() 44071 1727204678.24443: filtering new block on tags 44071 1727204678.24520: done filtering new block on tags 44071 1727204678.24526: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 44071 1727204678.24576: extending task lists for all hosts with included blocks 44071 1727204678.27027: done extending task lists 44071 1727204678.27029: done processing included files 44071 1727204678.27029: results queue empty 44071 1727204678.27030: checking for any_errors_fatal 44071 1727204678.27034: done checking for any_errors_fatal 44071 1727204678.27035: checking for max_fail_percentage 44071 1727204678.27036: done checking for max_fail_percentage 44071 1727204678.27037: checking to see if all hosts have failed and the running result is not ok 44071 1727204678.27038: done checking to see if all hosts have failed 44071 1727204678.27039: getting the remaining hosts for this loop 44071 1727204678.27041: done getting the remaining hosts for this loop 44071 1727204678.27045: getting the next task for host managed-node2 44071 1727204678.27052: done getting next task for host managed-node2 44071 1727204678.27055: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204678.27062: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204678.27078: getting variables 44071 1727204678.27080: in VariableManager get_vars() 44071 1727204678.27096: Calling all_inventory to load vars for managed-node2 44071 1727204678.27098: Calling groups_inventory to load vars for managed-node2 44071 1727204678.27101: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204678.27108: Calling all_plugins_play to load vars for managed-node2 44071 1727204678.27111: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204678.27114: Calling groups_plugins_play to load vars for managed-node2 44071 1727204678.28362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204678.30390: done with get_vars() 44071 1727204678.30444: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:04:38 -0400 (0:00:00.161) 0:01:30.622 ***** 44071 1727204678.30564: entering _queue_task() for managed-node2/setup 44071 1727204678.31280: worker is 1 (out of 1 available) 44071 1727204678.31293: exiting _queue_task() for managed-node2/setup 44071 1727204678.31307: done queuing things up, now waiting for results queue to drain 44071 1727204678.31309: waiting for pending results... 44071 1727204678.31510: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204678.31697: in run() - task 127b8e07-fff9-c964-7471-000000001897 44071 1727204678.31710: variable 'ansible_search_path' from source: unknown 44071 1727204678.31714: variable 'ansible_search_path' from source: unknown 44071 1727204678.31784: calling self._execute() 44071 1727204678.31915: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204678.31919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204678.31923: variable 'omit' from source: magic vars 44071 1727204678.32442: variable 'ansible_distribution_major_version' from source: facts 44071 1727204678.32446: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204678.32738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204678.36421: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204678.36786: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204678.36831: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204678.37080: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204678.37110: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204678.37445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204678.37480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 44071 1727204678.37510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204678.37555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204678.37770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204678.37776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204678.37871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204678.37875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204678.37879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204678.37882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204678.38057: variable '__network_required_facts' from source: role '' defaults 44071 1727204678.38075: variable 'ansible_facts' from source: unknown 44071 1727204678.39357: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 44071 1727204678.39362: when evaluation is False, skipping this task 44071 1727204678.39367: _execute() done 44071 1727204678.39370: dumping result to json 44071 1727204678.39372: done dumping result, returning 44071 1727204678.39396: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-c964-7471-000000001897] 44071 1727204678.39400: sending task result for task 127b8e07-fff9-c964-7471-000000001897 44071 1727204678.39654: done sending task result for task 127b8e07-fff9-c964-7471-000000001897 44071 1727204678.39658: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204678.39721: no more pending results, returning what we have 44071 1727204678.39725: results queue empty 44071 1727204678.39726: checking for any_errors_fatal 44071 1727204678.39728: done checking for any_errors_fatal 44071 1727204678.39729: checking for max_fail_percentage 44071 1727204678.39731: done checking for max_fail_percentage 44071 1727204678.39734: checking to see if all hosts have failed and the running result is not ok 44071 1727204678.39735: done checking to see if all hosts have failed 44071 1727204678.39735: getting the remaining hosts for this loop 44071 1727204678.39737: done getting the remaining hosts for 
this loop 44071 1727204678.39742: getting the next task for host managed-node2 44071 1727204678.39760: done getting next task for host managed-node2 44071 1727204678.39967: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204678.39976: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204678.40002: getting variables 44071 1727204678.40004: in VariableManager get_vars() 44071 1727204678.40053: Calling all_inventory to load vars for managed-node2 44071 1727204678.40060: Calling groups_inventory to load vars for managed-node2 44071 1727204678.40063: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204678.40077: Calling all_plugins_play to load vars for managed-node2 44071 1727204678.40080: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204678.40089: Calling groups_plugins_play to load vars for managed-node2 44071 1727204678.44026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204678.49080: done with get_vars() 44071 1727204678.49131: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:04:38 -0400 (0:00:00.187) 0:01:30.809 ***** 44071 1727204678.49289: entering _queue_task() for managed-node2/stat 44071 1727204678.49713: worker is 1 (out of 1 available) 44071 1727204678.49728: exiting _queue_task() for managed-node2/stat 44071 1727204678.49747: done queuing things up, now waiting for results queue to drain 44071 1727204678.49749: waiting for pending results... 
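The task at set_facts.yml:3 ran the setup module behind a guard and was skipped because every fact named in __network_required_facts was already gathered: the log records the condition __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 evaluating to False, and the result is censored because the task sets no_log: true. A sketch of that gating pattern, assuming gather_subset is restricted (the exact subset value is not shown in this log):

    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset: min   # assumption; the role limits gathering to what it needs
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
      no_log: true
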
44071 1727204678.50281: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204678.50289: in run() - task 127b8e07-fff9-c964-7471-000000001899 44071 1727204678.50293: variable 'ansible_search_path' from source: unknown 44071 1727204678.50296: variable 'ansible_search_path' from source: unknown 44071 1727204678.50324: calling self._execute() 44071 1727204678.50472: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204678.50477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204678.50482: variable 'omit' from source: magic vars 44071 1727204678.51160: variable 'ansible_distribution_major_version' from source: facts 44071 1727204678.51422: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204678.51860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204678.52598: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204678.52804: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204678.52850: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204678.52901: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204678.53137: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204678.53184: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204678.53227: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204678.53231: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204678.53476: variable '__network_is_ostree' from source: set_fact 44071 1727204678.53481: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204678.53485: when evaluation is False, skipping this task 44071 1727204678.53487: _execute() done 44071 1727204678.53490: dumping result to json 44071 1727204678.53494: done dumping result, returning 44071 1727204678.53509: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-c964-7471-000000001899] 44071 1727204678.53512: sending task result for task 127b8e07-fff9-c964-7471-000000001899 44071 1727204678.53859: done sending task result for task 127b8e07-fff9-c964-7471-000000001899 skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204678.53910: no more pending results, returning what we have 44071 1727204678.53914: results queue empty 44071 1727204678.53916: checking for any_errors_fatal 44071 1727204678.53930: done checking for any_errors_fatal 44071 1727204678.53933: checking for max_fail_percentage 44071 1727204678.53936: done 
checking for max_fail_percentage 44071 1727204678.53937: checking to see if all hosts have failed and the running result is not ok 44071 1727204678.53937: done checking to see if all hosts have failed 44071 1727204678.53938: getting the remaining hosts for this loop 44071 1727204678.53940: done getting the remaining hosts for this loop 44071 1727204678.53945: getting the next task for host managed-node2 44071 1727204678.53954: done getting next task for host managed-node2 44071 1727204678.53958: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204678.53967: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204678.53988: getting variables 44071 1727204678.53990: in VariableManager get_vars() 44071 1727204678.54036: Calling all_inventory to load vars for managed-node2 44071 1727204678.54039: Calling groups_inventory to load vars for managed-node2 44071 1727204678.54042: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204678.54051: Calling all_plugins_play to load vars for managed-node2 44071 1727204678.54054: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204678.54058: Calling groups_plugins_play to load vars for managed-node2 44071 1727204678.54586: WORKER PROCESS EXITING 44071 1727204678.56946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204678.59732: done with get_vars() 44071 1727204678.59771: done getting variables 44071 1727204678.59850: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:04:38 -0400 (0:00:00.106) 0:01:30.915 ***** 44071 1727204678.59900: entering _queue_task() for managed-node2/set_fact 44071 1727204678.60782: worker is 1 (out of 1 available) 44071 1727204678.60794: exiting _queue_task() for managed-node2/set_fact 44071 1727204678.60809: done queuing things up, now waiting for results queue to drain 44071 1727204678.60811: waiting for pending results... 
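The ostree detection in set_facts.yml is a stat task followed by a set_fact task, both guarded by not __network_is_ostree is defined; the stat check was just skipped because the flag is already set, and the following entries show the set_fact step skipped the same way. A sketch of that pair, assuming a marker-file check; the path, register name, and set_fact expression are illustrative only and do not appear in this excerpt:

    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted        # assumed marker path
      register: __ostree_booted_stat    # hypothetical register name
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # illustrative expression
      when: not __network_is_ostree is defined
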
44071 1727204678.61088: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204678.61498: in run() - task 127b8e07-fff9-c964-7471-00000000189a 44071 1727204678.61503: variable 'ansible_search_path' from source: unknown 44071 1727204678.61507: variable 'ansible_search_path' from source: unknown 44071 1727204678.61510: calling self._execute() 44071 1727204678.61583: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204678.61600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204678.61617: variable 'omit' from source: magic vars 44071 1727204678.62114: variable 'ansible_distribution_major_version' from source: facts 44071 1727204678.62149: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204678.62364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204678.62701: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204678.62759: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204678.62811: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204678.62891: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204678.62958: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204678.62998: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204678.63038: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204678.63074: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204678.63197: variable '__network_is_ostree' from source: set_fact 44071 1727204678.63228: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204678.63232: when evaluation is False, skipping this task 44071 1727204678.63234: _execute() done 44071 1727204678.63328: dumping result to json 44071 1727204678.63333: done dumping result, returning 44071 1727204678.63337: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-c964-7471-00000000189a] 44071 1727204678.63339: sending task result for task 127b8e07-fff9-c964-7471-00000000189a 44071 1727204678.63420: done sending task result for task 127b8e07-fff9-c964-7471-00000000189a 44071 1727204678.63424: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204678.63484: no more pending results, returning what we have 44071 1727204678.63489: results queue empty 44071 1727204678.63491: checking for any_errors_fatal 44071 1727204678.63499: done checking for any_errors_fatal 44071 
1727204678.63500: checking for max_fail_percentage 44071 1727204678.63502: done checking for max_fail_percentage 44071 1727204678.63503: checking to see if all hosts have failed and the running result is not ok 44071 1727204678.63504: done checking to see if all hosts have failed 44071 1727204678.63505: getting the remaining hosts for this loop 44071 1727204678.63507: done getting the remaining hosts for this loop 44071 1727204678.63512: getting the next task for host managed-node2 44071 1727204678.63530: done getting next task for host managed-node2 44071 1727204678.63535: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204678.63543: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204678.63569: getting variables 44071 1727204678.63571: in VariableManager get_vars() 44071 1727204678.63626: Calling all_inventory to load vars for managed-node2 44071 1727204678.63629: Calling groups_inventory to load vars for managed-node2 44071 1727204678.63631: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204678.63645: Calling all_plugins_play to load vars for managed-node2 44071 1727204678.63648: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204678.63651: Calling groups_plugins_play to load vars for managed-node2 44071 1727204678.64895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204678.66715: done with get_vars() 44071 1727204678.66762: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:04:38 -0400 (0:00:00.070) 0:01:30.985 ***** 44071 1727204678.66904: entering _queue_task() for managed-node2/service_facts 44071 1727204678.67557: worker is 1 (out of 1 available) 44071 1727204678.67574: exiting _queue_task() for managed-node2/service_facts 44071 1727204678.67601: done queuing things up, now waiting for results queue to drain 44071 1727204678.67607: waiting for pending results... 
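Check which services are running (set_facts.yml:21) is a service_facts task, and the entries that follow show the full module-execution plumbing for it: the ssh connection plugin reusing an existing ControlMaster socket, a remote temp directory being created under /root/.ansible/tmp, the AnsiballZ_service_facts.py payload being copied over sftp, and chmod u+x applied before execution. A minimal sketch of the task itself:

    - name: Check which services are running
      ansible.builtin.service_facts:
      # Populates ansible_facts.services, which the role can inspect afterwards;
      # the specific use is not shown in this excerpt.
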
44071 1727204678.67868: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204678.68012: in run() - task 127b8e07-fff9-c964-7471-00000000189c 44071 1727204678.68034: variable 'ansible_search_path' from source: unknown 44071 1727204678.68038: variable 'ansible_search_path' from source: unknown 44071 1727204678.68076: calling self._execute() 44071 1727204678.68160: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204678.68168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204678.68178: variable 'omit' from source: magic vars 44071 1727204678.68505: variable 'ansible_distribution_major_version' from source: facts 44071 1727204678.68515: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204678.68522: variable 'omit' from source: magic vars 44071 1727204678.68593: variable 'omit' from source: magic vars 44071 1727204678.68619: variable 'omit' from source: magic vars 44071 1727204678.68662: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204678.68697: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204678.68715: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204678.68731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204678.68744: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204678.68769: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204678.68772: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204678.68777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204678.68858: Set connection var ansible_connection to ssh 44071 1727204678.68864: Set connection var ansible_timeout to 10 44071 1727204678.68871: Set connection var ansible_pipelining to False 44071 1727204678.68876: Set connection var ansible_shell_type to sh 44071 1727204678.68882: Set connection var ansible_shell_executable to /bin/sh 44071 1727204678.68889: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204678.68911: variable 'ansible_shell_executable' from source: unknown 44071 1727204678.68914: variable 'ansible_connection' from source: unknown 44071 1727204678.68918: variable 'ansible_module_compression' from source: unknown 44071 1727204678.68920: variable 'ansible_shell_type' from source: unknown 44071 1727204678.68922: variable 'ansible_shell_executable' from source: unknown 44071 1727204678.68925: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204678.68929: variable 'ansible_pipelining' from source: unknown 44071 1727204678.68932: variable 'ansible_timeout' from source: unknown 44071 1727204678.68940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204678.69112: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204678.69127: variable 'omit' from source: magic vars 44071 
1727204678.69131: starting attempt loop 44071 1727204678.69133: running the handler 44071 1727204678.69145: _low_level_execute_command(): starting 44071 1727204678.69153: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204678.69725: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204678.69730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204678.69737: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204678.69804: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204678.69828: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204678.69860: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204678.69963: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204678.71744: stdout chunk (state=3): >>>/root <<< 44071 1727204678.71991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204678.72035: stderr chunk (state=3): >>><<< 44071 1727204678.72039: stdout chunk (state=3): >>><<< 44071 1727204678.72044: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204678.72083: _low_level_execute_command(): starting 44071 1727204678.72116: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204678.7204518-49291-8986699327253 `" 
&& echo ansible-tmp-1727204678.7204518-49291-8986699327253="` echo /root/.ansible/tmp/ansible-tmp-1727204678.7204518-49291-8986699327253 `" ) && sleep 0' 44071 1727204678.73072: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204678.73142: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204678.73150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204678.73162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204678.73246: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204678.73250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204678.73253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204678.73270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204678.73383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204678.75394: stdout chunk (state=3): >>>ansible-tmp-1727204678.7204518-49291-8986699327253=/root/.ansible/tmp/ansible-tmp-1727204678.7204518-49291-8986699327253 <<< 44071 1727204678.75486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204678.75561: stderr chunk (state=3): >>><<< 44071 1727204678.75572: stdout chunk (state=3): >>><<< 44071 1727204678.75637: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204678.7204518-49291-8986699327253=/root/.ansible/tmp/ansible-tmp-1727204678.7204518-49291-8986699327253 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204678.75658: 
variable 'ansible_module_compression' from source: unknown 44071 1727204678.75702: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 44071 1727204678.75738: variable 'ansible_facts' from source: unknown 44071 1727204678.75800: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204678.7204518-49291-8986699327253/AnsiballZ_service_facts.py 44071 1727204678.75921: Sending initial data 44071 1727204678.75925: Sent initial data (160 bytes) 44071 1727204678.76466: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44071 1727204678.76472: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204678.76525: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204678.76529: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204678.76537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204678.76605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204678.78294: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204678.78410: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204678.78501: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp6r1oi2ey /root/.ansible/tmp/ansible-tmp-1727204678.7204518-49291-8986699327253/AnsiballZ_service_facts.py <<< 44071 1727204678.78506: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204678.7204518-49291-8986699327253/AnsiballZ_service_facts.py" <<< 44071 1727204678.78558: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp6r1oi2ey" to remote "/root/.ansible/tmp/ansible-tmp-1727204678.7204518-49291-8986699327253/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204678.7204518-49291-8986699327253/AnsiballZ_service_facts.py" <<< 44071 1727204678.79707: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204678.79827: stderr chunk (state=3): >>><<< 44071 1727204678.79831: stdout chunk (state=3): >>><<< 44071 1727204678.79903: done transferring module to remote 44071 1727204678.79906: _low_level_execute_command(): starting 44071 1727204678.79909: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204678.7204518-49291-8986699327253/ /root/.ansible/tmp/ansible-tmp-1727204678.7204518-49291-8986699327253/AnsiballZ_service_facts.py && sleep 0' 44071 1727204678.80557: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204678.80562: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204678.80574: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204678.80669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204678.80683: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204678.80722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204678.82540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204678.82608: stderr chunk (state=3): >>><<< 44071 1727204678.82612: stdout chunk (state=3): >>><<< 44071 1727204678.82625: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
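With the controller-side temp copy of the wrapped module in hand, the next two steps in the trace are an SFTP "put" of AnsiballZ_service_facts.py into the remote temp directory and a chmod u+x of both that directory and the script before it is run with the remote Python. A rough sketch of those two steps using stock OpenSSH tools from Python; the host, paths, and ControlPath socket below are placeholders, and this only illustrates the pattern, not Ansible's connection plugin:

import subprocess

# Placeholders standing in for the values seen in the trace.
host = "managed-node1"
control = "ControlPath=/tmp/ssh-mux-example"       # placeholder socket; reusing a master is what "auto-mux" shows
local = "/tmp/AnsiballZ_service_facts.py"          # local copy of the wrapped module
remote_dir = "/root/.ansible/tmp/ansible-tmp-example"
remote = f"{remote_dir}/AnsiballZ_service_facts.py"

# 1) Upload via sftp in batch mode (-b - reads the "put" command from stdin).
subprocess.run(
    ["sftp", "-o", control, "-b", "-", host],
    input=f"put {local} {remote}\n", text=True, check=True,
)

# 2) Make the temp directory and the script user-executable, mirroring the chmod in the trace.
subprocess.run(
    ["ssh", "-o", control, host, f"chmod u+x {remote_dir}/ {remote} && sleep 0"],
    check=True,
)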
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204678.82628: _low_level_execute_command(): starting 44071 1727204678.82636: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204678.7204518-49291-8986699327253/AnsiballZ_service_facts.py && sleep 0' 44071 1727204678.83273: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204678.83281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204678.83285: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204678.83341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204678.83443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204681.10345: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped<<< 44071 1727204681.10358: stdout chunk (state=3): >>>", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": 
"systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, <<< 44071 1727204681.10364: stdout chunk (state=3): >>>"systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.fre<<< 44071 1727204681.10371: stdout chunk (state=3): >>>edesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 44071 1727204681.11922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204681.12094: stderr chunk (state=3): >>><<< 44071 1727204681.12098: stdout chunk (state=3): >>><<< 44071 1727204681.12203: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, 
"systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", 
"status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": 
"systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
44071 1727204681.14749: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204678.7204518-49291-8986699327253/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204681.14754: _low_level_execute_command(): starting 44071 1727204681.14756: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204678.7204518-49291-8986699327253/ > /dev/null 2>&1 && sleep 0' 44071 1727204681.15925: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204681.15963: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204681.15972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204681.16058: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204681.16067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204681.16384: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204681.16579: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204681.16681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204681.18728: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204681.18943: stderr chunk (state=3): >>><<< 44071 1727204681.18946: stdout chunk (state=3): >>><<< 44071 1727204681.18989: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204681.18995: handler run complete 44071 1727204681.19478: variable 'ansible_facts' from source: unknown 44071 1727204681.20042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204681.21454: variable 'ansible_facts' from source: unknown 44071 1727204681.21839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204681.22429: attempt loop complete, returning result 44071 1727204681.22436: _execute() done 44071 1727204681.22439: dumping result to json 44071 1727204681.22686: done dumping result, returning 44071 1727204681.22696: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-c964-7471-00000000189c] 44071 1727204681.22702: sending task result for task 127b8e07-fff9-c964-7471-00000000189c 44071 1727204681.25424: done sending task result for task 127b8e07-fff9-c964-7471-00000000189c 44071 1727204681.25428: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204681.25509: no more pending results, returning what we have 44071 1727204681.25513: results queue empty 44071 1727204681.25514: checking for any_errors_fatal 44071 1727204681.25519: done checking for any_errors_fatal 44071 1727204681.25520: checking for max_fail_percentage 44071 1727204681.25522: done checking for max_fail_percentage 44071 1727204681.25523: checking to see if all hosts have failed and the running result is not ok 44071 1727204681.25524: done checking to see if all hosts have failed 44071 1727204681.25524: getting the remaining hosts for this loop 44071 1727204681.25526: done getting the remaining hosts for this loop 44071 1727204681.25530: getting the next task for host managed-node2 44071 1727204681.25537: done getting next task for host managed-node2 44071 1727204681.25541: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204681.25551: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204681.25568: getting variables 44071 1727204681.25570: in VariableManager get_vars() 44071 1727204681.25604: Calling all_inventory to load vars for managed-node2 44071 1727204681.25608: Calling groups_inventory to load vars for managed-node2 44071 1727204681.25610: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204681.25622: Calling all_plugins_play to load vars for managed-node2 44071 1727204681.25625: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204681.25628: Calling groups_plugins_play to load vars for managed-node2 44071 1727204681.28423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204681.30753: done with get_vars() 44071 1727204681.30802: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:04:41 -0400 (0:00:02.641) 0:01:33.627 ***** 44071 1727204681.31092: entering _queue_task() for managed-node2/package_facts 44071 1727204681.31781: worker is 1 (out of 1 available) 44071 1727204681.31799: exiting _queue_task() for managed-node2/package_facts 44071 1727204681.31932: done queuing things up, now waiting for results queue to drain 44071 1727204681.31935: waiting for pending results... 
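At this point the "Check which services are running" task has completed with its result censored because no_log was set, and the next task, "Check which packages are installed" (package_facts, task path roles/network/tasks/set_facts.yml:26), has been queued. A hedged sketch of what such a pair of fact-gathering tasks can look like (illustrative only, not the actual contents of set_facts.yml):

# Illustrative sketch; not the real roles/network/tasks/set_facts.yml.
- name: Check which services are running
  ansible.builtin.service_facts:
  no_log: true

- name: Check which packages are installed
  ansible.builtin.package_facts:
  when: ansible_distribution_major_version != '6'

no_log only suppresses what the output callback prints (hence the "censored" entry above); the gathered ansible_facts are still set on the host and remain available to the tasks that follow.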
44071 1727204681.32423: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204681.32580: in run() - task 127b8e07-fff9-c964-7471-00000000189d 44071 1727204681.32624: variable 'ansible_search_path' from source: unknown 44071 1727204681.32628: variable 'ansible_search_path' from source: unknown 44071 1727204681.32662: calling self._execute() 44071 1727204681.32809: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204681.32813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204681.32817: variable 'omit' from source: magic vars 44071 1727204681.33274: variable 'ansible_distribution_major_version' from source: facts 44071 1727204681.33297: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204681.33370: variable 'omit' from source: magic vars 44071 1727204681.33405: variable 'omit' from source: magic vars 44071 1727204681.33455: variable 'omit' from source: magic vars 44071 1727204681.33514: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204681.33560: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204681.33589: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204681.33620: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204681.33640: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204681.33679: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204681.33688: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204681.33695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204681.33925: Set connection var ansible_connection to ssh 44071 1727204681.33929: Set connection var ansible_timeout to 10 44071 1727204681.33933: Set connection var ansible_pipelining to False 44071 1727204681.33936: Set connection var ansible_shell_type to sh 44071 1727204681.33938: Set connection var ansible_shell_executable to /bin/sh 44071 1727204681.33940: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204681.33942: variable 'ansible_shell_executable' from source: unknown 44071 1727204681.33944: variable 'ansible_connection' from source: unknown 44071 1727204681.33946: variable 'ansible_module_compression' from source: unknown 44071 1727204681.33948: variable 'ansible_shell_type' from source: unknown 44071 1727204681.33950: variable 'ansible_shell_executable' from source: unknown 44071 1727204681.33951: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204681.33953: variable 'ansible_pipelining' from source: unknown 44071 1727204681.33955: variable 'ansible_timeout' from source: unknown 44071 1727204681.33957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204681.34181: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204681.34200: variable 'omit' from source: magic vars 44071 
1727204681.34209: starting attempt loop 44071 1727204681.34215: running the handler 44071 1727204681.34239: _low_level_execute_command(): starting 44071 1727204681.34256: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204681.35096: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204681.35178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204681.35200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204681.35235: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204681.35338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204681.37156: stdout chunk (state=3): >>>/root <<< 44071 1727204681.37427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204681.37674: stderr chunk (state=3): >>><<< 44071 1727204681.37678: stdout chunk (state=3): >>><<< 44071 1727204681.37682: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204681.37685: _low_level_execute_command(): starting 44071 1727204681.37688: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204681.3755126-49407-113020551367856 `" && echo ansible-tmp-1727204681.3755126-49407-113020551367856="` echo 
/root/.ansible/tmp/ansible-tmp-1727204681.3755126-49407-113020551367856 `" ) && sleep 0' 44071 1727204681.38342: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204681.38448: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204681.38486: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204681.38505: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204681.38530: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204681.38674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204681.40658: stdout chunk (state=3): >>>ansible-tmp-1727204681.3755126-49407-113020551367856=/root/.ansible/tmp/ansible-tmp-1727204681.3755126-49407-113020551367856 <<< 44071 1727204681.41288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204681.41293: stdout chunk (state=3): >>><<< 44071 1727204681.41296: stderr chunk (state=3): >>><<< 44071 1727204681.41300: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204681.3755126-49407-113020551367856=/root/.ansible/tmp/ansible-tmp-1727204681.3755126-49407-113020551367856 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204681.41302: variable 'ansible_module_compression' from source: unknown 44071 1727204681.41305: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 44071 
1727204681.41408: variable 'ansible_facts' from source: unknown 44071 1727204681.41812: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204681.3755126-49407-113020551367856/AnsiballZ_package_facts.py 44071 1727204681.42332: Sending initial data 44071 1727204681.42337: Sent initial data (162 bytes) 44071 1727204681.43562: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204681.43571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204681.43588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204681.43733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204681.43948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204681.44058: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204681.44107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204681.45727: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 44071 1727204681.45763: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204681.45829: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204681.45896: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp5b0jvpxa /root/.ansible/tmp/ansible-tmp-1727204681.3755126-49407-113020551367856/AnsiballZ_package_facts.py <<< 44071 1727204681.45908: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204681.3755126-49407-113020551367856/AnsiballZ_package_facts.py" <<< 44071 1727204681.45970: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 44071 1727204681.45986: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp5b0jvpxa" to remote "/root/.ansible/tmp/ansible-tmp-1727204681.3755126-49407-113020551367856/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204681.3755126-49407-113020551367856/AnsiballZ_package_facts.py" <<< 44071 1727204681.47401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204681.47412: stderr chunk (state=3): >>><<< 44071 1727204681.47415: stdout chunk (state=3): >>><<< 44071 1727204681.47448: done transferring module to remote 44071 1727204681.47459: _low_level_execute_command(): starting 44071 1727204681.47464: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204681.3755126-49407-113020551367856/ /root/.ansible/tmp/ansible-tmp-1727204681.3755126-49407-113020551367856/AnsiballZ_package_facts.py && sleep 0' 44071 1727204681.48183: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204681.48201: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204681.48217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204681.48237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204681.48253: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204681.48264: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204681.48281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204681.48299: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204681.48395: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204681.48427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204681.48541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204681.50377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204681.50442: stderr chunk (state=3): >>><<< 44071 1727204681.50446: stdout chunk (state=3): >>><<< 44071 1727204681.50458: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 
4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204681.50461: _low_level_execute_command(): starting 44071 1727204681.50468: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204681.3755126-49407-113020551367856/AnsiballZ_package_facts.py && sleep 0' 44071 1727204681.50946: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204681.50951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204681.50982: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204681.51049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204681.51057: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204681.51059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204681.51134: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204682.14279: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 44071 1727204682.14304: stdout chunk (state=3): >>>systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": 
[{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": 
[{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": 
"pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": 
[{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lib<<< 44071 1727204682.14318: stdout chunk (state=3): >>>xmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": 
[{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", 
"version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", 
"release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": 
"openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": 
"1.02", "release": "506.fc40", "epoch": 0, "arch": "noarc<<< 44071 1727204682.14335: stdout chunk (state=3): >>>h", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": 
[{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", 
"version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoc<<< 44071 1727204682.14342: stdout chunk (state=3): >>>h": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", 
"version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "s<<< 44071 1727204682.14352: stdout chunk (state=3): >>>ource": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 44071 1727204682.16319: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204682.16324: stdout chunk (state=3): >>><<< 44071 1727204682.16327: stderr chunk (state=3): >>><<< 44071 1727204682.16587: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": 
"libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", 
"release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": 
[{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": 
"3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", 
"version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", 
"version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": 
"1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": 
"wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": 
"python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204682.19948: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204681.3755126-49407-113020551367856/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204682.19984: _low_level_execute_command(): starting 44071 1727204682.19997: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204681.3755126-49407-113020551367856/ > /dev/null 2>&1 && sleep 0' 44071 1727204682.20903: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204682.20909: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 
44071 1727204682.20947: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204682.21050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204682.23123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204682.23127: stdout chunk (state=3): >>><<< 44071 1727204682.23130: stderr chunk (state=3): >>><<< 44071 1727204682.23154: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204682.23174: handler run complete 44071 1727204682.24534: variable 'ansible_facts' from source: unknown 44071 1727204682.25286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204682.28308: variable 'ansible_facts' from source: unknown 44071 1727204682.29099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204682.30157: attempt loop complete, returning result 44071 1727204682.30203: _execute() done 44071 1727204682.30212: dumping result to json 44071 1727204682.30572: done dumping result, returning 44071 1727204682.30592: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-c964-7471-00000000189d] 44071 1727204682.30604: sending task result for task 127b8e07-fff9-c964-7471-00000000189d 44071 1727204682.34848: done sending task result for task 127b8e07-fff9-c964-7471-00000000189d 44071 1727204682.34852: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204682.35036: no more pending results, returning what we have 44071 1727204682.35040: results queue empty 44071 1727204682.35041: checking for any_errors_fatal 44071 1727204682.35046: done checking for any_errors_fatal 44071 1727204682.35047: checking for max_fail_percentage 44071 1727204682.35049: done checking for max_fail_percentage 44071 1727204682.35050: checking to see if all hosts have failed and the running result is not ok 44071 1727204682.35050: done checking to see if all hosts have failed 44071 1727204682.35051: getting the remaining hosts for this loop 44071 1727204682.35053: done getting the remaining hosts for this loop 
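The _execute_module call above ran package_facts with module_args {"manager": ["auto"], "strategy": "first"} and with no_log enabled, which is why the task result is reported only as "censored". The role's actual task file is not reproduced in this log, so the following is just a minimal sketch of a task that would produce an equivalent invocation:

# minimal sketch (assumed, not copied from the role): a package_facts task whose
# result is hidden by no_log, matching the "censored" output shown above
- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto        # log shows module_args manager: ["auto"]; strategy "first" is the default
  no_log: true           # causes "the output has been hidden due to ... 'no_log: true'" in the result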
44071 1727204682.35057: getting the next task for host managed-node2 44071 1727204682.35073: done getting next task for host managed-node2 44071 1727204682.35078: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204682.35085: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204682.35100: getting variables 44071 1727204682.35102: in VariableManager get_vars() 44071 1727204682.35139: Calling all_inventory to load vars for managed-node2 44071 1727204682.35143: Calling groups_inventory to load vars for managed-node2 44071 1727204682.35145: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204682.35156: Calling all_plugins_play to load vars for managed-node2 44071 1727204682.35159: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204682.35163: Calling groups_plugins_play to load vars for managed-node2 44071 1727204682.37649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204682.41569: done with get_vars() 44071 1727204682.41615: done getting variables 44071 1727204682.41699: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:04:42 -0400 (0:00:01.106) 0:01:34.733 ***** 44071 1727204682.41748: entering _queue_task() for managed-node2/debug 44071 1727204682.42219: worker is 1 (out of 1 available) 44071 1727204682.42355: exiting _queue_task() for managed-node2/debug 44071 1727204682.42371: done queuing things up, now waiting for results queue to drain 44071 1727204682.42373: waiting for pending results... 
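The task being queued here is the role's debug task at roles/network/tasks/main.yml:7. Based only on the result printed further below ("Using network provider: nm") and on the network_provider variable coming from set_fact, a plausible sketch of that task (not a verbatim copy of the role) is:

# illustrative sketch of the "Print network provider" debug task; the exact
# wording of msg is assumed from the MSG line in the result below
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"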
44071 1727204682.42803: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204682.43047: in run() - task 127b8e07-fff9-c964-7471-00000000183b 44071 1727204682.43140: variable 'ansible_search_path' from source: unknown 44071 1727204682.43149: variable 'ansible_search_path' from source: unknown 44071 1727204682.43277: calling self._execute() 44071 1727204682.43771: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204682.43778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204682.43782: variable 'omit' from source: magic vars 44071 1727204682.44376: variable 'ansible_distribution_major_version' from source: facts 44071 1727204682.44402: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204682.44416: variable 'omit' from source: magic vars 44071 1727204682.44502: variable 'omit' from source: magic vars 44071 1727204682.44634: variable 'network_provider' from source: set_fact 44071 1727204682.44672: variable 'omit' from source: magic vars 44071 1727204682.44725: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204682.44780: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204682.44812: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204682.44838: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204682.44857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204682.44907: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204682.44994: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204682.45000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204682.45063: Set connection var ansible_connection to ssh 44071 1727204682.45079: Set connection var ansible_timeout to 10 44071 1727204682.45103: Set connection var ansible_pipelining to False 44071 1727204682.45106: Set connection var ansible_shell_type to sh 44071 1727204682.45213: Set connection var ansible_shell_executable to /bin/sh 44071 1727204682.45217: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204682.45220: variable 'ansible_shell_executable' from source: unknown 44071 1727204682.45223: variable 'ansible_connection' from source: unknown 44071 1727204682.45225: variable 'ansible_module_compression' from source: unknown 44071 1727204682.45227: variable 'ansible_shell_type' from source: unknown 44071 1727204682.45229: variable 'ansible_shell_executable' from source: unknown 44071 1727204682.45234: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204682.45236: variable 'ansible_pipelining' from source: unknown 44071 1727204682.45239: variable 'ansible_timeout' from source: unknown 44071 1727204682.45241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204682.45403: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 44071 1727204682.45465: variable 'omit' from source: magic vars 44071 1727204682.45469: starting attempt loop 44071 1727204682.45474: running the handler 44071 1727204682.45517: handler run complete 44071 1727204682.45549: attempt loop complete, returning result 44071 1727204682.45561: _execute() done 44071 1727204682.45650: dumping result to json 44071 1727204682.45654: done dumping result, returning 44071 1727204682.45657: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-c964-7471-00000000183b] 44071 1727204682.45659: sending task result for task 127b8e07-fff9-c964-7471-00000000183b ok: [managed-node2] => {} MSG: Using network provider: nm 44071 1727204682.45949: no more pending results, returning what we have 44071 1727204682.45954: results queue empty 44071 1727204682.45954: checking for any_errors_fatal 44071 1727204682.45969: done checking for any_errors_fatal 44071 1727204682.45970: checking for max_fail_percentage 44071 1727204682.45972: done checking for max_fail_percentage 44071 1727204682.45973: checking to see if all hosts have failed and the running result is not ok 44071 1727204682.45974: done checking to see if all hosts have failed 44071 1727204682.45974: getting the remaining hosts for this loop 44071 1727204682.45977: done getting the remaining hosts for this loop 44071 1727204682.45982: getting the next task for host managed-node2 44071 1727204682.45993: done getting next task for host managed-node2 44071 1727204682.45998: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204682.46005: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204682.46021: getting variables 44071 1727204682.46023: in VariableManager get_vars() 44071 1727204682.46198: Calling all_inventory to load vars for managed-node2 44071 1727204682.46201: Calling groups_inventory to load vars for managed-node2 44071 1727204682.46203: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204682.46212: done sending task result for task 127b8e07-fff9-c964-7471-00000000183b 44071 1727204682.46215: WORKER PROCESS EXITING 44071 1727204682.46226: Calling all_plugins_play to load vars for managed-node2 44071 1727204682.46229: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204682.46235: Calling groups_plugins_play to load vars for managed-node2 44071 1727204682.49209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204682.52055: done with get_vars() 44071 1727204682.52104: done getting variables 44071 1727204682.52483: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:04:42 -0400 (0:00:00.107) 0:01:34.841 ***** 44071 1727204682.52541: entering _queue_task() for managed-node2/fail 44071 1727204682.53428: worker is 1 (out of 1 available) 44071 1727204682.53448: exiting _queue_task() for managed-node2/fail 44071 1727204682.53463: done queuing things up, now waiting for results queue to drain 44071 1727204682.53771: waiting for pending results... 
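This task and the following one (main.yml:11 and main.yml:18) are both skipped below with false_condition "network_state != {}", i.e. they only fire when the role's network_state variable is actually populated. Their exact bodies are not in this log; a hedged sketch of the general shape, with the failure message and any additional conditions assumed, is:

# hedged sketch -- a guard task that aborts when network_state is used in an
# unsupported situation; only the "network_state != {}" condition is confirmed
# by the log, the fail message and any further conditions are assumptions
- name: Abort applying the network state configuration if it is not supported here
  ansible.builtin.fail:
    msg: Applying the network state configuration is not supported in this setup.
  when: network_state != {}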
44071 1727204682.54152: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204682.54568: in run() - task 127b8e07-fff9-c964-7471-00000000183c 44071 1727204682.54842: variable 'ansible_search_path' from source: unknown 44071 1727204682.54847: variable 'ansible_search_path' from source: unknown 44071 1727204682.54850: calling self._execute() 44071 1727204682.54940: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204682.55084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204682.55168: variable 'omit' from source: magic vars 44071 1727204682.56126: variable 'ansible_distribution_major_version' from source: facts 44071 1727204682.56196: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204682.56723: variable 'network_state' from source: role '' defaults 44071 1727204682.56728: Evaluated conditional (network_state != {}): False 44071 1727204682.56730: when evaluation is False, skipping this task 44071 1727204682.56736: _execute() done 44071 1727204682.56739: dumping result to json 44071 1727204682.56742: done dumping result, returning 44071 1727204682.56744: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-c964-7471-00000000183c] 44071 1727204682.56747: sending task result for task 127b8e07-fff9-c964-7471-00000000183c skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204682.57104: no more pending results, returning what we have 44071 1727204682.57109: results queue empty 44071 1727204682.57111: checking for any_errors_fatal 44071 1727204682.57118: done checking for any_errors_fatal 44071 1727204682.57119: checking for max_fail_percentage 44071 1727204682.57121: done checking for max_fail_percentage 44071 1727204682.57122: checking to see if all hosts have failed and the running result is not ok 44071 1727204682.57123: done checking to see if all hosts have failed 44071 1727204682.57123: getting the remaining hosts for this loop 44071 1727204682.57125: done getting the remaining hosts for this loop 44071 1727204682.57130: getting the next task for host managed-node2 44071 1727204682.57144: done getting next task for host managed-node2 44071 1727204682.57150: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204682.57157: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204682.57190: getting variables 44071 1727204682.57193: in VariableManager get_vars() 44071 1727204682.57253: Calling all_inventory to load vars for managed-node2 44071 1727204682.57257: Calling groups_inventory to load vars for managed-node2 44071 1727204682.57260: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204682.57499: Calling all_plugins_play to load vars for managed-node2 44071 1727204682.57504: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204682.57509: Calling groups_plugins_play to load vars for managed-node2 44071 1727204682.58184: done sending task result for task 127b8e07-fff9-c964-7471-00000000183c 44071 1727204682.58189: WORKER PROCESS EXITING 44071 1727204682.79461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204682.83884: done with get_vars() 44071 1727204682.83998: done getting variables 44071 1727204682.84069: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:04:42 -0400 (0:00:00.315) 0:01:35.157 ***** 44071 1727204682.84105: entering _queue_task() for managed-node2/fail 44071 1727204682.84720: worker is 1 (out of 1 available) 44071 1727204682.84738: exiting _queue_task() for managed-node2/fail 44071 1727204682.84752: done queuing things up, now waiting for results queue to drain 44071 1727204682.84754: waiting for pending results... 
44071 1727204682.85143: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204682.85441: in run() - task 127b8e07-fff9-c964-7471-00000000183d 44071 1727204682.85446: variable 'ansible_search_path' from source: unknown 44071 1727204682.85448: variable 'ansible_search_path' from source: unknown 44071 1727204682.85462: calling self._execute() 44071 1727204682.85608: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204682.85626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204682.85651: variable 'omit' from source: magic vars 44071 1727204682.86193: variable 'ansible_distribution_major_version' from source: facts 44071 1727204682.86199: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204682.86326: variable 'network_state' from source: role '' defaults 44071 1727204682.86350: Evaluated conditional (network_state != {}): False 44071 1727204682.86358: when evaluation is False, skipping this task 44071 1727204682.86364: _execute() done 44071 1727204682.86375: dumping result to json 44071 1727204682.86382: done dumping result, returning 44071 1727204682.86395: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-c964-7471-00000000183d] 44071 1727204682.86434: sending task result for task 127b8e07-fff9-c964-7471-00000000183d skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204682.86638: no more pending results, returning what we have 44071 1727204682.86644: results queue empty 44071 1727204682.86645: checking for any_errors_fatal 44071 1727204682.86664: done checking for any_errors_fatal 44071 1727204682.86870: checking for max_fail_percentage 44071 1727204682.86873: done checking for max_fail_percentage 44071 1727204682.86874: checking to see if all hosts have failed and the running result is not ok 44071 1727204682.86875: done checking to see if all hosts have failed 44071 1727204682.86876: getting the remaining hosts for this loop 44071 1727204682.86880: done getting the remaining hosts for this loop 44071 1727204682.86885: getting the next task for host managed-node2 44071 1727204682.86897: done getting next task for host managed-node2 44071 1727204682.86902: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204682.86908: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204682.86934: getting variables 44071 1727204682.86936: in VariableManager get_vars() 44071 1727204682.87002: Calling all_inventory to load vars for managed-node2 44071 1727204682.87006: Calling groups_inventory to load vars for managed-node2 44071 1727204682.87008: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204682.87085: done sending task result for task 127b8e07-fff9-c964-7471-00000000183d 44071 1727204682.87090: WORKER PROCESS EXITING 44071 1727204682.87104: Calling all_plugins_play to load vars for managed-node2 44071 1727204682.87108: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204682.87111: Calling groups_plugins_play to load vars for managed-node2 44071 1727204682.90226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204682.93802: done with get_vars() 44071 1727204682.93911: done getting variables 44071 1727204682.94080: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:04:42 -0400 (0:00:00.100) 0:01:35.258 ***** 44071 1727204682.94184: entering _queue_task() for managed-node2/fail 44071 1727204682.95199: worker is 1 (out of 1 available) 44071 1727204682.95216: exiting _queue_task() for managed-node2/fail 44071 1727204682.95230: done queuing things up, now waiting for results queue to drain 44071 1727204682.95234: waiting for pending results... 
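For the teaming guard at main.yml:25 the log below shows two conditionals being evaluated: "ansible_distribution_major_version | int > 9" (True on this Fedora 40 host) and "ansible_distribution in __network_rh_distros" (False, which is why the task is skipped). A sketch reflecting just those two confirmed conditions, with the failure message and any check on team interfaces in the requested configuration left out as assumptions:

# sketch based only on the conditionals evaluated in the log; the fail message
# is assumed, and the role may apply additional conditions not visible here
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on EL10 or later.
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros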
44071 1727204682.95928: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204682.96421: in run() - task 127b8e07-fff9-c964-7471-00000000183e 44071 1727204682.96474: variable 'ansible_search_path' from source: unknown 44071 1727204682.96507: variable 'ansible_search_path' from source: unknown 44071 1727204682.96619: calling self._execute() 44071 1727204682.96941: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204682.96945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204682.96959: variable 'omit' from source: magic vars 44071 1727204682.98007: variable 'ansible_distribution_major_version' from source: facts 44071 1727204682.98081: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204682.98655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204683.06228: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204683.06547: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204683.06595: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204683.06634: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204683.06826: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204683.06960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204683.06997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204683.07025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204683.07070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204683.07291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204683.07423: variable 'ansible_distribution_major_version' from source: facts 44071 1727204683.07441: Evaluated conditional (ansible_distribution_major_version | int > 9): True 44071 1727204683.07906: variable 'ansible_distribution' from source: facts 44071 1727204683.07911: variable '__network_rh_distros' from source: role '' defaults 44071 1727204683.07926: Evaluated conditional (ansible_distribution in __network_rh_distros): False 44071 1727204683.08081: when evaluation is False, skipping this task 44071 1727204683.08088: _execute() done 44071 1727204683.08092: dumping result to json 44071 1727204683.08095: done dumping result, returning 44071 1727204683.08098: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-c964-7471-00000000183e] 44071 1727204683.08100: sending task result for task 127b8e07-fff9-c964-7471-00000000183e 44071 1727204683.08875: done sending task result for task 127b8e07-fff9-c964-7471-00000000183e 44071 1727204683.08885: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 44071 1727204683.09062: no more pending results, returning what we have 44071 1727204683.09069: results queue empty 44071 1727204683.09071: checking for any_errors_fatal 44071 1727204683.09078: done checking for any_errors_fatal 44071 1727204683.09078: checking for max_fail_percentage 44071 1727204683.09080: done checking for max_fail_percentage 44071 1727204683.09083: checking to see if all hosts have failed and the running result is not ok 44071 1727204683.09084: done checking to see if all hosts have failed 44071 1727204683.09087: getting the remaining hosts for this loop 44071 1727204683.09090: done getting the remaining hosts for this loop 44071 1727204683.09095: getting the next task for host managed-node2 44071 1727204683.09104: done getting next task for host managed-node2 44071 1727204683.09111: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204683.09117: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204683.09139: getting variables 44071 1727204683.09141: in VariableManager get_vars() 44071 1727204683.09418: Calling all_inventory to load vars for managed-node2 44071 1727204683.09421: Calling groups_inventory to load vars for managed-node2 44071 1727204683.09424: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204683.09437: Calling all_plugins_play to load vars for managed-node2 44071 1727204683.09440: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204683.09443: Calling groups_plugins_play to load vars for managed-node2 44071 1727204683.19096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204683.26489: done with get_vars() 44071 1727204683.26542: done getting variables 44071 1727204683.26774: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:04:43 -0400 (0:00:00.326) 0:01:35.584 ***** 44071 1727204683.26817: entering _queue_task() for managed-node2/dnf 44071 1727204683.28251: worker is 1 (out of 1 available) 44071 1727204683.28599: exiting _queue_task() for managed-node2/dnf 44071 1727204683.28614: done queuing things up, now waiting for results queue to drain 44071 1727204683.28615: waiting for pending results... 
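The "Abort applying teaming configuration if the system version of the managed host is EL10 or later" task above skips even though the EL10-or-later version check passes, because the second condition, ansible_distribution in __network_rh_distros, evaluates to False on this host; when conditions are AND-ed, and a False result produces a skip rather than a failure, so the abort never fires. A minimal sketch of a guard task with that shape, assuming ansible.builtin.fail as the abort mechanism (the module actually used by the role is not shown in this part of the trace):

    # Hypothetical reconstruction of the EL10-or-later teaming guard; the module and
    # message are assumptions, the two conditions are taken from the trace above.
    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Support for teaming ends with EL10, so this configuration cannot be applied.
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros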
44071 1727204683.29288: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204683.29863: in run() - task 127b8e07-fff9-c964-7471-00000000183f 44071 1727204683.29871: variable 'ansible_search_path' from source: unknown 44071 1727204683.29874: variable 'ansible_search_path' from source: unknown 44071 1727204683.30000: calling self._execute() 44071 1727204683.30373: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204683.30377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204683.30380: variable 'omit' from source: magic vars 44071 1727204683.31562: variable 'ansible_distribution_major_version' from source: facts 44071 1727204683.31590: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204683.32378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204683.38531: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204683.38850: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204683.39040: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204683.39208: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204683.39276: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204683.39512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204683.39674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204683.39716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204683.39767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204683.39836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204683.40320: variable 'ansible_distribution' from source: facts 44071 1727204683.40332: variable 'ansible_distribution_major_version' from source: facts 44071 1727204683.40350: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 44071 1727204683.40692: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204683.41072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204683.41193: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204683.41270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204683.41408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204683.41416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204683.41468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204683.41627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204683.41735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204683.41792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204683.41856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204683.41917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204683.42079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204683.42118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204683.42180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204683.42245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204683.42534: variable 'network_connections' from source: include params 44071 1727204683.42558: variable 'interface' from source: play vars 44071 1727204683.42653: variable 'interface' from source: play vars 44071 1727204683.42760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204683.42982: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204683.43517: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204683.43573: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204683.43717: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204683.43721: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204683.43748: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204683.43793: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204683.43825: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204683.43933: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204683.44243: variable 'network_connections' from source: include params 44071 1727204683.44260: variable 'interface' from source: play vars 44071 1727204683.44391: variable 'interface' from source: play vars 44071 1727204683.44449: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204683.44474: when evaluation is False, skipping this task 44071 1727204683.44500: _execute() done 44071 1727204683.44548: dumping result to json 44071 1727204683.44590: done dumping result, returning 44071 1727204683.44715: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-00000000183f] 44071 1727204683.44718: sending task result for task 127b8e07-fff9-c964-7471-00000000183f 44071 1727204683.44801: done sending task result for task 127b8e07-fff9-c964-7471-00000000183f 44071 1727204683.44804: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204683.44902: no more pending results, returning what we have 44071 1727204683.44907: results queue empty 44071 1727204683.44908: checking for any_errors_fatal 44071 1727204683.44918: done checking for any_errors_fatal 44071 1727204683.44919: checking for max_fail_percentage 44071 1727204683.44921: done checking for max_fail_percentage 44071 1727204683.44922: checking to see if all hosts have failed and the running result is not ok 44071 1727204683.44923: done checking to see if all hosts have failed 44071 1727204683.44923: getting the remaining hosts for this loop 44071 1727204683.44925: done getting the remaining hosts for this loop 44071 1727204683.44930: getting the next task for host managed-node2 44071 1727204683.44940: done getting next task for host managed-node2 44071 1727204683.44945: ^ task is: TASK: fedora.linux_system_roles.network : Check if 
updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204683.44950: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204683.44977: getting variables 44071 1727204683.44979: in VariableManager get_vars() 44071 1727204683.45026: Calling all_inventory to load vars for managed-node2 44071 1727204683.45029: Calling groups_inventory to load vars for managed-node2 44071 1727204683.45031: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204683.45044: Calling all_plugins_play to load vars for managed-node2 44071 1727204683.45048: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204683.45051: Calling groups_plugins_play to load vars for managed-node2 44071 1727204683.49334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204683.54828: done with get_vars() 44071 1727204683.54954: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204683.55131: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:04:43 -0400 (0:00:00.283) 0:01:35.868 ***** 44071 1727204683.55172: entering _queue_task() for managed-node2/yum 44071 1727204683.55833: worker is 1 (out of 1 available) 44071 1727204683.55848: exiting _queue_task() for managed-node2/yum 44071 1727204683.56022: done queuing things up, now waiting for results queue to drain 44071 1727204683.56024: waiting for pending results... 
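The DNF check at roles/network/tasks/main.yml:36 is skipped for a different reason: neither __network_wireless_connections_defined nor __network_team_connections_defined holds for the requested connection profile. One way such a check could look with the dnf action module the log loads, assuming a check-mode, latest-state query against network_packages (a sketch, not the role's actual implementation):

    # Hypothetical sketch: query DNF in check mode for pending updates to the
    # role's network packages when wireless or team profiles are requested.
    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"
        state: latest
      check_mode: true
      when: __network_wireless_connections_defined or __network_team_connections_defined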
44071 1727204683.56529: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204683.56575: in run() - task 127b8e07-fff9-c964-7471-000000001840 44071 1727204683.56598: variable 'ansible_search_path' from source: unknown 44071 1727204683.56620: variable 'ansible_search_path' from source: unknown 44071 1727204683.56663: calling self._execute() 44071 1727204683.56792: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204683.56807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204683.56824: variable 'omit' from source: magic vars 44071 1727204683.57297: variable 'ansible_distribution_major_version' from source: facts 44071 1727204683.57319: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204683.57637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204683.61951: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204683.62168: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204683.62204: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204683.62325: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204683.62370: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204683.62480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204683.62522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204683.62556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204683.62621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204683.62673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204683.62772: variable 'ansible_distribution_major_version' from source: facts 44071 1727204683.62803: Evaluated conditional (ansible_distribution_major_version | int < 8): False 44071 1727204683.62870: when evaluation is False, skipping this task 44071 1727204683.62874: _execute() done 44071 1727204683.62876: dumping result to json 44071 1727204683.62878: done dumping result, returning 44071 1727204683.62881: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000001840] 44071 
1727204683.62884: sending task result for task 127b8e07-fff9-c964-7471-000000001840 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 44071 1727204683.63253: no more pending results, returning what we have 44071 1727204683.63258: results queue empty 44071 1727204683.63260: checking for any_errors_fatal 44071 1727204683.63271: done checking for any_errors_fatal 44071 1727204683.63272: checking for max_fail_percentage 44071 1727204683.63274: done checking for max_fail_percentage 44071 1727204683.63275: checking to see if all hosts have failed and the running result is not ok 44071 1727204683.63276: done checking to see if all hosts have failed 44071 1727204683.63277: getting the remaining hosts for this loop 44071 1727204683.63279: done getting the remaining hosts for this loop 44071 1727204683.63284: getting the next task for host managed-node2 44071 1727204683.63295: done getting next task for host managed-node2 44071 1727204683.63300: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204683.63306: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204683.63333: getting variables 44071 1727204683.63335: in VariableManager get_vars() 44071 1727204683.63823: Calling all_inventory to load vars for managed-node2 44071 1727204683.63827: Calling groups_inventory to load vars for managed-node2 44071 1727204683.63829: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204683.63841: Calling all_plugins_play to load vars for managed-node2 44071 1727204683.63844: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204683.63847: Calling groups_plugins_play to load vars for managed-node2 44071 1727204683.64574: done sending task result for task 127b8e07-fff9-c964-7471-000000001840 44071 1727204683.64579: WORKER PROCESS EXITING 44071 1727204683.67464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204683.70131: done with get_vars() 44071 1727204683.70293: done getting variables 44071 1727204683.70472: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:04:43 -0400 (0:00:00.153) 0:01:36.021 ***** 44071 1727204683.70516: entering _queue_task() for managed-node2/fail 44071 1727204683.71435: worker is 1 (out of 1 available) 44071 1727204683.71450: exiting _queue_task() for managed-node2/fail 44071 1727204683.71464: done queuing things up, now waiting for results queue to drain 44071 1727204683.71468: waiting for pending results... 
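The YUM counterpart at main.yml:48 (note the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" entry, yum being an alias on this control node) skips even earlier: ansible_distribution_major_version | int < 8 is False here, and the remaining conditions are never evaluated because when lists short-circuit on the first False. A sketch of how such an EL7-only variant might be gated; the wireless/team condition below is inferred from the task name, not read from the trace:

    # Hypothetical sketch: the YUM path would only apply to EL7-era hosts.
    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:
        name: "{{ network_packages }}"
        state: latest
      check_mode: true
      when:
        - ansible_distribution_major_version | int < 8
        - __network_wireless_connections_defined or __network_team_connections_defined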
44071 1727204683.71740: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204683.71928: in run() - task 127b8e07-fff9-c964-7471-000000001841 44071 1727204683.71956: variable 'ansible_search_path' from source: unknown 44071 1727204683.71967: variable 'ansible_search_path' from source: unknown 44071 1727204683.72012: calling self._execute() 44071 1727204683.72137: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204683.72170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204683.72175: variable 'omit' from source: magic vars 44071 1727204683.72623: variable 'ansible_distribution_major_version' from source: facts 44071 1727204683.72721: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204683.72785: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204683.73013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204683.76815: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204683.77037: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204683.77375: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204683.77380: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204683.77383: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204683.77580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204683.77744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204683.77804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204683.77994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204683.77998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204683.78128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204683.78235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204683.78426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204683.78429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204683.78432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204683.78574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204683.78605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204683.78671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204683.78798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204683.78820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204683.79164: variable 'network_connections' from source: include params 44071 1727204683.79311: variable 'interface' from source: play vars 44071 1727204683.79472: variable 'interface' from source: play vars 44071 1727204683.79686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204683.80129: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204683.80436: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204683.80652: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204683.80783: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204683.80904: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204683.81072: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204683.81150: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204683.81270: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204683.81416: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204683.82401: variable 'network_connections' 
from source: include params 44071 1727204683.82419: variable 'interface' from source: play vars 44071 1727204683.82585: variable 'interface' from source: play vars 44071 1727204683.82716: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204683.82729: when evaluation is False, skipping this task 44071 1727204683.82740: _execute() done 44071 1727204683.82872: dumping result to json 44071 1727204683.82876: done dumping result, returning 44071 1727204683.82879: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000001841] 44071 1727204683.82881: sending task result for task 127b8e07-fff9-c964-7471-000000001841 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204683.83138: no more pending results, returning what we have 44071 1727204683.83142: results queue empty 44071 1727204683.83143: checking for any_errors_fatal 44071 1727204683.83151: done checking for any_errors_fatal 44071 1727204683.83152: checking for max_fail_percentage 44071 1727204683.83154: done checking for max_fail_percentage 44071 1727204683.83155: checking to see if all hosts have failed and the running result is not ok 44071 1727204683.83156: done checking to see if all hosts have failed 44071 1727204683.83156: getting the remaining hosts for this loop 44071 1727204683.83159: done getting the remaining hosts for this loop 44071 1727204683.83164: getting the next task for host managed-node2 44071 1727204683.83178: done getting next task for host managed-node2 44071 1727204683.83183: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 44071 1727204683.83189: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204683.83215: getting variables 44071 1727204683.83217: in VariableManager get_vars() 44071 1727204683.83384: Calling all_inventory to load vars for managed-node2 44071 1727204683.83388: Calling groups_inventory to load vars for managed-node2 44071 1727204683.83392: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204683.83405: Calling all_plugins_play to load vars for managed-node2 44071 1727204683.83408: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204683.83411: Calling groups_plugins_play to load vars for managed-node2 44071 1727204683.84511: done sending task result for task 127b8e07-fff9-c964-7471-000000001841 44071 1727204683.84516: WORKER PROCESS EXITING 44071 1727204683.87235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204683.90846: done with get_vars() 44071 1727204683.90892: done getting variables 44071 1727204683.90970: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:04:43 -0400 (0:00:00.204) 0:01:36.226 ***** 44071 1727204683.91012: entering _queue_task() for managed-node2/package 44071 1727204683.91457: worker is 1 (out of 1 available) 44071 1727204683.91577: exiting _queue_task() for managed-node2/package 44071 1727204683.91591: done queuing things up, now waiting for results queue to drain 44071 1727204683.91593: waiting for pending results... 
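The consent task at main.yml:60 loads the fail action module, so when it does run it stops the play until the operator accepts a NetworkManager restart; here both __network_wireless_connections_defined and __network_team_connections_defined are false, so it is skipped. A minimal sketch of a task with that behaviour; the exact consent variable the role consults is not visible in this log, so it is omitted below:

    # Hypothetical sketch of a consent guard built on ansible.builtin.fail.
    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: NetworkManager must be restarted to handle wireless or team interfaces; re-run only if that restart is acceptable.
      when: __network_wireless_connections_defined or __network_team_connections_defined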
44071 1727204683.91856: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 44071 1727204683.92046: in run() - task 127b8e07-fff9-c964-7471-000000001842 44071 1727204683.92073: variable 'ansible_search_path' from source: unknown 44071 1727204683.92082: variable 'ansible_search_path' from source: unknown 44071 1727204683.92128: calling self._execute() 44071 1727204683.92251: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204683.92269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204683.92288: variable 'omit' from source: magic vars 44071 1727204683.93140: variable 'ansible_distribution_major_version' from source: facts 44071 1727204683.93193: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204683.93775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204683.94511: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204683.94544: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204683.94834: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204683.94909: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204683.95370: variable 'network_packages' from source: role '' defaults 44071 1727204683.95586: variable '__network_provider_setup' from source: role '' defaults 44071 1727204683.95599: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204683.95680: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204683.95908: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204683.96008: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204683.96478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204684.03399: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204684.03848: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204684.04374: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204684.04379: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204684.04382: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204684.05469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204684.05474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204684.05585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204684.05976: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204684.05981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204684.05983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204684.06031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204684.06373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204684.06377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204684.06379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204684.06994: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204684.07414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204684.07445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204684.07763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204684.07928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204684.07943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204684.08241: variable 'ansible_python' from source: facts 44071 1727204684.08262: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204684.08572: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204684.08716: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204684.09076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204684.09110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 44071 1727204684.09210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204684.09260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204684.09278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204684.09472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204684.09485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204684.09507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204684.09669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204684.09688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204684.09968: variable 'network_connections' from source: include params 44071 1727204684.10273: variable 'interface' from source: play vars 44071 1727204684.10302: variable 'interface' from source: play vars 44071 1727204684.10512: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204684.10597: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204684.10786: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204684.10829: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204684.10886: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204684.11725: variable 'network_connections' from source: include params 44071 1727204684.11729: variable 'interface' from source: play vars 44071 1727204684.11973: variable 'interface' from source: play vars 44071 1727204684.12173: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204684.12370: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204684.13274: variable 'network_connections' from source: include params 44071 
1727204684.13279: variable 'interface' from source: play vars 44071 1727204684.13340: variable 'interface' from source: play vars 44071 1727204684.13370: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204684.13775: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204684.15091: variable 'network_connections' from source: include params 44071 1727204684.15096: variable 'interface' from source: play vars 44071 1727204684.15294: variable 'interface' from source: play vars 44071 1727204684.15369: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204684.15745: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204684.15752: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204684.15826: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204684.17075: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204684.18564: variable 'network_connections' from source: include params 44071 1727204684.18621: variable 'interface' from source: play vars 44071 1727204684.18871: variable 'interface' from source: play vars 44071 1727204684.18875: variable 'ansible_distribution' from source: facts 44071 1727204684.18877: variable '__network_rh_distros' from source: role '' defaults 44071 1727204684.18879: variable 'ansible_distribution_major_version' from source: facts 44071 1727204684.18881: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204684.19473: variable 'ansible_distribution' from source: facts 44071 1727204684.19477: variable '__network_rh_distros' from source: role '' defaults 44071 1727204684.19479: variable 'ansible_distribution_major_version' from source: facts 44071 1727204684.19482: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204684.19978: variable 'ansible_distribution' from source: facts 44071 1727204684.19982: variable '__network_rh_distros' from source: role '' defaults 44071 1727204684.19985: variable 'ansible_distribution_major_version' from source: facts 44071 1727204684.20034: variable 'network_provider' from source: set_fact 44071 1727204684.20172: variable 'ansible_facts' from source: unknown 44071 1727204684.22402: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 44071 1727204684.22406: when evaluation is False, skipping this task 44071 1727204684.22409: _execute() done 44071 1727204684.22411: dumping result to json 44071 1727204684.22413: done dumping result, returning 44071 1727204684.22473: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-c964-7471-000000001842] 44071 1727204684.22476: sending task result for task 127b8e07-fff9-c964-7471-000000001842 44071 1727204684.23052: done sending task result for task 127b8e07-fff9-c964-7471-000000001842 44071 1727204684.23055: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 44071 1727204684.23150: no more pending results, returning what we have 44071 1727204684.23156: results queue empty 44071 1727204684.23157: checking for any_errors_fatal 44071 1727204684.23168: done checking for 
any_errors_fatal 44071 1727204684.23169: checking for max_fail_percentage 44071 1727204684.23171: done checking for max_fail_percentage 44071 1727204684.23172: checking to see if all hosts have failed and the running result is not ok 44071 1727204684.23173: done checking to see if all hosts have failed 44071 1727204684.23174: getting the remaining hosts for this loop 44071 1727204684.23176: done getting the remaining hosts for this loop 44071 1727204684.23181: getting the next task for host managed-node2 44071 1727204684.23190: done getting next task for host managed-node2 44071 1727204684.23201: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204684.23207: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204684.23231: getting variables 44071 1727204684.23233: in VariableManager get_vars() 44071 1727204684.23725: Calling all_inventory to load vars for managed-node2 44071 1727204684.23729: Calling groups_inventory to load vars for managed-node2 44071 1727204684.23732: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204684.23746: Calling all_plugins_play to load vars for managed-node2 44071 1727204684.23749: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204684.23753: Calling groups_plugins_play to load vars for managed-node2 44071 1727204684.29003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204684.35183: done with get_vars() 44071 1727204684.35238: done getting variables 44071 1727204684.35427: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:04:44 -0400 (0:00:00.444) 0:01:36.671 ***** 44071 1727204684.35471: entering _queue_task() for managed-node2/package 44071 1727204684.36430: worker is 1 (out of 1 available) 44071 1727204684.36444: exiting _queue_task() for managed-node2/package 44071 1727204684.36460: done queuing things up, now waiting for results queue to drain 44071 1727204684.36461: waiting for pending results... 
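The "Install packages" task at main.yml:73 is skipped because every entry in network_packages already appears in the package facts gathered earlier, so not network_packages is subset(ansible_facts.packages.keys()) evaluates to False. The condition, the package action module, and the network_packages variable all appear verbatim in the trace; only the state parameter in the sketch below is an assumption:

    # Sketch of an idempotent install gated on the gathered package facts.
    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())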
44071 1727204684.37188: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204684.37301: in run() - task 127b8e07-fff9-c964-7471-000000001843 44071 1727204684.37319: variable 'ansible_search_path' from source: unknown 44071 1727204684.37451: variable 'ansible_search_path' from source: unknown 44071 1727204684.37773: calling self._execute() 44071 1727204684.37808: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204684.37815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204684.37826: variable 'omit' from source: magic vars 44071 1727204684.38744: variable 'ansible_distribution_major_version' from source: facts 44071 1727204684.38879: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204684.39139: variable 'network_state' from source: role '' defaults 44071 1727204684.39187: Evaluated conditional (network_state != {}): False 44071 1727204684.39191: when evaluation is False, skipping this task 44071 1727204684.39194: _execute() done 44071 1727204684.39196: dumping result to json 44071 1727204684.39198: done dumping result, returning 44071 1727204684.39201: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-c964-7471-000000001843] 44071 1727204684.39301: sending task result for task 127b8e07-fff9-c964-7471-000000001843 44071 1727204684.39420: done sending task result for task 127b8e07-fff9-c964-7471-000000001843 44071 1727204684.39425: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204684.39484: no more pending results, returning what we have 44071 1727204684.39489: results queue empty 44071 1727204684.39491: checking for any_errors_fatal 44071 1727204684.39498: done checking for any_errors_fatal 44071 1727204684.39499: checking for max_fail_percentage 44071 1727204684.39501: done checking for max_fail_percentage 44071 1727204684.39502: checking to see if all hosts have failed and the running result is not ok 44071 1727204684.39505: done checking to see if all hosts have failed 44071 1727204684.39506: getting the remaining hosts for this loop 44071 1727204684.39508: done getting the remaining hosts for this loop 44071 1727204684.39513: getting the next task for host managed-node2 44071 1727204684.39525: done getting next task for host managed-node2 44071 1727204684.39529: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204684.39536: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204684.39572: getting variables 44071 1727204684.39574: in VariableManager get_vars() 44071 1727204684.39629: Calling all_inventory to load vars for managed-node2 44071 1727204684.39632: Calling groups_inventory to load vars for managed-node2 44071 1727204684.39634: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204684.39650: Calling all_plugins_play to load vars for managed-node2 44071 1727204684.39653: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204684.39656: Calling groups_plugins_play to load vars for managed-node2 44071 1727204684.43133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204684.48520: done with get_vars() 44071 1727204684.48569: done getting variables 44071 1727204684.48888: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:04:44 -0400 (0:00:00.135) 0:01:36.806 ***** 44071 1727204684.49040: entering _queue_task() for managed-node2/package 44071 1727204684.50058: worker is 1 (out of 1 available) 44071 1727204684.50190: exiting _queue_task() for managed-node2/package 44071 1727204684.50206: done queuing things up, now waiting for results queue to drain 44071 1727204684.50208: waiting for pending results... 
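The next task (main.yml:96) follows the same pattern and is skipped for the same reason. A comparable sketch, again assuming ansible.builtin.package and deriving the condition from the false_condition in the result below; the actual role source may differ:

  # Hypothetical reconstruction of main.yml:96 -- not the actual role source.
  - name: Install python3-libnmstate when using network_state variable
    ansible.builtin.package:
      name: python3-libnmstate
      state: present
    when: network_state != {}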
44071 1727204684.51050: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204684.51058: in run() - task 127b8e07-fff9-c964-7471-000000001844 44071 1727204684.51061: variable 'ansible_search_path' from source: unknown 44071 1727204684.51064: variable 'ansible_search_path' from source: unknown 44071 1727204684.51069: calling self._execute() 44071 1727204684.51109: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204684.51114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204684.51372: variable 'omit' from source: magic vars 44071 1727204684.52219: variable 'ansible_distribution_major_version' from source: facts 44071 1727204684.52245: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204684.52616: variable 'network_state' from source: role '' defaults 44071 1727204684.52629: Evaluated conditional (network_state != {}): False 44071 1727204684.52635: when evaluation is False, skipping this task 44071 1727204684.52639: _execute() done 44071 1727204684.52641: dumping result to json 44071 1727204684.52644: done dumping result, returning 44071 1727204684.52652: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-c964-7471-000000001844] 44071 1727204684.52997: sending task result for task 127b8e07-fff9-c964-7471-000000001844 44071 1727204684.53173: done sending task result for task 127b8e07-fff9-c964-7471-000000001844 44071 1727204684.53178: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204684.53315: no more pending results, returning what we have 44071 1727204684.53318: results queue empty 44071 1727204684.53319: checking for any_errors_fatal 44071 1727204684.53325: done checking for any_errors_fatal 44071 1727204684.53326: checking for max_fail_percentage 44071 1727204684.53329: done checking for max_fail_percentage 44071 1727204684.53330: checking to see if all hosts have failed and the running result is not ok 44071 1727204684.53330: done checking to see if all hosts have failed 44071 1727204684.53331: getting the remaining hosts for this loop 44071 1727204684.53332: done getting the remaining hosts for this loop 44071 1727204684.53337: getting the next task for host managed-node2 44071 1727204684.53345: done getting next task for host managed-node2 44071 1727204684.53350: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204684.53357: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204684.53386: getting variables 44071 1727204684.53388: in VariableManager get_vars() 44071 1727204684.53433: Calling all_inventory to load vars for managed-node2 44071 1727204684.53436: Calling groups_inventory to load vars for managed-node2 44071 1727204684.53438: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204684.53449: Calling all_plugins_play to load vars for managed-node2 44071 1727204684.53452: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204684.53456: Calling groups_plugins_play to load vars for managed-node2 44071 1727204684.57562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204684.61302: done with get_vars() 44071 1727204684.61586: done getting variables 44071 1727204684.61681: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:04:44 -0400 (0:00:00.126) 0:01:36.933 ***** 44071 1727204684.61723: entering _queue_task() for managed-node2/service 44071 1727204684.62734: worker is 1 (out of 1 available) 44071 1727204684.62861: exiting _queue_task() for managed-node2/service 44071 1727204684.62877: done queuing things up, now waiting for results queue to drain 44071 1727204684.62879: waiting for pending results... 
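The restart task queued here (main.yml:109) uses the service action plugin and, as the result below shows, is skipped because neither wireless nor team connections are defined for the single configured interface. A minimal sketch under those assumptions; everything beyond the task name, the service action, and the two role variables visible in the log is a guess:

  # Hypothetical reconstruction of main.yml:109 -- not the actual role source.
  - name: Restart NetworkManager due to wireless or team interfaces
    ansible.builtin.service:
      name: NetworkManager
      state: restarted
    when: __network_wireless_connections_defined or __network_team_connections_defined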
44071 1727204684.63241: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204684.63777: in run() - task 127b8e07-fff9-c964-7471-000000001845 44071 1727204684.63782: variable 'ansible_search_path' from source: unknown 44071 1727204684.63785: variable 'ansible_search_path' from source: unknown 44071 1727204684.63789: calling self._execute() 44071 1727204684.63911: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204684.63916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204684.63919: variable 'omit' from source: magic vars 44071 1727204684.65457: variable 'ansible_distribution_major_version' from source: facts 44071 1727204684.65461: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204684.65942: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204684.66636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204684.72613: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204684.72939: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204684.72986: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204684.73231: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204684.73335: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204684.73426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204684.73524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204684.73636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204684.73783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204684.73799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204684.73885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204684.74301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204684.74305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 44071 1727204684.74324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204684.74329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204684.74348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204684.74361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204684.74717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204684.74720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204684.74724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204684.75198: variable 'network_connections' from source: include params 44071 1727204684.75202: variable 'interface' from source: play vars 44071 1727204684.75404: variable 'interface' from source: play vars 44071 1727204684.75589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204684.76377: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204684.77142: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204684.77184: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204684.77337: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204684.77562: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204684.77592: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204684.77622: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204684.77895: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204684.78214: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204684.78708: variable 'network_connections' from source: include params 44071 1727204684.78760: variable 'interface' 
from source: play vars 44071 1727204684.78853: variable 'interface' from source: play vars 44071 1727204684.78993: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204684.79003: when evaluation is False, skipping this task 44071 1727204684.79010: _execute() done 44071 1727204684.79147: dumping result to json 44071 1727204684.79151: done dumping result, returning 44071 1727204684.79153: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000001845] 44071 1727204684.79156: sending task result for task 127b8e07-fff9-c964-7471-000000001845 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204684.79444: done sending task result for task 127b8e07-fff9-c964-7471-000000001845 44071 1727204684.79460: no more pending results, returning what we have 44071 1727204684.79468: results queue empty 44071 1727204684.79470: checking for any_errors_fatal 44071 1727204684.79478: done checking for any_errors_fatal 44071 1727204684.79479: checking for max_fail_percentage 44071 1727204684.79481: done checking for max_fail_percentage 44071 1727204684.79482: checking to see if all hosts have failed and the running result is not ok 44071 1727204684.79483: done checking to see if all hosts have failed 44071 1727204684.79483: getting the remaining hosts for this loop 44071 1727204684.79485: done getting the remaining hosts for this loop 44071 1727204684.79492: getting the next task for host managed-node2 44071 1727204684.79501: done getting next task for host managed-node2 44071 1727204684.79507: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204684.79512: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204684.79655: getting variables 44071 1727204684.79657: in VariableManager get_vars() 44071 1727204684.79700: Calling all_inventory to load vars for managed-node2 44071 1727204684.79703: Calling groups_inventory to load vars for managed-node2 44071 1727204684.79705: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204684.79713: WORKER PROCESS EXITING 44071 1727204684.79723: Calling all_plugins_play to load vars for managed-node2 44071 1727204684.79726: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204684.79729: Calling groups_plugins_play to load vars for managed-node2 44071 1727204684.81809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204684.85889: done with get_vars() 44071 1727204684.85941: done getting variables 44071 1727204684.86010: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:04:44 -0400 (0:00:00.243) 0:01:37.177 ***** 44071 1727204684.86057: entering _queue_task() for managed-node2/service 44071 1727204684.86540: worker is 1 (out of 1 available) 44071 1727204684.86556: exiting _queue_task() for managed-node2/service 44071 1727204684.86582: done queuing things up, now waiting for results queue to drain 44071 1727204684.86584: waiting for pending results... 
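Unlike the previous three tasks, the one queued here (main.yml:122) actually executes: the conditional (network_provider == "nm" or network_state != {}) evaluates True, the service action resolves network_service_name from the role defaults, and the systemd module is shipped to managed-node2 over the existing SSH ControlMaster session. A sketch of the task as it appears to be invoked, based on the evaluated conditional and the module_args echoed in the JSON result further below (name=NetworkManager, state=started, enabled=true); the indirection through network_service_name is an inference from the variable lookups in this log, not confirmed role source:

  # Reconstruction of main.yml:122 based on this log's module_args -- not the actual role source.
  - name: Enable and start NetworkManager
    ansible.builtin.service:
      name: "{{ network_service_name }}"   # resolves to NetworkManager on this host
      state: started
      enabled: true
    when: network_provider == "nm" or network_state != {}

Because the unit is already enabled and active, the module reports changed: false and simply returns the unit's systemd properties, which make up the long JSON blob that follows.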
44071 1727204684.86842: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204684.87034: in run() - task 127b8e07-fff9-c964-7471-000000001846 44071 1727204684.87056: variable 'ansible_search_path' from source: unknown 44071 1727204684.87064: variable 'ansible_search_path' from source: unknown 44071 1727204684.87235: calling self._execute() 44071 1727204684.87452: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204684.87457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204684.87461: variable 'omit' from source: magic vars 44071 1727204684.88084: variable 'ansible_distribution_major_version' from source: facts 44071 1727204684.88088: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204684.88275: variable 'network_provider' from source: set_fact 44071 1727204684.88300: variable 'network_state' from source: role '' defaults 44071 1727204684.88307: Evaluated conditional (network_provider == "nm" or network_state != {}): True 44071 1727204684.88318: variable 'omit' from source: magic vars 44071 1727204684.88449: variable 'omit' from source: magic vars 44071 1727204684.88481: variable 'network_service_name' from source: role '' defaults 44071 1727204684.88665: variable 'network_service_name' from source: role '' defaults 44071 1727204684.88704: variable '__network_provider_setup' from source: role '' defaults 44071 1727204684.88715: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204684.88796: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204684.88810: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204684.88891: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204684.89469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204684.94553: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204684.94886: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204684.94891: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204684.95002: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204684.95069: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204684.95306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204684.95355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204684.95393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204684.95449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 44071 1727204684.95475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204684.95539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204684.95572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204684.95608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204684.95690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204684.95694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204684.96024: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204684.96180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204684.96235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204684.96256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204684.96361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204684.96368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204684.96736: variable 'ansible_python' from source: facts 44071 1727204684.96864: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204684.97094: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204684.97306: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204684.97618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204684.97688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204684.97727: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204684.97859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204684.97882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204684.98180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204684.98192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204684.98195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204684.98197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204684.98247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204684.99016: variable 'network_connections' from source: include params 44071 1727204684.99031: variable 'interface' from source: play vars 44071 1727204684.99454: variable 'interface' from source: play vars 44071 1727204684.99672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204685.00030: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204685.00176: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204685.00327: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204685.00430: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204685.00533: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204685.00664: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204685.00860: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204685.00864: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204685.00870: variable '__network_wireless_connections_defined' from source: 
role '' defaults 44071 1727204685.01311: variable 'network_connections' from source: include params 44071 1727204685.01324: variable 'interface' from source: play vars 44071 1727204685.01424: variable 'interface' from source: play vars 44071 1727204685.01490: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204685.01636: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204685.02009: variable 'network_connections' from source: include params 44071 1727204685.02021: variable 'interface' from source: play vars 44071 1727204685.02187: variable 'interface' from source: play vars 44071 1727204685.02192: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204685.02297: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204685.02919: variable 'network_connections' from source: include params 44071 1727204685.02923: variable 'interface' from source: play vars 44071 1727204685.02994: variable 'interface' from source: play vars 44071 1727204685.03197: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204685.03274: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204685.03321: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204685.03422: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204685.03729: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204685.04806: variable 'network_connections' from source: include params 44071 1727204685.04822: variable 'interface' from source: play vars 44071 1727204685.04944: variable 'interface' from source: play vars 44071 1727204685.04959: variable 'ansible_distribution' from source: facts 44071 1727204685.04971: variable '__network_rh_distros' from source: role '' defaults 44071 1727204685.04981: variable 'ansible_distribution_major_version' from source: facts 44071 1727204685.05018: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204685.05550: variable 'ansible_distribution' from source: facts 44071 1727204685.05554: variable '__network_rh_distros' from source: role '' defaults 44071 1727204685.05556: variable 'ansible_distribution_major_version' from source: facts 44071 1727204685.05558: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204685.05975: variable 'ansible_distribution' from source: facts 44071 1727204685.06018: variable '__network_rh_distros' from source: role '' defaults 44071 1727204685.06063: variable 'ansible_distribution_major_version' from source: facts 44071 1727204685.06148: variable 'network_provider' from source: set_fact 44071 1727204685.06238: variable 'omit' from source: magic vars 44071 1727204685.06353: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204685.06384: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204685.06415: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204685.06451: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204685.06470: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204685.06543: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204685.06547: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204685.06550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204685.06698: Set connection var ansible_connection to ssh 44071 1727204685.06711: Set connection var ansible_timeout to 10 44071 1727204685.06721: Set connection var ansible_pipelining to False 44071 1727204685.06731: Set connection var ansible_shell_type to sh 44071 1727204685.06744: Set connection var ansible_shell_executable to /bin/sh 44071 1727204685.06775: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204685.06892: variable 'ansible_shell_executable' from source: unknown 44071 1727204685.06896: variable 'ansible_connection' from source: unknown 44071 1727204685.06898: variable 'ansible_module_compression' from source: unknown 44071 1727204685.06900: variable 'ansible_shell_type' from source: unknown 44071 1727204685.06902: variable 'ansible_shell_executable' from source: unknown 44071 1727204685.06904: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204685.06971: variable 'ansible_pipelining' from source: unknown 44071 1727204685.07079: variable 'ansible_timeout' from source: unknown 44071 1727204685.07082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204685.07207: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204685.07351: variable 'omit' from source: magic vars 44071 1727204685.07354: starting attempt loop 44071 1727204685.07356: running the handler 44071 1727204685.07496: variable 'ansible_facts' from source: unknown 44071 1727204685.09235: _low_level_execute_command(): starting 44071 1727204685.09258: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204685.10889: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204685.10997: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204685.11035: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204685.11054: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204685.11285: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204685.11900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204685.13761: stdout chunk (state=3): >>>/root <<< 44071 1727204685.14123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204685.14127: stdout chunk (state=3): >>><<< 44071 1727204685.14129: stderr chunk (state=3): >>><<< 44071 1727204685.14135: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204685.14138: _low_level_execute_command(): starting 44071 1727204685.14141: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204685.140195-49799-17233303803053 `" && echo ansible-tmp-1727204685.140195-49799-17233303803053="` echo /root/.ansible/tmp/ansible-tmp-1727204685.140195-49799-17233303803053 `" ) && sleep 0' 44071 1727204685.15576: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204685.15848: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204685.15852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204685.15940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204685.17931: stdout chunk (state=3): 
>>>ansible-tmp-1727204685.140195-49799-17233303803053=/root/.ansible/tmp/ansible-tmp-1727204685.140195-49799-17233303803053 <<< 44071 1727204685.18146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204685.18229: stderr chunk (state=3): >>><<< 44071 1727204685.18260: stdout chunk (state=3): >>><<< 44071 1727204685.18362: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204685.140195-49799-17233303803053=/root/.ansible/tmp/ansible-tmp-1727204685.140195-49799-17233303803053 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204685.18396: variable 'ansible_module_compression' from source: unknown 44071 1727204685.18671: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 44071 1727204685.18706: variable 'ansible_facts' from source: unknown 44071 1727204685.19108: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204685.140195-49799-17233303803053/AnsiballZ_systemd.py 44071 1727204685.19598: Sending initial data 44071 1727204685.19602: Sent initial data (154 bytes) 44071 1727204685.21328: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204685.21335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204685.21540: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204685.21749: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 
1727204685.21882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204685.23570: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 44071 1727204685.23724: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204685.23755: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204685.23848: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpffyvzfoa /root/.ansible/tmp/ansible-tmp-1727204685.140195-49799-17233303803053/AnsiballZ_systemd.py <<< 44071 1727204685.23852: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204685.140195-49799-17233303803053/AnsiballZ_systemd.py" <<< 44071 1727204685.23902: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpffyvzfoa" to remote "/root/.ansible/tmp/ansible-tmp-1727204685.140195-49799-17233303803053/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204685.140195-49799-17233303803053/AnsiballZ_systemd.py" <<< 44071 1727204685.27198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204685.27622: stderr chunk (state=3): >>><<< 44071 1727204685.27627: stdout chunk (state=3): >>><<< 44071 1727204685.27629: done transferring module to remote 44071 1727204685.27634: _low_level_execute_command(): starting 44071 1727204685.27637: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204685.140195-49799-17233303803053/ /root/.ansible/tmp/ansible-tmp-1727204685.140195-49799-17233303803053/AnsiballZ_systemd.py && sleep 0' 44071 1727204685.28438: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204685.28443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204685.28459: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 
1727204685.28510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204685.28544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204685.28554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204685.28774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204685.30880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204685.30884: stdout chunk (state=3): >>><<< 44071 1727204685.30887: stderr chunk (state=3): >>><<< 44071 1727204685.30889: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204685.30892: _low_level_execute_command(): starting 44071 1727204685.30894: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204685.140195-49799-17233303803053/AnsiballZ_systemd.py && sleep 0' 44071 1727204685.32136: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204685.32178: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204685.32197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204685.32376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204685.32495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204685.32610: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 44071 1727204685.65186: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4521984", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3513892864", "CPUUsageNSec": "1601802000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", 
"MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 44071 1727204685.66645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204685.66784: stderr chunk (state=3): >>><<< 44071 1727204685.67020: stdout chunk (state=3): >>><<< 44071 1727204685.67175: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4521984", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3513892864", "CPUUsageNSec": "1601802000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": 
"infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204685.67546: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204685.140195-49799-17233303803053/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204685.67613: _low_level_execute_command(): starting 44071 1727204685.67685: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204685.140195-49799-17233303803053/ > /dev/null 2>&1 && sleep 0' 44071 1727204685.69172: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204685.69333: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204685.69417: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204685.69474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204685.69606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204685.69683: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 44071 1727204685.71924: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204685.72011: stderr chunk (state=3): >>><<< 44071 1727204685.72016: stdout chunk (state=3): >>><<< 44071 1727204685.72019: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204685.72022: handler run complete 44071 1727204685.72385: attempt loop complete, returning result 44071 1727204685.72390: _execute() done 44071 1727204685.72393: dumping result to json 44071 1727204685.72395: done dumping result, returning 44071 1727204685.72397: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-c964-7471-000000001846] 44071 1727204685.72399: sending task result for task 127b8e07-fff9-c964-7471-000000001846 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204685.73807: no more pending results, returning what we have 44071 1727204685.73812: results queue empty 44071 1727204685.73813: checking for any_errors_fatal 44071 1727204685.73822: done checking for any_errors_fatal 44071 1727204685.73823: checking for max_fail_percentage 44071 1727204685.73824: done checking for max_fail_percentage 44071 1727204685.73825: checking to see if all hosts have failed and the running result is not ok 44071 1727204685.73826: done checking to see if all hosts have failed 44071 1727204685.73827: getting the remaining hosts for this loop 44071 1727204685.73829: done getting the remaining hosts for this loop 44071 1727204685.73833: getting the next task for host managed-node2 44071 1727204685.73843: done getting next task for host managed-node2 44071 1727204685.73847: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204685.73854: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204685.73872: getting variables 44071 1727204685.73874: in VariableManager get_vars() 44071 1727204685.73919: Calling all_inventory to load vars for managed-node2 44071 1727204685.73923: Calling groups_inventory to load vars for managed-node2 44071 1727204685.73925: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204685.73937: Calling all_plugins_play to load vars for managed-node2 44071 1727204685.73940: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204685.73944: Calling groups_plugins_play to load vars for managed-node2 44071 1727204685.75034: done sending task result for task 127b8e07-fff9-c964-7471-000000001846 44071 1727204685.75039: WORKER PROCESS EXITING 44071 1727204685.78020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204685.83312: done with get_vars() 44071 1727204685.83354: done getting variables 44071 1727204685.83471: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:04:45 -0400 (0:00:00.974) 0:01:38.151 ***** 44071 1727204685.83632: entering _queue_task() for managed-node2/service 44071 1727204685.84379: worker is 1 (out of 1 available) 44071 1727204685.84616: exiting _queue_task() for managed-node2/service 44071 1727204685.84631: done queuing things up, now waiting for results queue to drain 44071 1727204685.84632: waiting for pending results... 
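The result returned just above for the "Enable and start NetworkManager" task shows ansible.legacy.systemd invoked with module_args name=NetworkManager, state=started, enabled=true and reporting changed=false; the task output itself is censored because no_log was in effect. A minimal standalone task that would issue the same module call might look like the sketch below. This is an assumption for illustration only, not the role's own task file, which this log does not reproduce.

    - name: Enable and start NetworkManager
      ansible.builtin.systemd:        # the log shows the call resolving to ansible.legacy.systemd
        name: NetworkManager
        state: started
        enabled: true

The large status dictionary echoed in the stdout chunks is the set of systemd unit properties the module collects (the same properties systemctl show NetworkManager.service reports); changed=false here indicates the unit was already enabled and running.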
44071 1727204685.85279: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204685.85444: in run() - task 127b8e07-fff9-c964-7471-000000001847 44071 1727204685.85539: variable 'ansible_search_path' from source: unknown 44071 1727204685.85576: variable 'ansible_search_path' from source: unknown 44071 1727204685.85629: calling self._execute() 44071 1727204685.86020: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204685.86024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204685.86027: variable 'omit' from source: magic vars 44071 1727204685.86906: variable 'ansible_distribution_major_version' from source: facts 44071 1727204685.86910: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204685.87270: variable 'network_provider' from source: set_fact 44071 1727204685.87327: Evaluated conditional (network_provider == "nm"): True 44071 1727204685.87615: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204685.88101: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204685.88274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204685.90999: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204685.91091: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204685.91142: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204685.91192: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204685.91226: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204685.91342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204685.91388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204685.91420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204685.91476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204685.91496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204685.91553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204685.91591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 44071 1727204685.91622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204685.91716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204685.91738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204685.91806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204685.91843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204685.91912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204685.91950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204685.92011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204685.92348: variable 'network_connections' from source: include params 44071 1727204685.92391: variable 'interface' from source: play vars 44071 1727204685.92484: variable 'interface' from source: play vars 44071 1727204685.92585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204685.92785: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204685.92832: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204685.92873: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204685.92970: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204685.92974: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204685.93007: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204685.93039: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204685.93074: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
44071 1727204685.93139: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204685.93447: variable 'network_connections' from source: include params 44071 1727204685.93460: variable 'interface' from source: play vars 44071 1727204685.93570: variable 'interface' from source: play vars 44071 1727204685.93648: Evaluated conditional (__network_wpa_supplicant_required): False 44071 1727204685.93766: when evaluation is False, skipping this task 44071 1727204685.93770: _execute() done 44071 1727204685.93776: dumping result to json 44071 1727204685.93779: done dumping result, returning 44071 1727204685.93781: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-c964-7471-000000001847] 44071 1727204685.93792: sending task result for task 127b8e07-fff9-c964-7471-000000001847 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 44071 1727204685.94020: no more pending results, returning what we have 44071 1727204685.94024: results queue empty 44071 1727204685.94025: checking for any_errors_fatal 44071 1727204685.94050: done checking for any_errors_fatal 44071 1727204685.94052: checking for max_fail_percentage 44071 1727204685.94053: done checking for max_fail_percentage 44071 1727204685.94055: checking to see if all hosts have failed and the running result is not ok 44071 1727204685.94056: done checking to see if all hosts have failed 44071 1727204685.94056: getting the remaining hosts for this loop 44071 1727204685.94058: done getting the remaining hosts for this loop 44071 1727204685.94064: getting the next task for host managed-node2 44071 1727204685.94095: done getting next task for host managed-node2 44071 1727204685.94104: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204685.94110: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204685.94134: getting variables 44071 1727204685.94137: in VariableManager get_vars() 44071 1727204685.94300: Calling all_inventory to load vars for managed-node2 44071 1727204685.94303: Calling groups_inventory to load vars for managed-node2 44071 1727204685.94306: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204685.94477: Calling all_plugins_play to load vars for managed-node2 44071 1727204685.94481: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204685.94484: Calling groups_plugins_play to load vars for managed-node2 44071 1727204685.95183: done sending task result for task 127b8e07-fff9-c964-7471-000000001847 44071 1727204685.95187: WORKER PROCESS EXITING 44071 1727204685.96653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204685.99645: done with get_vars() 44071 1727204685.99695: done getting variables 44071 1727204685.99771: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:04:45 -0400 (0:00:00.162) 0:01:38.314 ***** 44071 1727204685.99813: entering _queue_task() for managed-node2/service 44071 1727204686.00259: worker is 1 (out of 1 available) 44071 1727204686.00423: exiting _queue_task() for managed-node2/service 44071 1727204686.00439: done queuing things up, now waiting for results queue to drain 44071 1727204686.00440: waiting for pending results... 
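The entries above show the "Enable and start wpa_supplicant" task being skipped: network_provider == "nm" evaluates True, but __network_wpa_supplicant_required evaluates False, so the guard fails. A task guarded that way could look like the following sketch; only the guard conditions are taken from the log, the module arguments are illustrative assumptions.

    - name: Enable and start wpa_supplicant
      ansible.builtin.service:        # the log loads the 'service' action plugin for this task
        name: wpa_supplicant          # hypothetical unit name
        state: started
        enabled: true
      when:
        - network_provider == "nm"
        - __network_wpa_supplicant_required | bool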
44071 1727204686.00654: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204686.00845: in run() - task 127b8e07-fff9-c964-7471-000000001848 44071 1727204686.00920: variable 'ansible_search_path' from source: unknown 44071 1727204686.00963: variable 'ansible_search_path' from source: unknown 44071 1727204686.00998: calling self._execute() 44071 1727204686.01212: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204686.01228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204686.01250: variable 'omit' from source: magic vars 44071 1727204686.02071: variable 'ansible_distribution_major_version' from source: facts 44071 1727204686.02075: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204686.02176: variable 'network_provider' from source: set_fact 44071 1727204686.02188: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204686.02213: when evaluation is False, skipping this task 44071 1727204686.02226: _execute() done 44071 1727204686.02236: dumping result to json 44071 1727204686.02247: done dumping result, returning 44071 1727204686.02260: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-c964-7471-000000001848] 44071 1727204686.02272: sending task result for task 127b8e07-fff9-c964-7471-000000001848 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204686.02479: no more pending results, returning what we have 44071 1727204686.02483: results queue empty 44071 1727204686.02485: checking for any_errors_fatal 44071 1727204686.02497: done checking for any_errors_fatal 44071 1727204686.02498: checking for max_fail_percentage 44071 1727204686.02500: done checking for max_fail_percentage 44071 1727204686.02501: checking to see if all hosts have failed and the running result is not ok 44071 1727204686.02502: done checking to see if all hosts have failed 44071 1727204686.02503: getting the remaining hosts for this loop 44071 1727204686.02505: done getting the remaining hosts for this loop 44071 1727204686.02510: getting the next task for host managed-node2 44071 1727204686.02521: done getting next task for host managed-node2 44071 1727204686.02525: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204686.02531: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204686.02561: getting variables 44071 1727204686.02563: in VariableManager get_vars() 44071 1727204686.02614: Calling all_inventory to load vars for managed-node2 44071 1727204686.02617: Calling groups_inventory to load vars for managed-node2 44071 1727204686.02619: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204686.02634: Calling all_plugins_play to load vars for managed-node2 44071 1727204686.02638: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204686.02641: Calling groups_plugins_play to load vars for managed-node2 44071 1727204686.03384: done sending task result for task 127b8e07-fff9-c964-7471-000000001848 44071 1727204686.03388: WORKER PROCESS EXITING 44071 1727204686.05151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204686.07504: done with get_vars() 44071 1727204686.07543: done getting variables 44071 1727204686.07620: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:04:46 -0400 (0:00:00.078) 0:01:38.393 ***** 44071 1727204686.07667: entering _queue_task() for managed-node2/copy 44071 1727204686.08092: worker is 1 (out of 1 available) 44071 1727204686.08108: exiting _queue_task() for managed-node2/copy 44071 1727204686.08128: done queuing things up, now waiting for results queue to drain 44071 1727204686.08130: waiting for pending results... 
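The "Enable network service" task above is skipped because the provider guard network_provider == "initscripts" is False on this run (the provider is nm), and the entries that follow show the same guard skipping the initscripts file-dependency copy task. A sketch of a task behind that guard, with a hypothetical service name since the real task body is not shown in this log:

    - name: Enable network service
      ansible.builtin.service:
        name: network                 # hypothetical; only relevant to the initscripts provider
        state: started
        enabled: true
      when: network_provider == "initscripts"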
44071 1727204686.08723: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204686.09117: in run() - task 127b8e07-fff9-c964-7471-000000001849 44071 1727204686.09134: variable 'ansible_search_path' from source: unknown 44071 1727204686.09139: variable 'ansible_search_path' from source: unknown 44071 1727204686.09309: calling self._execute() 44071 1727204686.09530: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204686.09535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204686.09546: variable 'omit' from source: magic vars 44071 1727204686.10455: variable 'ansible_distribution_major_version' from source: facts 44071 1727204686.10484: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204686.10618: variable 'network_provider' from source: set_fact 44071 1727204686.10622: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204686.10627: when evaluation is False, skipping this task 44071 1727204686.10630: _execute() done 44071 1727204686.10633: dumping result to json 44071 1727204686.10722: done dumping result, returning 44071 1727204686.10727: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-c964-7471-000000001849] 44071 1727204686.10729: sending task result for task 127b8e07-fff9-c964-7471-000000001849 44071 1727204686.10816: done sending task result for task 127b8e07-fff9-c964-7471-000000001849 44071 1727204686.10819: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 44071 1727204686.10880: no more pending results, returning what we have 44071 1727204686.10884: results queue empty 44071 1727204686.10886: checking for any_errors_fatal 44071 1727204686.10892: done checking for any_errors_fatal 44071 1727204686.10893: checking for max_fail_percentage 44071 1727204686.10895: done checking for max_fail_percentage 44071 1727204686.10896: checking to see if all hosts have failed and the running result is not ok 44071 1727204686.10897: done checking to see if all hosts have failed 44071 1727204686.10898: getting the remaining hosts for this loop 44071 1727204686.10899: done getting the remaining hosts for this loop 44071 1727204686.10904: getting the next task for host managed-node2 44071 1727204686.10913: done getting next task for host managed-node2 44071 1727204686.10918: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204686.10923: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204686.10947: getting variables 44071 1727204686.10948: in VariableManager get_vars() 44071 1727204686.10990: Calling all_inventory to load vars for managed-node2 44071 1727204686.10992: Calling groups_inventory to load vars for managed-node2 44071 1727204686.10995: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204686.11006: Calling all_plugins_play to load vars for managed-node2 44071 1727204686.11010: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204686.11014: Calling groups_plugins_play to load vars for managed-node2 44071 1727204686.13150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204686.17418: done with get_vars() 44071 1727204686.17466: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:04:46 -0400 (0:00:00.099) 0:01:38.492 ***** 44071 1727204686.17595: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204686.18148: worker is 1 (out of 1 available) 44071 1727204686.18164: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204686.18188: done queuing things up, now waiting for results queue to drain 44071 1727204686.18192: waiting for pending results... 
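The task queued above, "Configure networking connection profiles", runs the collection's network_connections module; the following entries resolve the network_connections variable from include params and the interface variable from play vars before building the module payload. Their actual values are not visible in this log, so the sketch below is a hypothetical example of the kind of play-level input that feeds this task.

    - hosts: managed-node2
      vars:
        interface: veth0              # hypothetical; the real interface name is not shown in the log
        network_connections:
          - name: "{{ interface }}"
            type: ethernet            # hypothetical profile settings
            state: up
      roles:
        - fedora.linux_system_roles.network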
44071 1727204686.18785: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204686.18939: in run() - task 127b8e07-fff9-c964-7471-00000000184a 44071 1727204686.18944: variable 'ansible_search_path' from source: unknown 44071 1727204686.18947: variable 'ansible_search_path' from source: unknown 44071 1727204686.18950: calling self._execute() 44071 1727204686.19028: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204686.19048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204686.19063: variable 'omit' from source: magic vars 44071 1727204686.19507: variable 'ansible_distribution_major_version' from source: facts 44071 1727204686.19526: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204686.19536: variable 'omit' from source: magic vars 44071 1727204686.19628: variable 'omit' from source: magic vars 44071 1727204686.19817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204686.23769: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204686.23990: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204686.24041: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204686.24085: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204686.24130: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204686.24408: variable 'network_provider' from source: set_fact 44071 1727204686.24696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204686.24735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204686.24775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204686.24861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204686.24884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204686.24981: variable 'omit' from source: magic vars 44071 1727204686.25114: variable 'omit' from source: magic vars 44071 1727204686.25291: variable 'network_connections' from source: include params 44071 1727204686.25295: variable 'interface' from source: play vars 44071 1727204686.25336: variable 'interface' from source: play vars 44071 1727204686.25555: variable 'omit' from source: magic vars 44071 1727204686.25571: variable '__lsr_ansible_managed' from source: task vars 44071 1727204686.25645: variable '__lsr_ansible_managed' from source: 
task vars 44071 1727204686.25880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 44071 1727204686.26303: Loaded config def from plugin (lookup/template) 44071 1727204686.26306: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 44071 1727204686.26313: File lookup term: get_ansible_managed.j2 44071 1727204686.26316: variable 'ansible_search_path' from source: unknown 44071 1727204686.26319: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 44071 1727204686.26325: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 44071 1727204686.26330: variable 'ansible_search_path' from source: unknown 44071 1727204686.48758: variable 'ansible_managed' from source: unknown 44071 1727204686.48885: variable 'omit' from source: magic vars 44071 1727204686.48908: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204686.48927: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204686.48940: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204686.48952: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204686.48960: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204686.48977: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204686.48980: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204686.48983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204686.49052: Set connection var ansible_connection to ssh 44071 1727204686.49057: Set connection var ansible_timeout to 10 44071 1727204686.49063: Set connection var ansible_pipelining to False 44071 1727204686.49070: Set connection var ansible_shell_type to sh 44071 1727204686.49076: Set connection var ansible_shell_executable to /bin/sh 44071 1727204686.49082: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204686.49103: variable 'ansible_shell_executable' from source: unknown 44071 1727204686.49107: variable 'ansible_connection' from source: unknown 44071 1727204686.49110: variable 'ansible_module_compression' 
from source: unknown 44071 1727204686.49113: variable 'ansible_shell_type' from source: unknown 44071 1727204686.49117: variable 'ansible_shell_executable' from source: unknown 44071 1727204686.49119: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204686.49122: variable 'ansible_pipelining' from source: unknown 44071 1727204686.49124: variable 'ansible_timeout' from source: unknown 44071 1727204686.49126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204686.49243: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204686.49257: variable 'omit' from source: magic vars 44071 1727204686.49259: starting attempt loop 44071 1727204686.49262: running the handler 44071 1727204686.49270: _low_level_execute_command(): starting 44071 1727204686.49273: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204686.49985: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204686.49991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204686.49994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204686.50006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204686.50013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204686.50017: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204686.50020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204686.50076: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204686.50079: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204686.50082: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44071 1727204686.50084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204686.50087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204686.50092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204686.50101: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204686.50116: stderr chunk (state=3): >>>debug2: match found <<< 44071 1727204686.50125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204686.50186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204686.50213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204686.50216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204686.50379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204686.52242: stdout chunk (state=3): >>>/root <<< 44071 1727204686.52246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204686.52469: stderr chunk (state=3): >>><<< 
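The debug1/debug2 lines above ("auto-mux: Trying existing master", "mux_client_hello_exchange", "mux_client_request_session") show each remote command reusing an existing OpenSSH ControlMaster socket under /root/.ansible/cp/, so no new SSH handshake is needed per command. The exact ssh_args for this run are not recorded here; an illustrative way to request that behaviour explicitly through inventory variables would be:

    all:
      vars:
        ansible_ssh_common_args: >-
          -o ControlMaster=auto
          -o ControlPersist=60s
          -o ControlPath=~/.ansible/cp/%C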
44071 1727204686.52475: stdout chunk (state=3): >>><<< 44071 1727204686.52651: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204686.52655: _low_level_execute_command(): starting 44071 1727204686.52661: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204686.5252178-49915-246058736295662 `" && echo ansible-tmp-1727204686.5252178-49915-246058736295662="` echo /root/.ansible/tmp/ansible-tmp-1727204686.5252178-49915-246058736295662 `" ) && sleep 0' 44071 1727204686.53423: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204686.53498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204686.53532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204686.53560: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204686.53579: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204686.53690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204686.55703: stdout chunk (state=3): >>>ansible-tmp-1727204686.5252178-49915-246058736295662=/root/.ansible/tmp/ansible-tmp-1727204686.5252178-49915-246058736295662 <<< 44071 1727204686.56130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204686.56134: stdout chunk (state=3): >>><<< 44071 
1727204686.56136: stderr chunk (state=3): >>><<< 44071 1727204686.56140: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204686.5252178-49915-246058736295662=/root/.ansible/tmp/ansible-tmp-1727204686.5252178-49915-246058736295662 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204686.56147: variable 'ansible_module_compression' from source: unknown 44071 1727204686.56195: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 44071 1727204686.56258: variable 'ansible_facts' from source: unknown 44071 1727204686.56384: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204686.5252178-49915-246058736295662/AnsiballZ_network_connections.py 44071 1727204686.56588: Sending initial data 44071 1727204686.56591: Sent initial data (168 bytes) 44071 1727204686.57271: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204686.57286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204686.57347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204686.57421: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204686.57454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204686.57484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204686.57589: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
44071 1727204686.59396: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204686.59430: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204686.59519: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpf2mtxxy0 /root/.ansible/tmp/ansible-tmp-1727204686.5252178-49915-246058736295662/AnsiballZ_network_connections.py <<< 44071 1727204686.59523: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204686.5252178-49915-246058736295662/AnsiballZ_network_connections.py" <<< 44071 1727204686.59591: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpf2mtxxy0" to remote "/root/.ansible/tmp/ansible-tmp-1727204686.5252178-49915-246058736295662/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204686.5252178-49915-246058736295662/AnsiballZ_network_connections.py" <<< 44071 1727204686.61905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204686.62072: stderr chunk (state=3): >>><<< 44071 1727204686.62076: stdout chunk (state=3): >>><<< 44071 1727204686.62078: done transferring module to remote 44071 1727204686.62080: _low_level_execute_command(): starting 44071 1727204686.62083: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204686.5252178-49915-246058736295662/ /root/.ansible/tmp/ansible-tmp-1727204686.5252178-49915-246058736295662/AnsiballZ_network_connections.py && sleep 0' 44071 1727204686.63273: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204686.63278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204686.63286: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204686.63298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204686.63487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204686.63498: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204686.63560: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204686.63563: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204686.63593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204686.63693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204686.65888: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204686.65893: stdout chunk (state=3): >>><<< 44071 1727204686.65912: stderr chunk (state=3): >>><<< 44071 1727204686.65916: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204686.65919: _low_level_execute_command(): starting 44071 1727204686.65954: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204686.5252178-49915-246058736295662/AnsiballZ_network_connections.py && sleep 0' 44071 1727204686.66801: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204686.66805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204686.66846: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204686.66850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204686.66852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204686.66943: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' 
<<< 44071 1727204686.66946: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204686.66980: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204686.67063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204686.96908: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 2461fed0-dcf1-466d-b59f-3f5d810ecefa\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 44071 1727204686.99255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204686.99351: stderr chunk (state=3): >>><<< 44071 1727204686.99357: stdout chunk (state=3): >>><<< 44071 1727204686.99431: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 2461fed0-dcf1-466d-b59f-3f5d810ecefa\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
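The module result above echoes the full invocation of fedora.linux_system_roles.network_connections: provider nm and a single bridge profile named statebr with dhcp4 and auto6 disabled. A minimal sketch of playbook variables that would drive the fedora.linux_system_roles.network role to produce this invocation, assuming the role's documented interface (the play below is a reconstruction for illustration, not the test playbook used in this run):

    # Hypothetical play mirroring the module_args logged above
    - hosts: managed-node2
      vars:
        network_provider: nm
        network_connections:
          - name: statebr
            type: bridge
            persistent_state: present
            ip:
              dhcp4: false
              auto6: false
      roles:
        - fedora.linux_system_roles.network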
44071 1727204686.99440: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204686.5252178-49915-246058736295662/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204686.99449: _low_level_execute_command(): starting 44071 1727204686.99456: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204686.5252178-49915-246058736295662/ > /dev/null 2>&1 && sleep 0' 44071 1727204687.00062: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204687.00068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204687.00071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204687.00073: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204687.00075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204687.00135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204687.00158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204687.00252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204687.04206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204687.04289: stderr chunk (state=3): >>><<< 44071 1727204687.04293: stdout chunk (state=3): >>><<< 44071 1727204687.04313: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204687.04321: handler run complete 44071 1727204687.04351: attempt loop complete, returning result 44071 1727204687.04354: _execute() done 44071 1727204687.04357: dumping result to json 44071 1727204687.04361: done dumping result, returning 44071 1727204687.04371: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-c964-7471-00000000184a] 44071 1727204687.04374: sending task result for task 127b8e07-fff9-c964-7471-00000000184a 44071 1727204687.04492: done sending task result for task 127b8e07-fff9-c964-7471-00000000184a 44071 1727204687.04495: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 2461fed0-dcf1-466d-b59f-3f5d810ecefa 44071 1727204687.04624: no more pending results, returning what we have 44071 1727204687.04627: results queue empty 44071 1727204687.04628: checking for any_errors_fatal 44071 1727204687.04637: done checking for any_errors_fatal 44071 1727204687.04638: checking for max_fail_percentage 44071 1727204687.04640: done checking for max_fail_percentage 44071 1727204687.04641: checking to see if all hosts have failed and the running result is not ok 44071 1727204687.04641: done checking to see if all hosts have failed 44071 1727204687.04642: getting the remaining hosts for this loop 44071 1727204687.04644: done getting the remaining hosts for this loop 44071 1727204687.04647: getting the next task for host managed-node2 44071 1727204687.04656: done getting next task for host managed-node2 44071 1727204687.04660: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204687.04665: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204687.04682: getting variables 44071 1727204687.04683: in VariableManager get_vars() 44071 1727204687.04729: Calling all_inventory to load vars for managed-node2 44071 1727204687.04731: Calling groups_inventory to load vars for managed-node2 44071 1727204687.04734: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204687.04743: Calling all_plugins_play to load vars for managed-node2 44071 1727204687.04746: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204687.04749: Calling groups_plugins_play to load vars for managed-node2 44071 1727204687.05839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204687.07039: done with get_vars() 44071 1727204687.07067: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:04:47 -0400 (0:00:00.895) 0:01:39.387 ***** 44071 1727204687.07142: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204687.07523: worker is 1 (out of 1 available) 44071 1727204687.07539: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204687.07556: done queuing things up, now waiting for results queue to drain 44071 1727204687.07559: waiting for pending results... 
44071 1727204687.07915: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204687.08043: in run() - task 127b8e07-fff9-c964-7471-00000000184b 44071 1727204687.08070: variable 'ansible_search_path' from source: unknown 44071 1727204687.08076: variable 'ansible_search_path' from source: unknown 44071 1727204687.08139: calling self._execute() 44071 1727204687.08209: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204687.08213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204687.08231: variable 'omit' from source: magic vars 44071 1727204687.08602: variable 'ansible_distribution_major_version' from source: facts 44071 1727204687.08605: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204687.08702: variable 'network_state' from source: role '' defaults 44071 1727204687.08710: Evaluated conditional (network_state != {}): False 44071 1727204687.08715: when evaluation is False, skipping this task 44071 1727204687.08718: _execute() done 44071 1727204687.08721: dumping result to json 44071 1727204687.08724: done dumping result, returning 44071 1727204687.08741: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-c964-7471-00000000184b] 44071 1727204687.08743: sending task result for task 127b8e07-fff9-c964-7471-00000000184b 44071 1727204687.08855: done sending task result for task 127b8e07-fff9-c964-7471-00000000184b 44071 1727204687.08858: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204687.08967: no more pending results, returning what we have 44071 1727204687.08975: results queue empty 44071 1727204687.08976: checking for any_errors_fatal 44071 1727204687.08990: done checking for any_errors_fatal 44071 1727204687.08992: checking for max_fail_percentage 44071 1727204687.08994: done checking for max_fail_percentage 44071 1727204687.08995: checking to see if all hosts have failed and the running result is not ok 44071 1727204687.08996: done checking to see if all hosts have failed 44071 1727204687.08996: getting the remaining hosts for this loop 44071 1727204687.08998: done getting the remaining hosts for this loop 44071 1727204687.09004: getting the next task for host managed-node2 44071 1727204687.09014: done getting next task for host managed-node2 44071 1727204687.09018: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204687.09023: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204687.09045: getting variables 44071 1727204687.09047: in VariableManager get_vars() 44071 1727204687.09156: Calling all_inventory to load vars for managed-node2 44071 1727204687.09160: Calling groups_inventory to load vars for managed-node2 44071 1727204687.09162: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204687.09177: Calling all_plugins_play to load vars for managed-node2 44071 1727204687.09183: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204687.09187: Calling groups_plugins_play to load vars for managed-node2 44071 1727204687.11335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204687.12678: done with get_vars() 44071 1727204687.12708: done getting variables 44071 1727204687.12757: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:04:47 -0400 (0:00:00.056) 0:01:39.444 ***** 44071 1727204687.12792: entering _queue_task() for managed-node2/debug 44071 1727204687.13097: worker is 1 (out of 1 available) 44071 1727204687.13112: exiting _queue_task() for managed-node2/debug 44071 1727204687.13126: done queuing things up, now waiting for results queue to drain 44071 1727204687.13128: waiting for pending results... 
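The Configure networking state task above was skipped because network_state resolves from the role defaults to an empty dict, so the conditional network_state != {} evaluated to False. A sketch of a variable assignment that would make that conditional true; the interface keys below follow the nmstate-style schema the role forwards to its network_state module and are illustrative only, not taken from this run:

    # Hypothetical non-empty network_state that would trigger the task
    network_state:
      interfaces:
        - name: statebr
          type: linux-bridge
          state: up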
44071 1727204687.13583: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204687.13604: in run() - task 127b8e07-fff9-c964-7471-00000000184c 44071 1727204687.13629: variable 'ansible_search_path' from source: unknown 44071 1727204687.13641: variable 'ansible_search_path' from source: unknown 44071 1727204687.13690: calling self._execute() 44071 1727204687.13799: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204687.13812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204687.13827: variable 'omit' from source: magic vars 44071 1727204687.14431: variable 'ansible_distribution_major_version' from source: facts 44071 1727204687.14454: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204687.14524: variable 'omit' from source: magic vars 44071 1727204687.14635: variable 'omit' from source: magic vars 44071 1727204687.14734: variable 'omit' from source: magic vars 44071 1727204687.14895: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204687.14899: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204687.14935: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204687.14960: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204687.14980: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204687.15016: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204687.15025: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204687.15034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204687.15184: Set connection var ansible_connection to ssh 44071 1727204687.15197: Set connection var ansible_timeout to 10 44071 1727204687.15206: Set connection var ansible_pipelining to False 44071 1727204687.15215: Set connection var ansible_shell_type to sh 44071 1727204687.15271: Set connection var ansible_shell_executable to /bin/sh 44071 1727204687.15275: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204687.15277: variable 'ansible_shell_executable' from source: unknown 44071 1727204687.15279: variable 'ansible_connection' from source: unknown 44071 1727204687.15282: variable 'ansible_module_compression' from source: unknown 44071 1727204687.15288: variable 'ansible_shell_type' from source: unknown 44071 1727204687.15297: variable 'ansible_shell_executable' from source: unknown 44071 1727204687.15304: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204687.15311: variable 'ansible_pipelining' from source: unknown 44071 1727204687.15318: variable 'ansible_timeout' from source: unknown 44071 1727204687.15325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204687.15558: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 
1727204687.15579: variable 'omit' from source: magic vars 44071 1727204687.15603: starting attempt loop 44071 1727204687.15606: running the handler 44071 1727204687.15872: variable '__network_connections_result' from source: set_fact 44071 1727204687.15875: handler run complete 44071 1727204687.15881: attempt loop complete, returning result 44071 1727204687.15888: _execute() done 44071 1727204687.15897: dumping result to json 44071 1727204687.15904: done dumping result, returning 44071 1727204687.15917: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-c964-7471-00000000184c] 44071 1727204687.15926: sending task result for task 127b8e07-fff9-c964-7471-00000000184c ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 2461fed0-dcf1-466d-b59f-3f5d810ecefa" ] } 44071 1727204687.16133: no more pending results, returning what we have 44071 1727204687.16137: results queue empty 44071 1727204687.16138: checking for any_errors_fatal 44071 1727204687.16146: done checking for any_errors_fatal 44071 1727204687.16147: checking for max_fail_percentage 44071 1727204687.16148: done checking for max_fail_percentage 44071 1727204687.16149: checking to see if all hosts have failed and the running result is not ok 44071 1727204687.16150: done checking to see if all hosts have failed 44071 1727204687.16151: getting the remaining hosts for this loop 44071 1727204687.16152: done getting the remaining hosts for this loop 44071 1727204687.16157: getting the next task for host managed-node2 44071 1727204687.16168: done getting next task for host managed-node2 44071 1727204687.16173: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204687.16179: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204687.16193: getting variables 44071 1727204687.16195: in VariableManager get_vars() 44071 1727204687.16360: Calling all_inventory to load vars for managed-node2 44071 1727204687.16362: Calling groups_inventory to load vars for managed-node2 44071 1727204687.16369: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204687.16377: done sending task result for task 127b8e07-fff9-c964-7471-00000000184c 44071 1727204687.16381: WORKER PROCESS EXITING 44071 1727204687.16392: Calling all_plugins_play to load vars for managed-node2 44071 1727204687.16396: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204687.16399: Calling groups_plugins_play to load vars for managed-node2 44071 1727204687.19599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204687.32009: done with get_vars() 44071 1727204687.32058: done getting variables 44071 1727204687.32322: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:04:47 -0400 (0:00:00.195) 0:01:39.640 ***** 44071 1727204687.32371: entering _queue_task() for managed-node2/debug 44071 1727204687.32899: worker is 1 (out of 1 available) 44071 1727204687.32915: exiting _queue_task() for managed-node2/debug 44071 1727204687.32928: done queuing things up, now waiting for results queue to drain 44071 1727204687.32930: waiting for pending results... 
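The ok: result above comes from a debug action that prints __network_connections_result.stderr_lines, the fact recorded after the Configure networking connection profiles task. Reconstructed from the logged task name and output (not copied from the role source), the task at roles/network/tasks/main.yml:177 likely resembles:

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines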
44071 1727204687.33693: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204687.33956: in run() - task 127b8e07-fff9-c964-7471-00000000184d 44071 1727204687.33988: variable 'ansible_search_path' from source: unknown 44071 1727204687.33998: variable 'ansible_search_path' from source: unknown 44071 1727204687.34056: calling self._execute() 44071 1727204687.34183: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204687.34199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204687.34214: variable 'omit' from source: magic vars 44071 1727204687.34711: variable 'ansible_distribution_major_version' from source: facts 44071 1727204687.34745: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204687.34758: variable 'omit' from source: magic vars 44071 1727204687.34860: variable 'omit' from source: magic vars 44071 1727204687.34937: variable 'omit' from source: magic vars 44071 1727204687.35011: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204687.35062: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204687.35096: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204687.35224: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204687.35228: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204687.35231: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204687.35237: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204687.35240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204687.35360: Set connection var ansible_connection to ssh 44071 1727204687.35375: Set connection var ansible_timeout to 10 44071 1727204687.35385: Set connection var ansible_pipelining to False 44071 1727204687.35394: Set connection var ansible_shell_type to sh 44071 1727204687.35405: Set connection var ansible_shell_executable to /bin/sh 44071 1727204687.35418: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204687.35461: variable 'ansible_shell_executable' from source: unknown 44071 1727204687.35474: variable 'ansible_connection' from source: unknown 44071 1727204687.35482: variable 'ansible_module_compression' from source: unknown 44071 1727204687.35551: variable 'ansible_shell_type' from source: unknown 44071 1727204687.35554: variable 'ansible_shell_executable' from source: unknown 44071 1727204687.35557: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204687.35560: variable 'ansible_pipelining' from source: unknown 44071 1727204687.35563: variable 'ansible_timeout' from source: unknown 44071 1727204687.35568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204687.35711: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 
1727204687.35730: variable 'omit' from source: magic vars 44071 1727204687.35744: starting attempt loop 44071 1727204687.35751: running the handler 44071 1727204687.35815: variable '__network_connections_result' from source: set_fact 44071 1727204687.35924: variable '__network_connections_result' from source: set_fact 44071 1727204687.36091: handler run complete 44071 1727204687.36199: attempt loop complete, returning result 44071 1727204687.36203: _execute() done 44071 1727204687.36206: dumping result to json 44071 1727204687.36208: done dumping result, returning 44071 1727204687.36211: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-c964-7471-00000000184d] 44071 1727204687.36213: sending task result for task 127b8e07-fff9-c964-7471-00000000184d ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 2461fed0-dcf1-466d-b59f-3f5d810ecefa\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 2461fed0-dcf1-466d-b59f-3f5d810ecefa" ] } } 44071 1727204687.36486: no more pending results, returning what we have 44071 1727204687.36491: results queue empty 44071 1727204687.36492: checking for any_errors_fatal 44071 1727204687.36500: done checking for any_errors_fatal 44071 1727204687.36501: checking for max_fail_percentage 44071 1727204687.36503: done checking for max_fail_percentage 44071 1727204687.36504: checking to see if all hosts have failed and the running result is not ok 44071 1727204687.36505: done checking to see if all hosts have failed 44071 1727204687.36505: getting the remaining hosts for this loop 44071 1727204687.36507: done getting the remaining hosts for this loop 44071 1727204687.36514: getting the next task for host managed-node2 44071 1727204687.36525: done getting next task for host managed-node2 44071 1727204687.36529: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204687.36537: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204687.36556: getting variables 44071 1727204687.36557: in VariableManager get_vars() 44071 1727204687.36810: Calling all_inventory to load vars for managed-node2 44071 1727204687.36813: Calling groups_inventory to load vars for managed-node2 44071 1727204687.36824: done sending task result for task 127b8e07-fff9-c964-7471-00000000184d 44071 1727204687.36840: WORKER PROCESS EXITING 44071 1727204687.36835: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204687.36854: Calling all_plugins_play to load vars for managed-node2 44071 1727204687.36858: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204687.36861: Calling groups_plugins_play to load vars for managed-node2 44071 1727204687.40017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204687.43231: done with get_vars() 44071 1727204687.43281: done getting variables 44071 1727204687.43353: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:04:47 -0400 (0:00:00.110) 0:01:39.750 ***** 44071 1727204687.43397: entering _queue_task() for managed-node2/debug 44071 1727204687.44602: worker is 1 (out of 1 available) 44071 1727204687.44625: exiting _queue_task() for managed-node2/debug 44071 1727204687.44641: done queuing things up, now waiting for results queue to drain 44071 1727204687.44643: waiting for pending results... 
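The full __network_connections_result fact printed above carries changed, failed, stderr and stderr_lines alongside the echoed invocation. As a hypothetical follow-up (not part of this test run), a playbook consuming that fact could assert on those fields directly:

    # Illustrative verification task using the result structure shown above
    - name: Verify the statebr profile was added
      ansible.builtin.assert:
        that:
          - __network_connections_result.changed
          - not __network_connections_result.failed
          - __network_connections_result.stderr_lines | length > 0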
44071 1727204687.45290: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204687.45615: in run() - task 127b8e07-fff9-c964-7471-00000000184e 44071 1727204687.45745: variable 'ansible_search_path' from source: unknown 44071 1727204687.45750: variable 'ansible_search_path' from source: unknown 44071 1727204687.45795: calling self._execute() 44071 1727204687.46073: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204687.46077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204687.46079: variable 'omit' from source: magic vars 44071 1727204687.47220: variable 'ansible_distribution_major_version' from source: facts 44071 1727204687.47225: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204687.47476: variable 'network_state' from source: role '' defaults 44071 1727204687.47492: Evaluated conditional (network_state != {}): False 44071 1727204687.47496: when evaluation is False, skipping this task 44071 1727204687.47499: _execute() done 44071 1727204687.47501: dumping result to json 44071 1727204687.47504: done dumping result, returning 44071 1727204687.47519: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-c964-7471-00000000184e] 44071 1727204687.47522: sending task result for task 127b8e07-fff9-c964-7471-00000000184e 44071 1727204687.47848: done sending task result for task 127b8e07-fff9-c964-7471-00000000184e 44071 1727204687.47852: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 44071 1727204687.47905: no more pending results, returning what we have 44071 1727204687.47909: results queue empty 44071 1727204687.47910: checking for any_errors_fatal 44071 1727204687.47922: done checking for any_errors_fatal 44071 1727204687.47923: checking for max_fail_percentage 44071 1727204687.47924: done checking for max_fail_percentage 44071 1727204687.47925: checking to see if all hosts have failed and the running result is not ok 44071 1727204687.47926: done checking to see if all hosts have failed 44071 1727204687.47927: getting the remaining hosts for this loop 44071 1727204687.47929: done getting the remaining hosts for this loop 44071 1727204687.47937: getting the next task for host managed-node2 44071 1727204687.47946: done getting next task for host managed-node2 44071 1727204687.48063: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204687.48071: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204687.48092: getting variables 44071 1727204687.48094: in VariableManager get_vars() 44071 1727204687.48139: Calling all_inventory to load vars for managed-node2 44071 1727204687.48142: Calling groups_inventory to load vars for managed-node2 44071 1727204687.48144: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204687.48154: Calling all_plugins_play to load vars for managed-node2 44071 1727204687.48157: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204687.48159: Calling groups_plugins_play to load vars for managed-node2 44071 1727204687.53101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204687.58043: done with get_vars() 44071 1727204687.58216: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:04:47 -0400 (0:00:00.149) 0:01:39.900 ***** 44071 1727204687.58383: entering _queue_task() for managed-node2/ping 44071 1727204687.59428: worker is 1 (out of 1 available) 44071 1727204687.59501: exiting _queue_task() for managed-node2/ping 44071 1727204687.59516: done queuing things up, now waiting for results queue to drain 44071 1727204687.59518: waiting for pending results... 44071 1727204687.59927: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204687.60338: in run() - task 127b8e07-fff9-c964-7471-00000000184f 44071 1727204687.60342: variable 'ansible_search_path' from source: unknown 44071 1727204687.60345: variable 'ansible_search_path' from source: unknown 44071 1727204687.60348: calling self._execute() 44071 1727204687.60706: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204687.60710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204687.60713: variable 'omit' from source: magic vars 44071 1727204687.61627: variable 'ansible_distribution_major_version' from source: facts 44071 1727204687.61635: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204687.61639: variable 'omit' from source: magic vars 44071 1727204687.61777: variable 'omit' from source: magic vars 44071 1727204687.61782: variable 'omit' from source: magic vars 44071 1727204687.61848: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204687.61852: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204687.61869: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204687.61910: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204687.61924: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204687.61957: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204687.61962: variable 'ansible_host' from source: host vars for 
'managed-node2' 44071 1727204687.61967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204687.62288: Set connection var ansible_connection to ssh 44071 1727204687.62330: Set connection var ansible_timeout to 10 44071 1727204687.62336: Set connection var ansible_pipelining to False 44071 1727204687.62338: Set connection var ansible_shell_type to sh 44071 1727204687.62341: Set connection var ansible_shell_executable to /bin/sh 44071 1727204687.62343: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204687.62350: variable 'ansible_shell_executable' from source: unknown 44071 1727204687.62353: variable 'ansible_connection' from source: unknown 44071 1727204687.62356: variable 'ansible_module_compression' from source: unknown 44071 1727204687.62358: variable 'ansible_shell_type' from source: unknown 44071 1727204687.62361: variable 'ansible_shell_executable' from source: unknown 44071 1727204687.62366: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204687.62370: variable 'ansible_pipelining' from source: unknown 44071 1727204687.62375: variable 'ansible_timeout' from source: unknown 44071 1727204687.62379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204687.62817: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204687.62831: variable 'omit' from source: magic vars 44071 1727204687.62837: starting attempt loop 44071 1727204687.62839: running the handler 44071 1727204687.62854: _low_level_execute_command(): starting 44071 1727204687.62863: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204687.63989: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204687.64013: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204687.64093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204687.64150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204687.64180: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204687.64199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204687.64350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204687.66308: stdout chunk (state=3): >>>/root <<< 44071 1727204687.66404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 44071 1727204687.66410: stdout chunk (state=3): >>><<< 44071 1727204687.66421: stderr chunk (state=3): >>><<< 44071 1727204687.66449: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204687.66467: _low_level_execute_command(): starting 44071 1727204687.66476: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204687.6644945-49973-127479769451389 `" && echo ansible-tmp-1727204687.6644945-49973-127479769451389="` echo /root/.ansible/tmp/ansible-tmp-1727204687.6644945-49973-127479769451389 `" ) && sleep 0' 44071 1727204687.67611: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204687.67618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204687.67864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204687.67871: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204687.67874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204687.68000: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204687.68017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204687.70016: stdout chunk (state=3): >>>ansible-tmp-1727204687.6644945-49973-127479769451389=/root/.ansible/tmp/ansible-tmp-1727204687.6644945-49973-127479769451389 <<< 44071 1727204687.70249: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 44071 1727204687.70254: stdout chunk (state=3): >>><<< 44071 1727204687.70257: stderr chunk (state=3): >>><<< 44071 1727204687.70473: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204687.6644945-49973-127479769451389=/root/.ansible/tmp/ansible-tmp-1727204687.6644945-49973-127479769451389 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204687.70477: variable 'ansible_module_compression' from source: unknown 44071 1727204687.70480: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 44071 1727204687.70482: variable 'ansible_facts' from source: unknown 44071 1727204687.70551: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204687.6644945-49973-127479769451389/AnsiballZ_ping.py 44071 1727204687.70739: Sending initial data 44071 1727204687.70841: Sent initial data (153 bytes) 44071 1727204687.72286: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204687.72318: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204687.72388: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204687.72510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204687.74418: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204687.74469: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204687.74568: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpx75sub3t /root/.ansible/tmp/ansible-tmp-1727204687.6644945-49973-127479769451389/AnsiballZ_ping.py <<< 44071 1727204687.74571: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204687.6644945-49973-127479769451389/AnsiballZ_ping.py" <<< 44071 1727204687.74661: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpx75sub3t" to remote "/root/.ansible/tmp/ansible-tmp-1727204687.6644945-49973-127479769451389/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204687.6644945-49973-127479769451389/AnsiballZ_ping.py" <<< 44071 1727204687.76135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204687.76437: stderr chunk (state=3): >>><<< 44071 1727204687.76442: stdout chunk (state=3): >>><<< 44071 1727204687.76444: done transferring module to remote 44071 1727204687.76447: _low_level_execute_command(): starting 44071 1727204687.76449: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204687.6644945-49973-127479769451389/ /root/.ansible/tmp/ansible-tmp-1727204687.6644945-49973-127479769451389/AnsiballZ_ping.py && sleep 0' 44071 1727204687.77032: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204687.77050: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204687.77070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204687.77091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204687.77108: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204687.77193: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204687.77229: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' <<< 44071 1727204687.77256: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204687.77288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204687.77498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204687.79498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204687.79503: stdout chunk (state=3): >>><<< 44071 1727204687.79505: stderr chunk (state=3): >>><<< 44071 1727204687.79668: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204687.79672: _low_level_execute_command(): starting 44071 1727204687.79676: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204687.6644945-49973-127479769451389/AnsiballZ_ping.py && sleep 0' 44071 1727204687.81091: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204687.81271: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204687.81301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204687.81401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204687.97755: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 44071 1727204687.99217: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204687.99221: stdout chunk (state=3): >>><<< 44071 1727204687.99224: stderr chunk (state=3): >>><<< 44071 1727204687.99249: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204687.99324: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204687.6644945-49973-127479769451389/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204687.99329: _low_level_execute_command(): starting 44071 1727204687.99331: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204687.6644945-49973-127479769451389/ > /dev/null 2>&1 && sleep 0' 44071 1727204687.99935: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204687.99948: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204687.99959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204687.99986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204687.99989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204687.99998: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204688.00007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204688.00086: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204688.00091: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is 
address <<< 44071 1727204688.00094: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44071 1727204688.00097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204688.00099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204688.00101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204688.00103: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204688.00106: stderr chunk (state=3): >>>debug2: match found <<< 44071 1727204688.00108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204688.00202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204688.00206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204688.00214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204688.00327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204688.02280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204688.02334: stderr chunk (state=3): >>><<< 44071 1727204688.02338: stdout chunk (state=3): >>><<< 44071 1727204688.02356: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204688.02363: handler run complete 44071 1727204688.02393: attempt loop complete, returning result 44071 1727204688.02397: _execute() done 44071 1727204688.02400: dumping result to json 44071 1727204688.02402: done dumping result, returning 44071 1727204688.02472: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-c964-7471-00000000184f] 44071 1727204688.02476: sending task result for task 127b8e07-fff9-c964-7471-00000000184f 44071 1727204688.02548: done sending task result for task 127b8e07-fff9-c964-7471-00000000184f 44071 1727204688.02552: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 44071 1727204688.02635: no more pending results, returning what we have 44071 1727204688.02640: results queue empty 44071 1727204688.02641: checking for 
any_errors_fatal 44071 1727204688.02649: done checking for any_errors_fatal 44071 1727204688.02649: checking for max_fail_percentage 44071 1727204688.02651: done checking for max_fail_percentage 44071 1727204688.02652: checking to see if all hosts have failed and the running result is not ok 44071 1727204688.02652: done checking to see if all hosts have failed 44071 1727204688.02653: getting the remaining hosts for this loop 44071 1727204688.02655: done getting the remaining hosts for this loop 44071 1727204688.02659: getting the next task for host managed-node2 44071 1727204688.02674: done getting next task for host managed-node2 44071 1727204688.02677: ^ task is: TASK: meta (role_complete) 44071 1727204688.02684: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204688.02698: getting variables 44071 1727204688.02700: in VariableManager get_vars() 44071 1727204688.02752: Calling all_inventory to load vars for managed-node2 44071 1727204688.02755: Calling groups_inventory to load vars for managed-node2 44071 1727204688.02758: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204688.02992: Calling all_plugins_play to load vars for managed-node2 44071 1727204688.02996: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204688.03000: Calling groups_plugins_play to load vars for managed-node2 44071 1727204688.05332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204688.06683: done with get_vars() 44071 1727204688.06714: done getting variables 44071 1727204688.06791: done queuing things up, now waiting for results queue to drain 44071 1727204688.06793: results queue empty 44071 1727204688.06793: checking for any_errors_fatal 44071 1727204688.06796: done checking for any_errors_fatal 44071 1727204688.06797: checking for max_fail_percentage 44071 1727204688.06798: done checking for max_fail_percentage 44071 1727204688.06799: checking to see if all hosts have failed and the running result is not ok 44071 1727204688.06799: done checking to see if all hosts have failed 44071 1727204688.06800: getting the remaining hosts for this loop 44071 1727204688.06801: done getting the remaining hosts for this loop 44071 1727204688.06803: getting the next task for host managed-node2 44071 1727204688.06807: done getting next task for host managed-node2 44071 1727204688.06809: ^ task is: TASK: Show result 44071 1727204688.06811: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204688.06813: getting variables 44071 1727204688.06815: in VariableManager get_vars() 44071 1727204688.06825: Calling all_inventory to load vars for managed-node2 44071 1727204688.06827: Calling groups_inventory to load vars for managed-node2 44071 1727204688.06828: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204688.06833: Calling all_plugins_play to load vars for managed-node2 44071 1727204688.06835: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204688.06837: Calling groups_plugins_play to load vars for managed-node2 44071 1727204688.07746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204688.09519: done with get_vars() 44071 1727204688.09551: done getting variables 44071 1727204688.09594: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Tuesday 24 September 2024 15:04:48 -0400 (0:00:00.512) 0:01:40.412 ***** 44071 1727204688.09625: entering _queue_task() for managed-node2/debug 44071 1727204688.09939: worker is 1 (out of 1 available) 44071 1727204688.09956: exiting _queue_task() for managed-node2/debug 44071 1727204688.09974: done queuing things up, now waiting for results queue to drain 44071 1727204688.09976: waiting for pending results... 44071 1727204688.10206: running TaskExecutor() for managed-node2/TASK: Show result 44071 1727204688.10292: in run() - task 127b8e07-fff9-c964-7471-0000000017d1 44071 1727204688.10306: variable 'ansible_search_path' from source: unknown 44071 1727204688.10309: variable 'ansible_search_path' from source: unknown 44071 1727204688.10349: calling self._execute() 44071 1727204688.10441: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204688.10450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204688.10456: variable 'omit' from source: magic vars 44071 1727204688.10792: variable 'ansible_distribution_major_version' from source: facts 44071 1727204688.10804: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204688.10810: variable 'omit' from source: magic vars 44071 1727204688.10856: variable 'omit' from source: magic vars 44071 1727204688.10892: variable 'omit' from source: magic vars 44071 1727204688.10930: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204688.10962: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204688.10984: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204688.11003: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204688.11013: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204688.11039: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 
1727204688.11043: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204688.11047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204688.11129: Set connection var ansible_connection to ssh 44071 1727204688.11136: Set connection var ansible_timeout to 10 44071 1727204688.11140: Set connection var ansible_pipelining to False 44071 1727204688.11146: Set connection var ansible_shell_type to sh 44071 1727204688.11151: Set connection var ansible_shell_executable to /bin/sh 44071 1727204688.11158: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204688.11180: variable 'ansible_shell_executable' from source: unknown 44071 1727204688.11183: variable 'ansible_connection' from source: unknown 44071 1727204688.11186: variable 'ansible_module_compression' from source: unknown 44071 1727204688.11188: variable 'ansible_shell_type' from source: unknown 44071 1727204688.11191: variable 'ansible_shell_executable' from source: unknown 44071 1727204688.11195: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204688.11197: variable 'ansible_pipelining' from source: unknown 44071 1727204688.11200: variable 'ansible_timeout' from source: unknown 44071 1727204688.11207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204688.11323: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204688.11337: variable 'omit' from source: magic vars 44071 1727204688.11340: starting attempt loop 44071 1727204688.11343: running the handler 44071 1727204688.11383: variable '__network_connections_result' from source: set_fact 44071 1727204688.11458: variable '__network_connections_result' from source: set_fact 44071 1727204688.11554: handler run complete 44071 1727204688.11576: attempt loop complete, returning result 44071 1727204688.11579: _execute() done 44071 1727204688.11582: dumping result to json 44071 1727204688.11587: done dumping result, returning 44071 1727204688.11594: done running TaskExecutor() for managed-node2/TASK: Show result [127b8e07-fff9-c964-7471-0000000017d1] 44071 1727204688.11598: sending task result for task 127b8e07-fff9-c964-7471-0000000017d1 ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 2461fed0-dcf1-466d-b59f-3f5d810ecefa\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 2461fed0-dcf1-466d-b59f-3f5d810ecefa" ] } } 44071 1727204688.11800: no more pending results, returning what we have 44071 1727204688.11803: results queue empty 44071 1727204688.11804: checking for any_errors_fatal 44071 1727204688.11806: done checking for any_errors_fatal 44071 1727204688.11806: checking for max_fail_percentage 44071 1727204688.11808: done checking for max_fail_percentage 44071 1727204688.11809: 
checking to see if all hosts have failed and the running result is not ok 44071 1727204688.11810: done checking to see if all hosts have failed 44071 1727204688.11811: getting the remaining hosts for this loop 44071 1727204688.11812: done getting the remaining hosts for this loop 44071 1727204688.11817: getting the next task for host managed-node2 44071 1727204688.11830: done getting next task for host managed-node2 44071 1727204688.11835: ^ task is: TASK: Include network role 44071 1727204688.11839: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204688.11843: getting variables 44071 1727204688.11845: in VariableManager get_vars() 44071 1727204688.11955: Calling all_inventory to load vars for managed-node2 44071 1727204688.11958: Calling groups_inventory to load vars for managed-node2 44071 1727204688.11962: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204688.11972: done sending task result for task 127b8e07-fff9-c964-7471-0000000017d1 44071 1727204688.11975: WORKER PROCESS EXITING 44071 1727204688.12094: Calling all_plugins_play to load vars for managed-node2 44071 1727204688.12098: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204688.12102: Calling groups_plugins_play to load vars for managed-node2 44071 1727204688.14376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204688.16879: done with get_vars() 44071 1727204688.16925: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Tuesday 24 September 2024 15:04:48 -0400 (0:00:00.074) 0:01:40.486 ***** 44071 1727204688.17054: entering _queue_task() for managed-node2/include_role 44071 1727204688.17695: worker is 1 (out of 1 available) 44071 1727204688.17708: exiting _queue_task() for managed-node2/include_role 44071 1727204688.17719: done queuing things up, now waiting for results queue to drain 44071 1727204688.17721: waiting for pending results... 
44071 1727204688.17890: running TaskExecutor() for managed-node2/TASK: Include network role 44071 1727204688.18083: in run() - task 127b8e07-fff9-c964-7471-0000000017d5 44071 1727204688.18107: variable 'ansible_search_path' from source: unknown 44071 1727204688.18121: variable 'ansible_search_path' from source: unknown 44071 1727204688.18180: calling self._execute() 44071 1727204688.18342: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204688.18346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204688.18349: variable 'omit' from source: magic vars 44071 1727204688.18836: variable 'ansible_distribution_major_version' from source: facts 44071 1727204688.18929: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204688.18935: _execute() done 44071 1727204688.18937: dumping result to json 44071 1727204688.18940: done dumping result, returning 44071 1727204688.18943: done running TaskExecutor() for managed-node2/TASK: Include network role [127b8e07-fff9-c964-7471-0000000017d5] 44071 1727204688.18945: sending task result for task 127b8e07-fff9-c964-7471-0000000017d5 44071 1727204688.19204: no more pending results, returning what we have 44071 1727204688.19212: in VariableManager get_vars() 44071 1727204688.19388: Calling all_inventory to load vars for managed-node2 44071 1727204688.19392: Calling groups_inventory to load vars for managed-node2 44071 1727204688.19396: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204688.19413: Calling all_plugins_play to load vars for managed-node2 44071 1727204688.19417: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204688.19421: Calling groups_plugins_play to load vars for managed-node2 44071 1727204688.19988: done sending task result for task 127b8e07-fff9-c964-7471-0000000017d5 44071 1727204688.19993: WORKER PROCESS EXITING 44071 1727204688.21799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204688.24279: done with get_vars() 44071 1727204688.24319: variable 'ansible_search_path' from source: unknown 44071 1727204688.24320: variable 'ansible_search_path' from source: unknown 44071 1727204688.24518: variable 'omit' from source: magic vars 44071 1727204688.24575: variable 'omit' from source: magic vars 44071 1727204688.24598: variable 'omit' from source: magic vars 44071 1727204688.24602: we have included files to process 44071 1727204688.24603: generating all_blocks data 44071 1727204688.24605: done generating all_blocks data 44071 1727204688.24612: processing included file: fedora.linux_system_roles.network 44071 1727204688.24638: in VariableManager get_vars() 44071 1727204688.24657: done with get_vars() 44071 1727204688.24700: in VariableManager get_vars() 44071 1727204688.24722: done with get_vars() 44071 1727204688.24777: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 44071 1727204688.24951: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 44071 1727204688.25057: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 44071 1727204688.25693: in VariableManager get_vars() 44071 1727204688.25722: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204688.28349: iterating over new_blocks loaded from 
include file 44071 1727204688.28353: in VariableManager get_vars() 44071 1727204688.28391: done with get_vars() 44071 1727204688.28394: filtering new block on tags 44071 1727204688.28791: done filtering new block on tags 44071 1727204688.28797: in VariableManager get_vars() 44071 1727204688.28830: done with get_vars() 44071 1727204688.28834: filtering new block on tags 44071 1727204688.28857: done filtering new block on tags 44071 1727204688.28860: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 44071 1727204688.28868: extending task lists for all hosts with included blocks 44071 1727204688.29008: done extending task lists 44071 1727204688.29010: done processing included files 44071 1727204688.29010: results queue empty 44071 1727204688.29011: checking for any_errors_fatal 44071 1727204688.29016: done checking for any_errors_fatal 44071 1727204688.29017: checking for max_fail_percentage 44071 1727204688.29018: done checking for max_fail_percentage 44071 1727204688.29019: checking to see if all hosts have failed and the running result is not ok 44071 1727204688.29026: done checking to see if all hosts have failed 44071 1727204688.29027: getting the remaining hosts for this loop 44071 1727204688.29028: done getting the remaining hosts for this loop 44071 1727204688.29031: getting the next task for host managed-node2 44071 1727204688.29039: done getting next task for host managed-node2 44071 1727204688.29042: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204688.29046: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204688.29060: getting variables 44071 1727204688.29061: in VariableManager get_vars() 44071 1727204688.29081: Calling all_inventory to load vars for managed-node2 44071 1727204688.29084: Calling groups_inventory to load vars for managed-node2 44071 1727204688.29086: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204688.29093: Calling all_plugins_play to load vars for managed-node2 44071 1727204688.29095: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204688.29098: Calling groups_plugins_play to load vars for managed-node2 44071 1727204688.30825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204688.33883: done with get_vars() 44071 1727204688.33936: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:04:48 -0400 (0:00:00.169) 0:01:40.656 ***** 44071 1727204688.34041: entering _queue_task() for managed-node2/include_tasks 44071 1727204688.34578: worker is 1 (out of 1 available) 44071 1727204688.34592: exiting _queue_task() for managed-node2/include_tasks 44071 1727204688.34605: done queuing things up, now waiting for results queue to drain 44071 1727204688.34607: waiting for pending results... 44071 1727204688.34973: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204688.35213: in run() - task 127b8e07-fff9-c964-7471-0000000019bf 44071 1727204688.35304: variable 'ansible_search_path' from source: unknown 44071 1727204688.35308: variable 'ansible_search_path' from source: unknown 44071 1727204688.35311: calling self._execute() 44071 1727204688.35453: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204688.35468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204688.35486: variable 'omit' from source: magic vars 44071 1727204688.35970: variable 'ansible_distribution_major_version' from source: facts 44071 1727204688.36062: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204688.36070: _execute() done 44071 1727204688.36073: dumping result to json 44071 1727204688.36075: done dumping result, returning 44071 1727204688.36078: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-c964-7471-0000000019bf] 44071 1727204688.36080: sending task result for task 127b8e07-fff9-c964-7471-0000000019bf 44071 1727204688.36351: no more pending results, returning what we have 44071 1727204688.36359: in VariableManager get_vars() 44071 1727204688.36512: Calling all_inventory to load vars for managed-node2 44071 1727204688.36517: Calling groups_inventory to load vars for managed-node2 44071 1727204688.36520: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204688.36542: Calling all_plugins_play to load vars for managed-node2 44071 1727204688.36547: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204688.36551: Calling groups_plugins_play to load vars for managed-node2 44071 1727204688.37134: done sending task result for task 127b8e07-fff9-c964-7471-0000000019bf 44071 1727204688.37144: WORKER PROCESS EXITING 44071 1727204688.39348: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204688.40749: done with get_vars() 44071 1727204688.40784: variable 'ansible_search_path' from source: unknown 44071 1727204688.40785: variable 'ansible_search_path' from source: unknown 44071 1727204688.40824: we have included files to process 44071 1727204688.40825: generating all_blocks data 44071 1727204688.40827: done generating all_blocks data 44071 1727204688.40830: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204688.40831: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204688.40833: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204688.41287: done processing included file 44071 1727204688.41289: iterating over new_blocks loaded from include file 44071 1727204688.41290: in VariableManager get_vars() 44071 1727204688.41312: done with get_vars() 44071 1727204688.41313: filtering new block on tags 44071 1727204688.41341: done filtering new block on tags 44071 1727204688.41344: in VariableManager get_vars() 44071 1727204688.41367: done with get_vars() 44071 1727204688.41369: filtering new block on tags 44071 1727204688.41402: done filtering new block on tags 44071 1727204688.41404: in VariableManager get_vars() 44071 1727204688.41419: done with get_vars() 44071 1727204688.41420: filtering new block on tags 44071 1727204688.41452: done filtering new block on tags 44071 1727204688.41454: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 44071 1727204688.41459: extending task lists for all hosts with included blocks 44071 1727204688.43025: done extending task lists 44071 1727204688.43027: done processing included files 44071 1727204688.43028: results queue empty 44071 1727204688.43029: checking for any_errors_fatal 44071 1727204688.43035: done checking for any_errors_fatal 44071 1727204688.43036: checking for max_fail_percentage 44071 1727204688.43037: done checking for max_fail_percentage 44071 1727204688.43038: checking to see if all hosts have failed and the running result is not ok 44071 1727204688.43039: done checking to see if all hosts have failed 44071 1727204688.43039: getting the remaining hosts for this loop 44071 1727204688.43041: done getting the remaining hosts for this loop 44071 1727204688.43044: getting the next task for host managed-node2 44071 1727204688.43049: done getting next task for host managed-node2 44071 1727204688.43052: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204688.43057: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204688.43072: getting variables 44071 1727204688.43074: in VariableManager get_vars() 44071 1727204688.43094: Calling all_inventory to load vars for managed-node2 44071 1727204688.43096: Calling groups_inventory to load vars for managed-node2 44071 1727204688.43099: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204688.43105: Calling all_plugins_play to load vars for managed-node2 44071 1727204688.43108: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204688.43111: Calling groups_plugins_play to load vars for managed-node2 44071 1727204688.44481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204688.45731: done with get_vars() 44071 1727204688.45764: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:04:48 -0400 (0:00:00.117) 0:01:40.774 ***** 44071 1727204688.45841: entering _queue_task() for managed-node2/setup 44071 1727204688.46163: worker is 1 (out of 1 available) 44071 1727204688.46179: exiting _queue_task() for managed-node2/setup 44071 1727204688.46194: done queuing things up, now waiting for results queue to drain 44071 1727204688.46196: waiting for pending results... 
44071 1727204688.46406: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204688.46541: in run() - task 127b8e07-fff9-c964-7471-000000001a16 44071 1727204688.46573: variable 'ansible_search_path' from source: unknown 44071 1727204688.46580: variable 'ansible_search_path' from source: unknown 44071 1727204688.46626: calling self._execute() 44071 1727204688.46973: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204688.46977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204688.46981: variable 'omit' from source: magic vars 44071 1727204688.47244: variable 'ansible_distribution_major_version' from source: facts 44071 1727204688.47268: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204688.47568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204688.49385: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204688.49451: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204688.49485: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204688.49517: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204688.49539: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204688.49611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204688.49638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204688.49658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204688.49689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204688.49700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204688.49748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204688.49767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204688.49785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204688.49812: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204688.49825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204688.49955: variable '__network_required_facts' from source: role '' defaults 44071 1727204688.49964: variable 'ansible_facts' from source: unknown 44071 1727204688.50712: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 44071 1727204688.50718: when evaluation is False, skipping this task 44071 1727204688.50720: _execute() done 44071 1727204688.50722: dumping result to json 44071 1727204688.50725: done dumping result, returning 44071 1727204688.50728: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-c964-7471-000000001a16] 44071 1727204688.50738: sending task result for task 127b8e07-fff9-c964-7471-000000001a16 44071 1727204688.50840: done sending task result for task 127b8e07-fff9-c964-7471-000000001a16 44071 1727204688.50843: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204688.50893: no more pending results, returning what we have 44071 1727204688.50898: results queue empty 44071 1727204688.50899: checking for any_errors_fatal 44071 1727204688.50901: done checking for any_errors_fatal 44071 1727204688.50902: checking for max_fail_percentage 44071 1727204688.50904: done checking for max_fail_percentage 44071 1727204688.50905: checking to see if all hosts have failed and the running result is not ok 44071 1727204688.50905: done checking to see if all hosts have failed 44071 1727204688.50906: getting the remaining hosts for this loop 44071 1727204688.50908: done getting the remaining hosts for this loop 44071 1727204688.50913: getting the next task for host managed-node2 44071 1727204688.50926: done getting next task for host managed-node2 44071 1727204688.50930: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204688.50939: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204688.50969: getting variables 44071 1727204688.50971: in VariableManager get_vars() 44071 1727204688.51017: Calling all_inventory to load vars for managed-node2 44071 1727204688.51020: Calling groups_inventory to load vars for managed-node2 44071 1727204688.51022: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204688.51035: Calling all_plugins_play to load vars for managed-node2 44071 1727204688.51038: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204688.51047: Calling groups_plugins_play to load vars for managed-node2 44071 1727204688.52285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204688.53539: done with get_vars() 44071 1727204688.53575: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:04:48 -0400 (0:00:00.078) 0:01:40.853 ***** 44071 1727204688.53662: entering _queue_task() for managed-node2/stat 44071 1727204688.53972: worker is 1 (out of 1 available) 44071 1727204688.53989: exiting _queue_task() for managed-node2/stat 44071 1727204688.54004: done queuing things up, now waiting for results queue to drain 44071 1727204688.54006: waiting for pending results... 44071 1727204688.54230: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204688.54354: in run() - task 127b8e07-fff9-c964-7471-000000001a18 44071 1727204688.54371: variable 'ansible_search_path' from source: unknown 44071 1727204688.54375: variable 'ansible_search_path' from source: unknown 44071 1727204688.54411: calling self._execute() 44071 1727204688.54516: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204688.54520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204688.54530: variable 'omit' from source: magic vars 44071 1727204688.54862: variable 'ansible_distribution_major_version' from source: facts 44071 1727204688.54875: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204688.55014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204688.55382: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204688.55386: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204688.55392: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204688.55437: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204688.55553: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204688.55605: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
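The stat task queued here, "Check if system is ostree" (set_facts.yml:12), is guarded by not __network_is_ostree is defined; because that fact was already set earlier in this run, the guard evaluates to False just below and the stat never reaches the remote host. A rough Python rendering of the guard (the fact's value here is hypothetical, and the file the stat would examine is not shown in this log):

    # Emulates the `when: not __network_is_ostree is defined` guard.
    host_facts = {"__network_is_ostree": False}  # hypothetical value from an earlier set_fact

    should_run_stat = "__network_is_ostree" not in host_facts
    print(should_run_stat)  # False -> task is skipped with a false_condition in its result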
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204688.55643: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204688.55681: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204688.55824: variable '__network_is_ostree' from source: set_fact 44071 1727204688.55841: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204688.55851: when evaluation is False, skipping this task 44071 1727204688.55859: _execute() done 44071 1727204688.55871: dumping result to json 44071 1727204688.55881: done dumping result, returning 44071 1727204688.55895: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-c964-7471-000000001a18] 44071 1727204688.55906: sending task result for task 127b8e07-fff9-c964-7471-000000001a18 44071 1727204688.56171: done sending task result for task 127b8e07-fff9-c964-7471-000000001a18 44071 1727204688.56175: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204688.56243: no more pending results, returning what we have 44071 1727204688.56248: results queue empty 44071 1727204688.56250: checking for any_errors_fatal 44071 1727204688.56260: done checking for any_errors_fatal 44071 1727204688.56261: checking for max_fail_percentage 44071 1727204688.56263: done checking for max_fail_percentage 44071 1727204688.56264: checking to see if all hosts have failed and the running result is not ok 44071 1727204688.56267: done checking to see if all hosts have failed 44071 1727204688.56268: getting the remaining hosts for this loop 44071 1727204688.56270: done getting the remaining hosts for this loop 44071 1727204688.56277: getting the next task for host managed-node2 44071 1727204688.56291: done getting next task for host managed-node2 44071 1727204688.56296: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204688.56304: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204688.56334: getting variables 44071 1727204688.56337: in VariableManager get_vars() 44071 1727204688.56519: Calling all_inventory to load vars for managed-node2 44071 1727204688.56522: Calling groups_inventory to load vars for managed-node2 44071 1727204688.56525: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204688.56538: Calling all_plugins_play to load vars for managed-node2 44071 1727204688.56542: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204688.56547: Calling groups_plugins_play to load vars for managed-node2 44071 1727204688.58839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204688.61403: done with get_vars() 44071 1727204688.61437: done getting variables 44071 1727204688.61494: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:04:48 -0400 (0:00:00.078) 0:01:40.931 ***** 44071 1727204688.61526: entering _queue_task() for managed-node2/set_fact 44071 1727204688.61829: worker is 1 (out of 1 available) 44071 1727204688.61844: exiting _queue_task() for managed-node2/set_fact 44071 1727204688.61860: done queuing things up, now waiting for results queue to drain 44071 1727204688.61861: waiting for pending results... 
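The skipped tasks in this stretch all report the same result shape: "changed": false plus "false_condition" and "skip_reason" keys (the earlier setup skip additionally had its payload censored because of no_log). A small sketch of filtering such per-task result dicts, for example when post-processing a JSON callback dump; the list literal is a hypothetical reconstruction of what this log prints:

    results = [
        {"task": "Check if system is ostree",
         "result": {"changed": False,
                    "false_condition": "not __network_is_ostree is defined",
                    "skip_reason": "Conditional result was False"}},
        {"task": "Check which services are running",
         "result": {"changed": False, "ansible_facts": {"services": {}}}},
    ]

    # A task was skipped iff its result carries a skip_reason.
    skipped = [r["task"] for r in results if "skip_reason" in r["result"]]
    print(skipped)  # ['Check if system is ostree']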
44071 1727204688.62084: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204688.62217: in run() - task 127b8e07-fff9-c964-7471-000000001a19 44071 1727204688.62229: variable 'ansible_search_path' from source: unknown 44071 1727204688.62233: variable 'ansible_search_path' from source: unknown 44071 1727204688.62271: calling self._execute() 44071 1727204688.62361: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204688.62367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204688.62377: variable 'omit' from source: magic vars 44071 1727204688.62704: variable 'ansible_distribution_major_version' from source: facts 44071 1727204688.62715: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204688.62857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204688.63086: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204688.63124: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204688.63171: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204688.63370: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204688.63374: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204688.63377: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204688.63432: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204688.63467: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204688.63773: variable '__network_is_ostree' from source: set_fact 44071 1727204688.63776: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204688.63778: when evaluation is False, skipping this task 44071 1727204688.63780: _execute() done 44071 1727204688.63783: dumping result to json 44071 1727204688.63785: done dumping result, returning 44071 1727204688.63788: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-c964-7471-000000001a19] 44071 1727204688.63789: sending task result for task 127b8e07-fff9-c964-7471-000000001a19 skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204688.64220: no more pending results, returning what we have 44071 1727204688.64228: results queue empty 44071 1727204688.64229: checking for any_errors_fatal 44071 1727204688.64238: done checking for any_errors_fatal 44071 1727204688.64239: checking for max_fail_percentage 44071 1727204688.64241: done checking for max_fail_percentage 44071 1727204688.64242: checking to see 
if all hosts have failed and the running result is not ok 44071 1727204688.64243: done checking to see if all hosts have failed 44071 1727204688.64244: getting the remaining hosts for this loop 44071 1727204688.64246: done getting the remaining hosts for this loop 44071 1727204688.64252: getting the next task for host managed-node2 44071 1727204688.64268: done getting next task for host managed-node2 44071 1727204688.64273: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204688.64281: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204688.64312: getting variables 44071 1727204688.64314: in VariableManager get_vars() 44071 1727204688.64619: Calling all_inventory to load vars for managed-node2 44071 1727204688.64623: Calling groups_inventory to load vars for managed-node2 44071 1727204688.64626: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204688.64639: Calling all_plugins_play to load vars for managed-node2 44071 1727204688.64642: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204688.64645: Calling groups_plugins_play to load vars for managed-node2 44071 1727204688.65478: done sending task result for task 127b8e07-fff9-c964-7471-000000001a19 44071 1727204688.65482: WORKER PROCESS EXITING 44071 1727204688.66832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204688.68829: done with get_vars() 44071 1727204688.68867: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:04:48 -0400 (0:00:00.074) 0:01:41.005 ***** 44071 1727204688.68952: entering _queue_task() for managed-node2/service_facts 44071 1727204688.69267: worker is 1 (out of 1 available) 44071 1727204688.69284: exiting _queue_task() for managed-node2/service_facts 44071 1727204688.69299: done queuing things up, now waiting for results queue to drain 44071 1727204688.69301: waiting for pending results... 
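The "Check which services are running" task (set_facts.yml:21) is the first one in this stretch that actually executes: the service_facts module is shipped to managed-node2 over the existing SSH control socket, and the stdout further below returns ansible_facts.services, a dictionary keyed by unit name with name/state/status/source fields. A short Python sketch of reading that structure, for instance to check whether NetworkManager is running (a plausible use for a network role, though this log does not show how the facts are consumed); the two entries are copied from the output below:

    # Shape of the service_facts payload returned later in this log
    # (only two of the many units are reproduced here).
    services = {
        "NetworkManager.service": {"name": "NetworkManager.service", "state": "running",
                                   "status": "enabled", "source": "systemd"},
        "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped",
                                     "status": "disabled", "source": "systemd"},
    }

    def is_running(unit):
        # True when the unit exists in the facts and reports state == "running".
        return services.get(unit, {}).get("state") == "running"

    print(is_running("NetworkManager.service"))    # True
    print(is_running("systemd-networkd.service"))  # False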
44071 1727204688.69527: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204688.69645: in run() - task 127b8e07-fff9-c964-7471-000000001a1b 44071 1727204688.69657: variable 'ansible_search_path' from source: unknown 44071 1727204688.69661: variable 'ansible_search_path' from source: unknown 44071 1727204688.69702: calling self._execute() 44071 1727204688.69796: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204688.69801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204688.69812: variable 'omit' from source: magic vars 44071 1727204688.70173: variable 'ansible_distribution_major_version' from source: facts 44071 1727204688.70183: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204688.70207: variable 'omit' from source: magic vars 44071 1727204688.70380: variable 'omit' from source: magic vars 44071 1727204688.70384: variable 'omit' from source: magic vars 44071 1727204688.70387: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204688.70452: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204688.70455: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204688.70470: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204688.70485: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204688.70628: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204688.70631: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204688.70634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204688.70636: Set connection var ansible_connection to ssh 44071 1727204688.70647: Set connection var ansible_timeout to 10 44071 1727204688.70654: Set connection var ansible_pipelining to False 44071 1727204688.70660: Set connection var ansible_shell_type to sh 44071 1727204688.70668: Set connection var ansible_shell_executable to /bin/sh 44071 1727204688.70676: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204688.70705: variable 'ansible_shell_executable' from source: unknown 44071 1727204688.70708: variable 'ansible_connection' from source: unknown 44071 1727204688.70711: variable 'ansible_module_compression' from source: unknown 44071 1727204688.70714: variable 'ansible_shell_type' from source: unknown 44071 1727204688.70717: variable 'ansible_shell_executable' from source: unknown 44071 1727204688.70719: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204688.70721: variable 'ansible_pipelining' from source: unknown 44071 1727204688.70732: variable 'ansible_timeout' from source: unknown 44071 1727204688.70739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204688.71073: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204688.71080: variable 'omit' from source: magic vars 44071 
1727204688.71083: starting attempt loop 44071 1727204688.71085: running the handler 44071 1727204688.71088: _low_level_execute_command(): starting 44071 1727204688.71090: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204688.71762: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204688.71776: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204688.71793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204688.71804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204688.71830: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204688.71836: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204688.71840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204688.71906: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204688.71913: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204688.71915: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204688.71994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204688.73763: stdout chunk (state=3): >>>/root <<< 44071 1727204688.73874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204688.73971: stderr chunk (state=3): >>><<< 44071 1727204688.73975: stdout chunk (state=3): >>><<< 44071 1727204688.73990: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204688.74031: _low_level_execute_command(): 
starting 44071 1727204688.74037: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204688.739868-50037-235743758757950 `" && echo ansible-tmp-1727204688.739868-50037-235743758757950="` echo /root/.ansible/tmp/ansible-tmp-1727204688.739868-50037-235743758757950 `" ) && sleep 0' 44071 1727204688.74753: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204688.74870: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204688.74875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204688.74887: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204688.74890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204688.74893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204688.74895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204688.74922: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204688.74987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204688.75070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204688.77061: stdout chunk (state=3): >>>ansible-tmp-1727204688.739868-50037-235743758757950=/root/.ansible/tmp/ansible-tmp-1727204688.739868-50037-235743758757950 <<< 44071 1727204688.77194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204688.77247: stderr chunk (state=3): >>><<< 44071 1727204688.77250: stdout chunk (state=3): >>><<< 44071 1727204688.77280: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204688.739868-50037-235743758757950=/root/.ansible/tmp/ansible-tmp-1727204688.739868-50037-235743758757950 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204688.77343: variable 'ansible_module_compression' from source: unknown 44071 1727204688.77395: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 44071 1727204688.77462: variable 'ansible_facts' from source: unknown 44071 1727204688.77521: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204688.739868-50037-235743758757950/AnsiballZ_service_facts.py 44071 1727204688.77653: Sending initial data 44071 1727204688.77656: Sent initial data (161 bytes) 44071 1727204688.78153: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204688.78158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204688.78189: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204688.78194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204688.78257: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204688.78261: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204688.78263: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204688.78346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204688.79957: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204688.80025: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204688.80096: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpf73n3zu6 /root/.ansible/tmp/ansible-tmp-1727204688.739868-50037-235743758757950/AnsiballZ_service_facts.py <<< 44071 1727204688.80103: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204688.739868-50037-235743758757950/AnsiballZ_service_facts.py" <<< 44071 1727204688.80168: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpf73n3zu6" to remote "/root/.ansible/tmp/ansible-tmp-1727204688.739868-50037-235743758757950/AnsiballZ_service_facts.py" <<< 44071 1727204688.80171: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204688.739868-50037-235743758757950/AnsiballZ_service_facts.py" <<< 44071 1727204688.80996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204688.81002: stderr chunk (state=3): >>><<< 44071 1727204688.81006: stdout chunk (state=3): >>><<< 44071 1727204688.81035: done transferring module to remote 44071 1727204688.81046: _low_level_execute_command(): starting 44071 1727204688.81052: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204688.739868-50037-235743758757950/ /root/.ansible/tmp/ansible-tmp-1727204688.739868-50037-235743758757950/AnsiballZ_service_facts.py && sleep 0' 44071 1727204688.82237: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204688.82241: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204688.82420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204688.82476: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204688.82557: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204688.84412: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204688.84476: stderr chunk (state=3): >>><<< 44071 1727204688.84480: stdout chunk (state=3): >>><<< 44071 1727204688.84495: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204688.84499: _low_level_execute_command(): starting 44071 1727204688.84502: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204688.739868-50037-235743758757950/AnsiballZ_service_facts.py && sleep 0' 44071 1727204688.85022: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204688.85026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204688.85029: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204688.85031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204688.85092: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204688.85096: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204688.85177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204691.06323: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": 
{"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedes<<< 44071 1727204691.06339: stdout chunk (state=3): >>>ktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": 
"systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 44071 1727204691.07958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204691.08030: stderr chunk (state=3): >>><<< 44071 1727204691.08037: stdout chunk (state=3): >>><<< 44071 1727204691.08058: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": 
"emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": 
{"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", 
"state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": 
{"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
44071 1727204691.08689: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204688.739868-50037-235743758757950/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204691.08695: _low_level_execute_command(): starting 44071 1727204691.08700: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204688.739868-50037-235743758757950/ > /dev/null 2>&1 && sleep 0' 44071 1727204691.09345: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204691.09349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204691.09368: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204691.09421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204691.09521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204691.11423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204691.11484: stderr chunk (state=3): >>><<< 44071 1727204691.11490: stdout chunk (state=3): >>><<< 44071 1727204691.11505: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204691.11513: handler run complete 44071 1727204691.11667: variable 'ansible_facts' from source: unknown 44071 1727204691.11961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204691.12317: variable 'ansible_facts' from source: unknown 44071 1727204691.12426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204691.12589: attempt loop complete, returning result 44071 1727204691.12594: _execute() done 44071 1727204691.12597: dumping result to json 44071 1727204691.12643: done dumping result, returning 44071 1727204691.12652: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-c964-7471-000000001a1b] 44071 1727204691.12656: sending task result for task 127b8e07-fff9-c964-7471-000000001a1b 44071 1727204691.14429: done sending task result for task 127b8e07-fff9-c964-7471-000000001a1b 44071 1727204691.14437: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204691.14564: no more pending results, returning what we have 44071 1727204691.14570: results queue empty 44071 1727204691.14571: checking for any_errors_fatal 44071 1727204691.14576: done checking for any_errors_fatal 44071 1727204691.14577: checking for max_fail_percentage 44071 1727204691.14578: done checking for max_fail_percentage 44071 1727204691.14579: checking to see if all hosts have failed and the running result is not ok 44071 1727204691.14586: done checking to see if all hosts have failed 44071 1727204691.14587: getting the remaining hosts for this loop 44071 1727204691.14588: done getting the remaining hosts for this loop 44071 1727204691.14592: getting the next task for host managed-node2 44071 1727204691.14599: done getting next task for host managed-node2 44071 1727204691.14602: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204691.14608: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204691.14621: getting variables 44071 1727204691.14622: in VariableManager get_vars() 44071 1727204691.14658: Calling all_inventory to load vars for managed-node2 44071 1727204691.14661: Calling groups_inventory to load vars for managed-node2 44071 1727204691.14663: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204691.14674: Calling all_plugins_play to load vars for managed-node2 44071 1727204691.14677: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204691.14679: Calling groups_plugins_play to load vars for managed-node2 44071 1727204691.16627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204691.19059: done with get_vars() 44071 1727204691.19112: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:04:51 -0400 (0:00:02.502) 0:01:43.508 ***** 44071 1727204691.19252: entering _queue_task() for managed-node2/package_facts 44071 1727204691.19801: worker is 1 (out of 1 available) 44071 1727204691.19920: exiting _queue_task() for managed-node2/package_facts 44071 1727204691.19939: done queuing things up, now waiting for results queue to drain 44071 1727204691.19941: waiting for pending results... 44071 1727204691.20158: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204691.20374: in run() - task 127b8e07-fff9-c964-7471-000000001a1c 44071 1727204691.20399: variable 'ansible_search_path' from source: unknown 44071 1727204691.20409: variable 'ansible_search_path' from source: unknown 44071 1727204691.20461: calling self._execute() 44071 1727204691.20588: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204691.20612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204691.20627: variable 'omit' from source: magic vars 44071 1727204691.21123: variable 'ansible_distribution_major_version' from source: facts 44071 1727204691.21262: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204691.21268: variable 'omit' from source: magic vars 44071 1727204691.21284: variable 'omit' from source: magic vars 44071 1727204691.21329: variable 'omit' from source: magic vars 44071 1727204691.21399: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204691.21447: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204691.21490: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204691.21515: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204691.21536: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204691.21581: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204691.21599: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204691.21608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204691.21741: Set connection var ansible_connection to ssh 44071 1727204691.21754: Set connection var ansible_timeout to 10 44071 1727204691.21764: Set connection var ansible_pipelining to False 44071 1727204691.21777: Set connection var ansible_shell_type to sh 44071 1727204691.21788: Set connection var ansible_shell_executable to /bin/sh 44071 1727204691.21810: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204691.21849: variable 'ansible_shell_executable' from source: unknown 44071 1727204691.21870: variable 'ansible_connection' from source: unknown 44071 1727204691.21874: variable 'ansible_module_compression' from source: unknown 44071 1727204691.21876: variable 'ansible_shell_type' from source: unknown 44071 1727204691.21878: variable 'ansible_shell_executable' from source: unknown 44071 1727204691.21909: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204691.21913: variable 'ansible_pipelining' from source: unknown 44071 1727204691.21921: variable 'ansible_timeout' from source: unknown 44071 1727204691.21923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204691.22188: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204691.22203: variable 'omit' from source: magic vars 44071 1727204691.22207: starting attempt loop 44071 1727204691.22210: running the handler 44071 1727204691.22224: _low_level_execute_command(): starting 44071 1727204691.22231: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204691.22820: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204691.22826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204691.22830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204691.22883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204691.22891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204691.22893: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204691.22964: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 
1727204691.24637: stdout chunk (state=3): >>>/root <<< 44071 1727204691.24728: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204691.24879: stderr chunk (state=3): >>><<< 44071 1727204691.24884: stdout chunk (state=3): >>><<< 44071 1727204691.24888: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204691.24892: _low_level_execute_command(): starting 44071 1727204691.24902: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204691.2487638-50131-75012780388538 `" && echo ansible-tmp-1727204691.2487638-50131-75012780388538="` echo /root/.ansible/tmp/ansible-tmp-1727204691.2487638-50131-75012780388538 `" ) && sleep 0' 44071 1727204691.25664: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204691.25673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204691.25685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204691.25820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204691.25838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204691.25901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204691.27900: stdout chunk (state=3): 
>>>ansible-tmp-1727204691.2487638-50131-75012780388538=/root/.ansible/tmp/ansible-tmp-1727204691.2487638-50131-75012780388538 <<< 44071 1727204691.28003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204691.28067: stderr chunk (state=3): >>><<< 44071 1727204691.28073: stdout chunk (state=3): >>><<< 44071 1727204691.28089: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204691.2487638-50131-75012780388538=/root/.ansible/tmp/ansible-tmp-1727204691.2487638-50131-75012780388538 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204691.28134: variable 'ansible_module_compression' from source: unknown 44071 1727204691.28181: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 44071 1727204691.28242: variable 'ansible_facts' from source: unknown 44071 1727204691.28369: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204691.2487638-50131-75012780388538/AnsiballZ_package_facts.py 44071 1727204691.28501: Sending initial data 44071 1727204691.28504: Sent initial data (161 bytes) 44071 1727204691.29127: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204691.29158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204691.29262: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 44071 1727204691.30886: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204691.30961: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204691.31045: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmprzkp7z5q /root/.ansible/tmp/ansible-tmp-1727204691.2487638-50131-75012780388538/AnsiballZ_package_facts.py <<< 44071 1727204691.31049: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204691.2487638-50131-75012780388538/AnsiballZ_package_facts.py" <<< 44071 1727204691.31130: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmprzkp7z5q" to remote "/root/.ansible/tmp/ansible-tmp-1727204691.2487638-50131-75012780388538/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204691.2487638-50131-75012780388538/AnsiballZ_package_facts.py" <<< 44071 1727204691.32874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204691.32960: stderr chunk (state=3): >>><<< 44071 1727204691.32964: stdout chunk (state=3): >>><<< 44071 1727204691.33071: done transferring module to remote 44071 1727204691.33074: _low_level_execute_command(): starting 44071 1727204691.33077: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204691.2487638-50131-75012780388538/ /root/.ansible/tmp/ansible-tmp-1727204691.2487638-50131-75012780388538/AnsiballZ_package_facts.py && sleep 0' 44071 1727204691.33741: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204691.33748: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204691.33790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204691.33803: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204691.33896: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204691.33944: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204691.34027: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204691.36028: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204691.36032: stdout chunk (state=3): >>><<< 44071 1727204691.36035: stderr chunk (state=3): >>><<< 44071 1727204691.36141: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204691.36147: _low_level_execute_command(): starting 44071 1727204691.36149: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204691.2487638-50131-75012780388538/AnsiballZ_package_facts.py && sleep 0' 44071 1727204691.36827: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204691.36832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204691.36915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204691.99695: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": 
"cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"na<<< 44071 1727204691.99723: stdout chunk (state=3): >>>me": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 44071 1727204691.99728: stdout chunk (state=3): >>>systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": 
[{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40",<<< 44071 1727204691.99755: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": 
"libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-l<<< 44071 1727204691.99769: stdout chunk (state=3): >>>ibs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lib<<< 44071 1727204691.99801: stdout chunk (state=3): >>>xmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1"<<< 44071 1727204691.99816: stdout chunk (state=3): >>>, "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": 
"1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_<<< 44071 1727204691.99842: stdout chunk (state=3): >>>64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": 
[{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": 
"cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "<<< 44071 1727204691.99848: stdout chunk (state=3): >>>rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarc<<< 44071 1727204691.99877: stdout chunk (state=3): >>>h", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", 
"release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}]<<< 44071 1727204691.99888: stdout chunk (state=3): >>>, "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", 
"version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50<<< 44071 1727204691.99920: stdout chunk (state=3): >>>, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": 
[{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "s<<< 44071 1727204691.99941: stdout chunk (state=3): >>>ource": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": 
"python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 44071 1727204692.01749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204692.01818: stderr chunk (state=3): >>><<< 44071 1727204692.01822: stdout chunk (state=3): >>><<< 44071 1727204692.01871: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": 
"13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": 
"1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", 
"release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": 
"libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": 
"tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": 
"2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", 
"version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": 
"8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": 
"2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
44071 1727204692.03765: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204691.2487638-50131-75012780388538/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204692.03792: _low_level_execute_command(): starting 44071 1727204692.03796: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204691.2487638-50131-75012780388538/ > /dev/null 2>&1 && sleep 0' 44071 1727204692.04294: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204692.04299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204692.04302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204692.04358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204692.04362: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204692.04445: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204692.06378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204692.06437: stderr chunk (state=3): >>><<< 44071 1727204692.06440: stdout chunk (state=3): >>><<< 44071 1727204692.06456: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204692.06463: handler run complete 44071 1727204692.07137: variable 'ansible_facts' from source: unknown 44071 1727204692.14296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204692.15836: variable 'ansible_facts' from source: unknown 44071 1727204692.16192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204692.16770: attempt loop complete, returning result 44071 1727204692.16786: _execute() done 44071 1727204692.16789: dumping result to json 44071 1727204692.16952: done dumping result, returning 44071 1727204692.16962: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-c964-7471-000000001a1c] 44071 1727204692.16965: sending task result for task 127b8e07-fff9-c964-7471-000000001a1c 44071 1727204692.24638: done sending task result for task 127b8e07-fff9-c964-7471-000000001a1c 44071 1727204692.24642: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204692.24703: no more pending results, returning what we have 44071 1727204692.24707: results queue empty 44071 1727204692.24708: checking for any_errors_fatal 44071 1727204692.24713: done checking for any_errors_fatal 44071 1727204692.24714: checking for max_fail_percentage 44071 1727204692.24714: done checking for max_fail_percentage 44071 1727204692.24715: checking to see if all hosts have failed and the running result is not ok 44071 1727204692.24716: done checking to see if all hosts have failed 44071 1727204692.24716: getting the remaining hosts for this loop 44071 1727204692.24717: done getting the remaining hosts for this loop 44071 1727204692.24720: getting the next task for host managed-node2 44071 1727204692.24725: done getting next task for host managed-node2 44071 1727204692.24727: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204692.24735: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 44071 1727204692.24744: getting variables 44071 1727204692.24745: in VariableManager get_vars() 44071 1727204692.24762: Calling all_inventory to load vars for managed-node2 44071 1727204692.24764: Calling groups_inventory to load vars for managed-node2 44071 1727204692.24767: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204692.24772: Calling all_plugins_play to load vars for managed-node2 44071 1727204692.24774: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204692.24776: Calling groups_plugins_play to load vars for managed-node2 44071 1727204692.25704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204692.26981: done with get_vars() 44071 1727204692.27015: done getting variables 44071 1727204692.27060: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:04:52 -0400 (0:00:01.078) 0:01:44.587 ***** 44071 1727204692.27094: entering _queue_task() for managed-node2/debug 44071 1727204692.27415: worker is 1 (out of 1 available) 44071 1727204692.27435: exiting _queue_task() for managed-node2/debug 44071 1727204692.27449: done queuing things up, now waiting for results queue to drain 44071 1727204692.27451: waiting for pending results... 
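The censored result above comes from the role's "Check which packages are installed" step: the module invocation shows `package_facts` running with `_ansible_no_log: True`, so even at this verbosity only the censoring notice is printed. A minimal sketch of a task that would produce that behaviour, assuming only what the log shows (module name, task name, `no_log: true`); the `manager` argument is an assumption:

```yaml
# Sketch only: reproduces the "censored" result seen above.
# package_facts, the task name, and no_log: true are visible in the log;
# the manager choice is an assumption.
- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto          # assumed; the log does not show module arguments
  no_log: true             # yields "the output has been hidden ..." in the result
```

The gathered package list still lands in `ansible_facts.packages` for later conditionals; `no_log` only suppresses what gets written to the result output.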
44071 1727204692.27669: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204692.27787: in run() - task 127b8e07-fff9-c964-7471-0000000019c0 44071 1727204692.27806: variable 'ansible_search_path' from source: unknown 44071 1727204692.27809: variable 'ansible_search_path' from source: unknown 44071 1727204692.27843: calling self._execute() 44071 1727204692.27936: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204692.27940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204692.27948: variable 'omit' from source: magic vars 44071 1727204692.28301: variable 'ansible_distribution_major_version' from source: facts 44071 1727204692.28313: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204692.28319: variable 'omit' from source: magic vars 44071 1727204692.28374: variable 'omit' from source: magic vars 44071 1727204692.28464: variable 'network_provider' from source: set_fact 44071 1727204692.28481: variable 'omit' from source: magic vars 44071 1727204692.28518: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204692.28550: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204692.28577: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204692.28592: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204692.28603: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204692.28628: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204692.28631: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204692.28637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204692.28720: Set connection var ansible_connection to ssh 44071 1727204692.28726: Set connection var ansible_timeout to 10 44071 1727204692.28735: Set connection var ansible_pipelining to False 44071 1727204692.28739: Set connection var ansible_shell_type to sh 44071 1727204692.28745: Set connection var ansible_shell_executable to /bin/sh 44071 1727204692.28751: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204692.28775: variable 'ansible_shell_executable' from source: unknown 44071 1727204692.28780: variable 'ansible_connection' from source: unknown 44071 1727204692.28783: variable 'ansible_module_compression' from source: unknown 44071 1727204692.28785: variable 'ansible_shell_type' from source: unknown 44071 1727204692.28790: variable 'ansible_shell_executable' from source: unknown 44071 1727204692.28792: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204692.28795: variable 'ansible_pipelining' from source: unknown 44071 1727204692.28798: variable 'ansible_timeout' from source: unknown 44071 1727204692.28801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204692.28923: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 44071 1727204692.28927: variable 'omit' from source: magic vars 44071 1727204692.28935: starting attempt loop 44071 1727204692.28938: running the handler 44071 1727204692.28979: handler run complete 44071 1727204692.28993: attempt loop complete, returning result 44071 1727204692.28997: _execute() done 44071 1727204692.28999: dumping result to json 44071 1727204692.29002: done dumping result, returning 44071 1727204692.29010: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-c964-7471-0000000019c0] 44071 1727204692.29014: sending task result for task 127b8e07-fff9-c964-7471-0000000019c0 44071 1727204692.29115: done sending task result for task 127b8e07-fff9-c964-7471-0000000019c0 44071 1727204692.29118: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 44071 1727204692.29207: no more pending results, returning what we have 44071 1727204692.29211: results queue empty 44071 1727204692.29212: checking for any_errors_fatal 44071 1727204692.29228: done checking for any_errors_fatal 44071 1727204692.29228: checking for max_fail_percentage 44071 1727204692.29230: done checking for max_fail_percentage 44071 1727204692.29231: checking to see if all hosts have failed and the running result is not ok 44071 1727204692.29234: done checking to see if all hosts have failed 44071 1727204692.29235: getting the remaining hosts for this loop 44071 1727204692.29243: done getting the remaining hosts for this loop 44071 1727204692.29253: getting the next task for host managed-node2 44071 1727204692.29262: done getting next task for host managed-node2 44071 1727204692.29268: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204692.29273: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204692.29287: getting variables 44071 1727204692.29289: in VariableManager get_vars() 44071 1727204692.29330: Calling all_inventory to load vars for managed-node2 44071 1727204692.29335: Calling groups_inventory to load vars for managed-node2 44071 1727204692.29337: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204692.29353: Calling all_plugins_play to load vars for managed-node2 44071 1727204692.29360: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204692.29364: Calling groups_plugins_play to load vars for managed-node2 44071 1727204692.30559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204692.31852: done with get_vars() 44071 1727204692.31889: done getting variables 44071 1727204692.31942: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:04:52 -0400 (0:00:00.048) 0:01:44.636 ***** 44071 1727204692.31984: entering _queue_task() for managed-node2/fail 44071 1727204692.32299: worker is 1 (out of 1 available) 44071 1727204692.32314: exiting _queue_task() for managed-node2/fail 44071 1727204692.32330: done queuing things up, now waiting for results queue to drain 44071 1727204692.32334: waiting for pending results... 
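The "Print network provider" step above resolved `network_provider` (sourced from an earlier `set_fact`, per "variable 'network_provider' from source: set_fact") and emitted "Using network provider: nm". A sketch of a debug task consistent with that output and with the conditional evaluated in the log; the exact message template is an assumption:

```yaml
# Sketch: a debug task consistent with "MSG: Using network provider: nm"
# and the conditional (ansible_distribution_major_version != '6') above.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"   # assumed template
  when: ansible_distribution_major_version != '6'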
44071 1727204692.32559: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204692.32693: in run() - task 127b8e07-fff9-c964-7471-0000000019c1 44071 1727204692.32707: variable 'ansible_search_path' from source: unknown 44071 1727204692.32710: variable 'ansible_search_path' from source: unknown 44071 1727204692.32746: calling self._execute() 44071 1727204692.32842: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204692.32847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204692.32857: variable 'omit' from source: magic vars 44071 1727204692.33192: variable 'ansible_distribution_major_version' from source: facts 44071 1727204692.33204: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204692.33305: variable 'network_state' from source: role '' defaults 44071 1727204692.33314: Evaluated conditional (network_state != {}): False 44071 1727204692.33318: when evaluation is False, skipping this task 44071 1727204692.33321: _execute() done 44071 1727204692.33324: dumping result to json 44071 1727204692.33327: done dumping result, returning 44071 1727204692.33344: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-c964-7471-0000000019c1] 44071 1727204692.33347: sending task result for task 127b8e07-fff9-c964-7471-0000000019c1 44071 1727204692.33458: done sending task result for task 127b8e07-fff9-c964-7471-0000000019c1 44071 1727204692.33462: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204692.33515: no more pending results, returning what we have 44071 1727204692.33519: results queue empty 44071 1727204692.33520: checking for any_errors_fatal 44071 1727204692.33528: done checking for any_errors_fatal 44071 1727204692.33528: checking for max_fail_percentage 44071 1727204692.33530: done checking for max_fail_percentage 44071 1727204692.33531: checking to see if all hosts have failed and the running result is not ok 44071 1727204692.33534: done checking to see if all hosts have failed 44071 1727204692.33535: getting the remaining hosts for this loop 44071 1727204692.33538: done getting the remaining hosts for this loop 44071 1727204692.33543: getting the next task for host managed-node2 44071 1727204692.33554: done getting next task for host managed-node2 44071 1727204692.33559: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204692.33564: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204692.33600: getting variables 44071 1727204692.33601: in VariableManager get_vars() 44071 1727204692.33647: Calling all_inventory to load vars for managed-node2 44071 1727204692.33650: Calling groups_inventory to load vars for managed-node2 44071 1727204692.33652: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204692.33662: Calling all_plugins_play to load vars for managed-node2 44071 1727204692.33665: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204692.33669: Calling groups_plugins_play to load vars for managed-node2 44071 1727204692.34927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204692.36207: done with get_vars() 44071 1727204692.36241: done getting variables 44071 1727204692.36299: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:04:52 -0400 (0:00:00.043) 0:01:44.679 ***** 44071 1727204692.36329: entering _queue_task() for managed-node2/fail 44071 1727204692.36647: worker is 1 (out of 1 available) 44071 1727204692.36663: exiting _queue_task() for managed-node2/fail 44071 1727204692.36682: done queuing things up, now waiting for results queue to drain 44071 1727204692.36684: waiting for pending results... 
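The abort step above was skipped because `network_state`, resolved from the role defaults, is an empty mapping, so `network_state != {}` evaluated False. A minimal sketch of a fail task guarded this way; the failure message is invented for illustration, and whether the version check sits on the task itself or an enclosing block is not visible in the log:

```yaml
# Sketch: a fail task skipped whenever network_state is left at its empty
# default, matching "false_condition": "network_state != {}" above.
- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: "network_state is not supported with the initscripts provider"  # hypothetical wording
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}
```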
44071 1727204692.36899: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204692.37026: in run() - task 127b8e07-fff9-c964-7471-0000000019c2 44071 1727204692.37043: variable 'ansible_search_path' from source: unknown 44071 1727204692.37047: variable 'ansible_search_path' from source: unknown 44071 1727204692.37081: calling self._execute() 44071 1727204692.37171: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204692.37178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204692.37187: variable 'omit' from source: magic vars 44071 1727204692.37523: variable 'ansible_distribution_major_version' from source: facts 44071 1727204692.37535: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204692.37633: variable 'network_state' from source: role '' defaults 44071 1727204692.37645: Evaluated conditional (network_state != {}): False 44071 1727204692.37648: when evaluation is False, skipping this task 44071 1727204692.37651: _execute() done 44071 1727204692.37653: dumping result to json 44071 1727204692.37656: done dumping result, returning 44071 1727204692.37668: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-c964-7471-0000000019c2] 44071 1727204692.37671: sending task result for task 127b8e07-fff9-c964-7471-0000000019c2 44071 1727204692.37777: done sending task result for task 127b8e07-fff9-c964-7471-0000000019c2 44071 1727204692.37781: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204692.37835: no more pending results, returning what we have 44071 1727204692.37839: results queue empty 44071 1727204692.37841: checking for any_errors_fatal 44071 1727204692.37848: done checking for any_errors_fatal 44071 1727204692.37849: checking for max_fail_percentage 44071 1727204692.37851: done checking for max_fail_percentage 44071 1727204692.37852: checking to see if all hosts have failed and the running result is not ok 44071 1727204692.37852: done checking to see if all hosts have failed 44071 1727204692.37853: getting the remaining hosts for this loop 44071 1727204692.37855: done getting the remaining hosts for this loop 44071 1727204692.37859: getting the next task for host managed-node2 44071 1727204692.37872: done getting next task for host managed-node2 44071 1727204692.37876: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204692.37884: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204692.37912: getting variables 44071 1727204692.37914: in VariableManager get_vars() 44071 1727204692.37961: Calling all_inventory to load vars for managed-node2 44071 1727204692.37964: Calling groups_inventory to load vars for managed-node2 44071 1727204692.37975: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204692.37988: Calling all_plugins_play to load vars for managed-node2 44071 1727204692.37991: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204692.37994: Calling groups_plugins_play to load vars for managed-node2 44071 1727204692.39102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204692.40374: done with get_vars() 44071 1727204692.40406: done getting variables 44071 1727204692.40460: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:04:52 -0400 (0:00:00.041) 0:01:44.721 ***** 44071 1727204692.40495: entering _queue_task() for managed-node2/fail 44071 1727204692.40810: worker is 1 (out of 1 available) 44071 1727204692.40827: exiting _queue_task() for managed-node2/fail 44071 1727204692.40844: done queuing things up, now waiting for results queue to drain 44071 1727204692.40846: waiting for pending results... 
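Both "Abort applying the network state configuration ..." skips hinge on the same default: the log resolves `network_state` from "role '' defaults" and it compares equal to `{}`. A sketch of what that default would look like in the role's defaults file; the file path is an assumption:

```yaml
# Sketch of roles/network/defaults/main.yml (path assumed):
# the empty mapping here is what makes `network_state != {}` evaluate False
# and both abort tasks above get skipped.
network_state: {}
```

A caller that passes a non-empty `network_state` would flip those conditionals and make the abort checks run.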
44071 1727204692.41061: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204692.41192: in run() - task 127b8e07-fff9-c964-7471-0000000019c3 44071 1727204692.41206: variable 'ansible_search_path' from source: unknown 44071 1727204692.41210: variable 'ansible_search_path' from source: unknown 44071 1727204692.41245: calling self._execute() 44071 1727204692.41337: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204692.41343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204692.41350: variable 'omit' from source: magic vars 44071 1727204692.41695: variable 'ansible_distribution_major_version' from source: facts 44071 1727204692.41707: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204692.41862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204692.44053: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204692.44107: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204692.44140: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204692.44172: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204692.44194: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204692.44270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204692.44294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204692.44313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204692.44347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204692.44361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204692.44446: variable 'ansible_distribution_major_version' from source: facts 44071 1727204692.44461: Evaluated conditional (ansible_distribution_major_version | int > 9): True 44071 1727204692.44553: variable 'ansible_distribution' from source: facts 44071 1727204692.44559: variable '__network_rh_distros' from source: role '' defaults 44071 1727204692.44570: Evaluated conditional (ansible_distribution in __network_rh_distros): False 44071 1727204692.44574: when evaluation is False, skipping this task 44071 1727204692.44576: _execute() done 44071 1727204692.44580: dumping result to json 44071 1727204692.44583: done dumping result, returning 44071 1727204692.44675: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-c964-7471-0000000019c3] 44071 1727204692.44678: sending task result for task 127b8e07-fff9-c964-7471-0000000019c3 44071 1727204692.44759: done sending task result for task 127b8e07-fff9-c964-7471-0000000019c3 44071 1727204692.44762: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 44071 1727204692.44827: no more pending results, returning what we have 44071 1727204692.44831: results queue empty 44071 1727204692.44832: checking for any_errors_fatal 44071 1727204692.44838: done checking for any_errors_fatal 44071 1727204692.44839: checking for max_fail_percentage 44071 1727204692.44841: done checking for max_fail_percentage 44071 1727204692.44841: checking to see if all hosts have failed and the running result is not ok 44071 1727204692.44842: done checking to see if all hosts have failed 44071 1727204692.44843: getting the remaining hosts for this loop 44071 1727204692.44844: done getting the remaining hosts for this loop 44071 1727204692.44848: getting the next task for host managed-node2 44071 1727204692.44856: done getting next task for host managed-node2 44071 1727204692.44861: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204692.44867: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204692.44890: getting variables 44071 1727204692.44892: in VariableManager get_vars() 44071 1727204692.44932: Calling all_inventory to load vars for managed-node2 44071 1727204692.44934: Calling groups_inventory to load vars for managed-node2 44071 1727204692.44937: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204692.44947: Calling all_plugins_play to load vars for managed-node2 44071 1727204692.44949: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204692.44952: Calling groups_plugins_play to load vars for managed-node2 44071 1727204692.46204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204692.47468: done with get_vars() 44071 1727204692.47502: done getting variables 44071 1727204692.47556: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:04:52 -0400 (0:00:00.070) 0:01:44.792 ***** 44071 1727204692.47593: entering _queue_task() for managed-node2/dnf 44071 1727204692.47907: worker is 1 (out of 1 available) 44071 1727204692.47923: exiting _queue_task() for managed-node2/dnf 44071 1727204692.47939: done queuing things up, now waiting for results queue to drain 44071 1727204692.47942: waiting for pending results... 
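The teaming abort above evaluates two conditions in order: `ansible_distribution_major_version | int > 9` came back True, but `ansible_distribution in __network_rh_distros` was False, so evaluation stopped there and the task was skipped. A sketch of a fail task with that ordered `when` list; the message and the contents of `__network_rh_distros` are assumptions:

```yaml
# Sketch: ordered `when` conditions matching the two "Evaluated conditional"
# lines above; Ansible stops at the first condition that is False.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: "Team interfaces are not supported on EL10 or later."   # hypothetical wording
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros   # False on this host, so the task is skipped
```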
44071 1727204692.48167: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204692.48275: in run() - task 127b8e07-fff9-c964-7471-0000000019c4 44071 1727204692.48291: variable 'ansible_search_path' from source: unknown 44071 1727204692.48295: variable 'ansible_search_path' from source: unknown 44071 1727204692.48334: calling self._execute() 44071 1727204692.48430: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204692.48437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204692.48445: variable 'omit' from source: magic vars 44071 1727204692.48789: variable 'ansible_distribution_major_version' from source: facts 44071 1727204692.48800: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204692.48971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204692.50813: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204692.50872: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204692.50906: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204692.50939: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204692.50958: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204692.51030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204692.51066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204692.51086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204692.51119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204692.51137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204692.51229: variable 'ansible_distribution' from source: facts 44071 1727204692.51236: variable 'ansible_distribution_major_version' from source: facts 44071 1727204692.51246: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 44071 1727204692.51340: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204692.51434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204692.51464: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204692.51487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204692.51514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204692.51526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204692.51567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204692.51594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204692.51609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204692.51640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204692.51652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204692.51688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204692.51708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204692.51725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204692.51756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204692.51768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204692.51893: variable 'network_connections' from source: include params 44071 1727204692.51905: variable 'interface' from source: play vars 44071 1727204692.51962: variable 'interface' from source: play vars 44071 1727204692.52025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204692.52168: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204692.52201: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204692.52228: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204692.52252: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204692.52291: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204692.52310: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204692.52335: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204692.52356: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204692.52399: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204692.52601: variable 'network_connections' from source: include params 44071 1727204692.52605: variable 'interface' from source: play vars 44071 1727204692.52656: variable 'interface' from source: play vars 44071 1727204692.52680: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204692.52683: when evaluation is False, skipping this task 44071 1727204692.52686: _execute() done 44071 1727204692.52691: dumping result to json 44071 1727204692.52693: done dumping result, returning 44071 1727204692.52702: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-0000000019c4] 44071 1727204692.52706: sending task result for task 127b8e07-fff9-c964-7471-0000000019c4 44071 1727204692.52815: done sending task result for task 127b8e07-fff9-c964-7471-0000000019c4 44071 1727204692.52818: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204692.52878: no more pending results, returning what we have 44071 1727204692.52882: results queue empty 44071 1727204692.52883: checking for any_errors_fatal 44071 1727204692.52891: done checking for any_errors_fatal 44071 1727204692.52892: checking for max_fail_percentage 44071 1727204692.52894: done checking for max_fail_percentage 44071 1727204692.52895: checking to see if all hosts have failed and the running result is not ok 44071 1727204692.52896: done checking to see if all hosts have failed 44071 1727204692.52896: getting the remaining hosts for this loop 44071 1727204692.52898: done getting the remaining hosts for this loop 44071 1727204692.52903: getting the next task for host managed-node2 44071 1727204692.52911: done getting next task for host managed-node2 44071 1727204692.52915: ^ task is: TASK: fedora.linux_system_roles.network : Check if 
updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204692.52920: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204692.52949: getting variables 44071 1727204692.52951: in VariableManager get_vars() 44071 1727204692.52996: Calling all_inventory to load vars for managed-node2 44071 1727204692.52999: Calling groups_inventory to load vars for managed-node2 44071 1727204692.53001: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204692.53011: Calling all_plugins_play to load vars for managed-node2 44071 1727204692.53014: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204692.53017: Calling groups_plugins_play to load vars for managed-node2 44071 1727204692.54251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204692.55510: done with get_vars() 44071 1727204692.55546: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204692.55615: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:04:52 -0400 (0:00:00.080) 0:01:44.872 ***** 44071 1727204692.55644: entering _queue_task() for managed-node2/yum 44071 1727204692.55963: worker is 1 (out of 1 available) 44071 1727204692.55984: exiting _queue_task() for managed-node2/yum 44071 1727204692.55999: done queuing things up, now waiting for results queue to drain 44071 1727204692.56001: waiting for pending results... 
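The DNF-based update check above was skipped because neither `__network_wireless_connections_defined` nor `__network_team_connections_defined` held for the single connection built from the `interface` play var, and the log then queues the YUM variant, which ansible-core transparently redirects to the dnf action plugin ("redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf"). A sketch of how the two checks could be split by distribution major version, using only the conditions the log evaluates; the package list and `check_mode` are assumptions:

```yaml
# Sketch: DNF/YUM split guarded by the conditions seen in the log;
# module arguments and check_mode are assumptions for illustration.
- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: NetworkManager          # assumed package list
    state: latest
  check_mode: true                # assumed: only checking, not installing
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:            # redirected to the dnf action plugin on this controller
    name: NetworkManager          # assumed package list
    state: latest
  check_mode: true                # assumed
  when:
    - ansible_distribution_major_version | int < 8
    - __network_wireless_connections_defined or __network_team_connections_defined
```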
44071 1727204692.56219: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204692.56344: in run() - task 127b8e07-fff9-c964-7471-0000000019c5 44071 1727204692.56363: variable 'ansible_search_path' from source: unknown 44071 1727204692.56369: variable 'ansible_search_path' from source: unknown 44071 1727204692.56404: calling self._execute() 44071 1727204692.56498: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204692.56503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204692.56512: variable 'omit' from source: magic vars 44071 1727204692.56848: variable 'ansible_distribution_major_version' from source: facts 44071 1727204692.56860: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204692.57008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204692.58822: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204692.58883: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204692.58911: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204692.58937: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204692.58972: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204692.59036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204692.59071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204692.59095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204692.59124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204692.59138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204692.59221: variable 'ansible_distribution_major_version' from source: facts 44071 1727204692.59237: Evaluated conditional (ansible_distribution_major_version | int < 8): False 44071 1727204692.59241: when evaluation is False, skipping this task 44071 1727204692.59244: _execute() done 44071 1727204692.59247: dumping result to json 44071 1727204692.59249: done dumping result, returning 44071 1727204692.59258: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-0000000019c5] 44071 
1727204692.59261: sending task result for task 127b8e07-fff9-c964-7471-0000000019c5 44071 1727204692.59375: done sending task result for task 127b8e07-fff9-c964-7471-0000000019c5 44071 1727204692.59380: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 44071 1727204692.59444: no more pending results, returning what we have 44071 1727204692.59448: results queue empty 44071 1727204692.59449: checking for any_errors_fatal 44071 1727204692.59458: done checking for any_errors_fatal 44071 1727204692.59459: checking for max_fail_percentage 44071 1727204692.59461: done checking for max_fail_percentage 44071 1727204692.59462: checking to see if all hosts have failed and the running result is not ok 44071 1727204692.59463: done checking to see if all hosts have failed 44071 1727204692.59463: getting the remaining hosts for this loop 44071 1727204692.59466: done getting the remaining hosts for this loop 44071 1727204692.59472: getting the next task for host managed-node2 44071 1727204692.59481: done getting next task for host managed-node2 44071 1727204692.59485: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204692.59492: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204692.59518: getting variables 44071 1727204692.59520: in VariableManager get_vars() 44071 1727204692.59575: Calling all_inventory to load vars for managed-node2 44071 1727204692.59579: Calling groups_inventory to load vars for managed-node2 44071 1727204692.59581: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204692.59592: Calling all_plugins_play to load vars for managed-node2 44071 1727204692.59595: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204692.59598: Calling groups_plugins_play to load vars for managed-node2 44071 1727204692.60710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204692.61953: done with get_vars() 44071 1727204692.61992: done getting variables 44071 1727204692.62045: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:04:52 -0400 (0:00:00.064) 0:01:44.937 ***** 44071 1727204692.62082: entering _queue_task() for managed-node2/fail 44071 1727204692.62394: worker is 1 (out of 1 available) 44071 1727204692.62409: exiting _queue_task() for managed-node2/fail 44071 1727204692.62426: done queuing things up, now waiting for results queue to drain 44071 1727204692.62427: waiting for pending results... 
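Like the package checks before it, the NetworkManager-restart consent task queued above is evaluated against `network_connections`, which the log resolves from "include params" alongside an `interface` play var. A sketch of how a play could hand the role such parameters; the connection attributes beyond the name are illustrative assumptions:

```yaml
# Sketch: passing network_connections to the role as include params, with the
# connection name taken from an `interface` play var as in the log.
# The type/state values are assumptions for illustration.
- name: Configure the test interface
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network
  vars:
    network_connections:
      - name: "{{ interface }}"
        type: ethernet            # assumed
        state: up                 # assumed
```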
44071 1727204692.62653: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204692.62782: in run() - task 127b8e07-fff9-c964-7471-0000000019c6 44071 1727204692.62794: variable 'ansible_search_path' from source: unknown 44071 1727204692.62798: variable 'ansible_search_path' from source: unknown 44071 1727204692.62833: calling self._execute() 44071 1727204692.62928: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204692.62933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204692.62944: variable 'omit' from source: magic vars 44071 1727204692.63286: variable 'ansible_distribution_major_version' from source: facts 44071 1727204692.63297: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204692.63398: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204692.63558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204692.65760: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204692.65815: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204692.65852: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204692.65880: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204692.65902: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204692.65977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204692.65999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204692.66019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204692.66055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204692.66069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204692.66109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204692.66127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204692.66151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204692.66183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204692.66194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204692.66226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204692.66246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204692.66267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204692.66298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204692.66309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204692.66448: variable 'network_connections' from source: include params 44071 1727204692.66461: variable 'interface' from source: play vars 44071 1727204692.66525: variable 'interface' from source: play vars 44071 1727204692.66587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204692.66744: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204692.66777: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204692.66802: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204692.66831: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204692.66872: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204692.66890: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204692.66910: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204692.66933: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204692.66981: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204692.67167: variable 'network_connections' 
from source: include params 44071 1727204692.67173: variable 'interface' from source: play vars 44071 1727204692.67229: variable 'interface' from source: play vars 44071 1727204692.67251: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204692.67255: when evaluation is False, skipping this task 44071 1727204692.67258: _execute() done 44071 1727204692.67261: dumping result to json 44071 1727204692.67264: done dumping result, returning 44071 1727204692.67274: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-0000000019c6] 44071 1727204692.67279: sending task result for task 127b8e07-fff9-c964-7471-0000000019c6 44071 1727204692.67386: done sending task result for task 127b8e07-fff9-c964-7471-0000000019c6 44071 1727204692.67389: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204692.67456: no more pending results, returning what we have 44071 1727204692.67460: results queue empty 44071 1727204692.67461: checking for any_errors_fatal 44071 1727204692.67473: done checking for any_errors_fatal 44071 1727204692.67474: checking for max_fail_percentage 44071 1727204692.67475: done checking for max_fail_percentage 44071 1727204692.67476: checking to see if all hosts have failed and the running result is not ok 44071 1727204692.67477: done checking to see if all hosts have failed 44071 1727204692.67478: getting the remaining hosts for this loop 44071 1727204692.67480: done getting the remaining hosts for this loop 44071 1727204692.67485: getting the next task for host managed-node2 44071 1727204692.67494: done getting next task for host managed-node2 44071 1727204692.67504: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 44071 1727204692.67512: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204692.67539: getting variables 44071 1727204692.67540: in VariableManager get_vars() 44071 1727204692.67589: Calling all_inventory to load vars for managed-node2 44071 1727204692.67592: Calling groups_inventory to load vars for managed-node2 44071 1727204692.67594: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204692.67604: Calling all_plugins_play to load vars for managed-node2 44071 1727204692.67612: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204692.67617: Calling groups_plugins_play to load vars for managed-node2 44071 1727204692.68912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204692.70197: done with get_vars() 44071 1727204692.70235: done getting variables 44071 1727204692.70291: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:04:52 -0400 (0:00:00.082) 0:01:45.019 ***** 44071 1727204692.70323: entering _queue_task() for managed-node2/package 44071 1727204692.70643: worker is 1 (out of 1 available) 44071 1727204692.70659: exiting _queue_task() for managed-node2/package 44071 1727204692.70674: done queuing things up, now waiting for results queue to drain 44071 1727204692.70677: waiting for pending results... 44071 1727204692.70894: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 44071 1727204692.71025: in run() - task 127b8e07-fff9-c964-7471-0000000019c7 44071 1727204692.71042: variable 'ansible_search_path' from source: unknown 44071 1727204692.71046: variable 'ansible_search_path' from source: unknown 44071 1727204692.71080: calling self._execute() 44071 1727204692.71173: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204692.71179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204692.71188: variable 'omit' from source: magic vars 44071 1727204692.71523: variable 'ansible_distribution_major_version' from source: facts 44071 1727204692.71538: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204692.71739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204692.72173: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204692.72178: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204692.72181: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204692.72256: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204692.72418: variable 'network_packages' from source: role '' defaults 44071 1727204692.72568: variable '__network_provider_setup' from source: role '' defaults 44071 1727204692.72589: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204692.72677: variable 
'__network_service_name_default_nm' from source: role '' defaults 44071 1727204692.72694: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204692.72776: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204692.73060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204692.75620: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204692.75709: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204692.75759: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204692.75809: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204692.75883: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204692.75949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204692.75996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204692.76034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204692.76089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204692.76170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204692.76176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204692.76213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204692.76246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204692.76299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204692.76326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204692.76611: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204692.76757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204692.76791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204692.76825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204692.76948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204692.76953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204692.77083: variable 'ansible_python' from source: facts 44071 1727204692.77118: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204692.77228: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204692.77325: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204692.77475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204692.77548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204692.77695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204692.77699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204692.77701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204692.77704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204692.77735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204692.77763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204692.77807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204692.77831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204692.78015: variable 'network_connections' from source: include params 44071 1727204692.78022: variable 'interface' from source: play vars 44071 1727204692.78160: variable 'interface' from source: play vars 44071 1727204692.78281: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204692.78311: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204692.78366: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204692.78390: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204692.78445: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204692.78871: variable 'network_connections' from source: include params 44071 1727204692.78875: variable 'interface' from source: play vars 44071 1727204692.78975: variable 'interface' from source: play vars 44071 1727204692.79015: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204692.79114: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204692.79523: variable 'network_connections' from source: include params 44071 1727204692.79526: variable 'interface' from source: play vars 44071 1727204692.79612: variable 'interface' from source: play vars 44071 1727204692.79632: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204692.79700: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204692.79939: variable 'network_connections' from source: include params 44071 1727204692.79942: variable 'interface' from source: play vars 44071 1727204692.79995: variable 'interface' from source: play vars 44071 1727204692.80039: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204692.80084: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204692.80090: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204692.80139: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204692.80293: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204692.80639: variable 'network_connections' from source: include params 44071 1727204692.80644: variable 'interface' from source: play vars 44071 1727204692.80692: variable 'interface' from source: play vars 44071 1727204692.80699: variable 'ansible_distribution' from source: facts 44071 1727204692.80703: variable '__network_rh_distros' from source: role '' defaults 44071 1727204692.80709: variable 'ansible_distribution_major_version' from source: facts 44071 1727204692.80721: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204692.80843: variable 'ansible_distribution' from source: 
facts 44071 1727204692.80847: variable '__network_rh_distros' from source: role '' defaults 44071 1727204692.80849: variable 'ansible_distribution_major_version' from source: facts 44071 1727204692.80857: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204692.80979: variable 'ansible_distribution' from source: facts 44071 1727204692.80983: variable '__network_rh_distros' from source: role '' defaults 44071 1727204692.80986: variable 'ansible_distribution_major_version' from source: facts 44071 1727204692.81016: variable 'network_provider' from source: set_fact 44071 1727204692.81028: variable 'ansible_facts' from source: unknown 44071 1727204692.81975: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 44071 1727204692.81980: when evaluation is False, skipping this task 44071 1727204692.81983: _execute() done 44071 1727204692.81985: dumping result to json 44071 1727204692.81986: done dumping result, returning 44071 1727204692.81989: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-c964-7471-0000000019c7] 44071 1727204692.81991: sending task result for task 127b8e07-fff9-c964-7471-0000000019c7 skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 44071 1727204692.82140: no more pending results, returning what we have 44071 1727204692.82144: results queue empty 44071 1727204692.82145: checking for any_errors_fatal 44071 1727204692.82151: done checking for any_errors_fatal 44071 1727204692.82152: checking for max_fail_percentage 44071 1727204692.82154: done checking for max_fail_percentage 44071 1727204692.82155: checking to see if all hosts have failed and the running result is not ok 44071 1727204692.82156: done checking to see if all hosts have failed 44071 1727204692.82157: getting the remaining hosts for this loop 44071 1727204692.82158: done getting the remaining hosts for this loop 44071 1727204692.82164: getting the next task for host managed-node2 44071 1727204692.82177: done getting next task for host managed-node2 44071 1727204692.82270: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204692.82278: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 44071 1727204692.82309: getting variables 44071 1727204692.82311: in VariableManager get_vars() 44071 1727204692.82362: Calling all_inventory to load vars for managed-node2 44071 1727204692.82487: Calling groups_inventory to load vars for managed-node2 44071 1727204692.82497: done sending task result for task 127b8e07-fff9-c964-7471-0000000019c7 44071 1727204692.82509: WORKER PROCESS EXITING 44071 1727204692.82505: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204692.82523: Calling all_plugins_play to load vars for managed-node2 44071 1727204692.82527: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204692.82530: Calling groups_plugins_play to load vars for managed-node2 44071 1727204692.85016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204692.87396: done with get_vars() 44071 1727204692.87450: done getting variables 44071 1727204692.87530: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:04:52 -0400 (0:00:00.172) 0:01:45.192 ***** 44071 1727204692.87577: entering _queue_task() for managed-node2/package 44071 1727204692.88042: worker is 1 (out of 1 available) 44071 1727204692.88059: exiting _queue_task() for managed-node2/package 44071 1727204692.88080: done queuing things up, now waiting for results queue to drain 44071 1727204692.88082: waiting for pending results... 
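[Editor's sketch] The package task queued here (main.yml:85) can be sketched as below. The 'package' action module and the network_state != {} gate that skips it in the next entries come straight from the log; the two package names are inferred from the task title, and state: present is an assumption.

    # rough sketch of main.yml:85, reconstructed from the log
    - name: Install NetworkManager and nmstate when using network_state variable
      package:
        name:
          - NetworkManager   # package names inferred from the task title
          - nmstate
        state: present       # assumed; not shown in the log
      when: network_state != {}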
44071 1727204692.88441: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204692.88647: in run() - task 127b8e07-fff9-c964-7471-0000000019c8 44071 1727204692.88674: variable 'ansible_search_path' from source: unknown 44071 1727204692.88683: variable 'ansible_search_path' from source: unknown 44071 1727204692.88740: calling self._execute() 44071 1727204692.88871: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204692.88890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204692.88906: variable 'omit' from source: magic vars 44071 1727204692.89478: variable 'ansible_distribution_major_version' from source: facts 44071 1727204692.89483: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204692.89583: variable 'network_state' from source: role '' defaults 44071 1727204692.89606: Evaluated conditional (network_state != {}): False 44071 1727204692.89616: when evaluation is False, skipping this task 44071 1727204692.89623: _execute() done 44071 1727204692.89631: dumping result to json 44071 1727204692.89640: done dumping result, returning 44071 1727204692.89653: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-c964-7471-0000000019c8] 44071 1727204692.89663: sending task result for task 127b8e07-fff9-c964-7471-0000000019c8 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204692.89984: no more pending results, returning what we have 44071 1727204692.89990: results queue empty 44071 1727204692.89991: checking for any_errors_fatal 44071 1727204692.89998: done checking for any_errors_fatal 44071 1727204692.89999: checking for max_fail_percentage 44071 1727204692.90001: done checking for max_fail_percentage 44071 1727204692.90002: checking to see if all hosts have failed and the running result is not ok 44071 1727204692.90003: done checking to see if all hosts have failed 44071 1727204692.90003: getting the remaining hosts for this loop 44071 1727204692.90005: done getting the remaining hosts for this loop 44071 1727204692.90017: getting the next task for host managed-node2 44071 1727204692.90029: done getting next task for host managed-node2 44071 1727204692.90036: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204692.90044: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204692.90076: getting variables 44071 1727204692.90078: in VariableManager get_vars() 44071 1727204692.90180: Calling all_inventory to load vars for managed-node2 44071 1727204692.90183: Calling groups_inventory to load vars for managed-node2 44071 1727204692.90185: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204692.90201: Calling all_plugins_play to load vars for managed-node2 44071 1727204692.90204: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204692.90208: Calling groups_plugins_play to load vars for managed-node2 44071 1727204692.90735: done sending task result for task 127b8e07-fff9-c964-7471-0000000019c8 44071 1727204692.90739: WORKER PROCESS EXITING 44071 1727204692.91448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204692.93149: done with get_vars() 44071 1727204692.93198: done getting variables 44071 1727204692.93271: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:04:52 -0400 (0:00:00.057) 0:01:45.249 ***** 44071 1727204692.93312: entering _queue_task() for managed-node2/package 44071 1727204692.93664: worker is 1 (out of 1 available) 44071 1727204692.93681: exiting _queue_task() for managed-node2/package 44071 1727204692.93696: done queuing things up, now waiting for results queue to drain 44071 1727204692.93698: waiting for pending results... 
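[Editor's sketch] The previous task skipped because network_state resolves from the role defaults and compares equal to an empty mapping, and the task queued here (main.yml:96) uses the same gate, as the evaluation below shows. A minimal standalone playbook that reproduces that skip, assuming the role default really is {} (consistent with the "role '' defaults" source and the False result), could look like this:

    # hypothetical reproduction, not part of the role
    - hosts: localhost
      gather_facts: false
      vars:
        network_state: {}              # stands in for the role default
      tasks:
        - name: Install python3-libnmstate when using network_state variable (sketch)
          package:
            name: python3-libnmstate
            state: present
          when: network_state != {}    # False here, so Ansible reports "skipping"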
44071 1727204692.93922: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204692.94062: in run() - task 127b8e07-fff9-c964-7471-0000000019c9 44071 1727204692.94077: variable 'ansible_search_path' from source: unknown 44071 1727204692.94081: variable 'ansible_search_path' from source: unknown 44071 1727204692.94114: calling self._execute() 44071 1727204692.94204: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204692.94209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204692.94217: variable 'omit' from source: magic vars 44071 1727204692.94549: variable 'ansible_distribution_major_version' from source: facts 44071 1727204692.94560: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204692.94669: variable 'network_state' from source: role '' defaults 44071 1727204692.94678: Evaluated conditional (network_state != {}): False 44071 1727204692.94682: when evaluation is False, skipping this task 44071 1727204692.94685: _execute() done 44071 1727204692.94688: dumping result to json 44071 1727204692.94690: done dumping result, returning 44071 1727204692.94702: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-c964-7471-0000000019c9] 44071 1727204692.94707: sending task result for task 127b8e07-fff9-c964-7471-0000000019c9 44071 1727204692.94825: done sending task result for task 127b8e07-fff9-c964-7471-0000000019c9 44071 1727204692.94828: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204692.94885: no more pending results, returning what we have 44071 1727204692.94889: results queue empty 44071 1727204692.94890: checking for any_errors_fatal 44071 1727204692.94899: done checking for any_errors_fatal 44071 1727204692.94899: checking for max_fail_percentage 44071 1727204692.94901: done checking for max_fail_percentage 44071 1727204692.94902: checking to see if all hosts have failed and the running result is not ok 44071 1727204692.94903: done checking to see if all hosts have failed 44071 1727204692.94903: getting the remaining hosts for this loop 44071 1727204692.94905: done getting the remaining hosts for this loop 44071 1727204692.94911: getting the next task for host managed-node2 44071 1727204692.94920: done getting next task for host managed-node2 44071 1727204692.94924: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204692.94931: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204692.94959: getting variables 44071 1727204692.94961: in VariableManager get_vars() 44071 1727204692.95006: Calling all_inventory to load vars for managed-node2 44071 1727204692.95009: Calling groups_inventory to load vars for managed-node2 44071 1727204692.95011: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204692.95024: Calling all_plugins_play to load vars for managed-node2 44071 1727204692.95027: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204692.95030: Calling groups_plugins_play to load vars for managed-node2 44071 1727204692.96740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204692.99049: done with get_vars() 44071 1727204692.99099: done getting variables 44071 1727204692.99172: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:04:52 -0400 (0:00:00.058) 0:01:45.308 ***** 44071 1727204692.99213: entering _queue_task() for managed-node2/service 44071 1727204692.99644: worker is 1 (out of 1 available) 44071 1727204692.99659: exiting _queue_task() for managed-node2/service 44071 1727204692.99675: done queuing things up, now waiting for results queue to drain 44071 1727204692.99677: waiting for pending results... 
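[Editor's sketch] The service task queued here (main.yml:109) pairs with the earlier consent prompt and is gated by the same condition, evaluated a few entries below. In this rough sketch the 'service' action and the when condition come from the log, while the service name and state: restarted are inferred from the task title only.

    # rough sketch of main.yml:109
    - name: Restart NetworkManager due to wireless or team interfaces
      service:
        name: NetworkManager      # inferred from the task title
        state: restarted          # inferred from "Restart ..."
      when: __network_wireless_connections_defined or __network_team_connections_defined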
44071 1727204693.00079: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204693.00196: in run() - task 127b8e07-fff9-c964-7471-0000000019ca 44071 1727204693.00221: variable 'ansible_search_path' from source: unknown 44071 1727204693.00228: variable 'ansible_search_path' from source: unknown 44071 1727204693.00285: calling self._execute() 44071 1727204693.00422: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204693.00471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204693.00476: variable 'omit' from source: magic vars 44071 1727204693.00958: variable 'ansible_distribution_major_version' from source: facts 44071 1727204693.00985: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204693.01149: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204693.01420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204693.04330: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204693.04400: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204693.04463: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204693.04539: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204693.04553: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204693.04657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204693.04715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204693.04767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204693.04863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204693.04867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204693.04907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204693.04936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204693.04969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 44071 1727204693.05028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204693.05071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204693.05112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204693.05141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204693.05173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204693.05272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204693.05276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204693.05485: variable 'network_connections' from source: include params 44071 1727204693.05506: variable 'interface' from source: play vars 44071 1727204693.05603: variable 'interface' from source: play vars 44071 1727204693.05706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204693.05939: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204693.06071: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204693.06075: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204693.06078: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204693.06142: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204693.06173: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204693.06219: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204693.06252: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204693.06327: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204693.06740: variable 'network_connections' from source: include params 44071 1727204693.06744: variable 'interface' 
from source: play vars 44071 1727204693.06747: variable 'interface' from source: play vars 44071 1727204693.06786: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204693.06795: when evaluation is False, skipping this task 44071 1727204693.06802: _execute() done 44071 1727204693.06809: dumping result to json 44071 1727204693.06818: done dumping result, returning 44071 1727204693.06830: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-0000000019ca] 44071 1727204693.06841: sending task result for task 127b8e07-fff9-c964-7471-0000000019ca skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204693.07132: no more pending results, returning what we have 44071 1727204693.07137: results queue empty 44071 1727204693.07138: checking for any_errors_fatal 44071 1727204693.07145: done checking for any_errors_fatal 44071 1727204693.07146: checking for max_fail_percentage 44071 1727204693.07148: done checking for max_fail_percentage 44071 1727204693.07149: checking to see if all hosts have failed and the running result is not ok 44071 1727204693.07150: done checking to see if all hosts have failed 44071 1727204693.07151: getting the remaining hosts for this loop 44071 1727204693.07152: done getting the remaining hosts for this loop 44071 1727204693.07158: getting the next task for host managed-node2 44071 1727204693.07171: done getting next task for host managed-node2 44071 1727204693.07176: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204693.07182: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204693.07208: getting variables 44071 1727204693.07210: in VariableManager get_vars() 44071 1727204693.07263: Calling all_inventory to load vars for managed-node2 44071 1727204693.07470: Calling groups_inventory to load vars for managed-node2 44071 1727204693.07475: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204693.07489: Calling all_plugins_play to load vars for managed-node2 44071 1727204693.07507: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204693.07515: done sending task result for task 127b8e07-fff9-c964-7471-0000000019ca 44071 1727204693.07519: WORKER PROCESS EXITING 44071 1727204693.07524: Calling groups_plugins_play to load vars for managed-node2 44071 1727204693.09655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204693.11971: done with get_vars() 44071 1727204693.12027: done getting variables 44071 1727204693.12105: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:04:53 -0400 (0:00:00.129) 0:01:45.437 ***** 44071 1727204693.12145: entering _queue_task() for managed-node2/service 44071 1727204693.12772: worker is 1 (out of 1 available) 44071 1727204693.12785: exiting _queue_task() for managed-node2/service 44071 1727204693.12796: done queuing things up, now waiting for results queue to drain 44071 1727204693.12798: waiting for pending results... 
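[Editor's sketch] Unlike the preceding tasks, this one proceeds: the entries below show network_provider == "nm" or network_state != {} evaluating True, network_service_name resolving from the role defaults, and the 'service' action being prepared over SSH. A sketch of the task as far as the log pins it down; state: started and enabled: true are inferred from the task title rather than shown in the log.

    # rough sketch of main.yml:122
    - name: Enable and start NetworkManager
      service:
        name: "{{ network_service_name }}"   # resolved from role defaults in the log
        state: started                       # inferred from "Enable and start"
        enabled: true
      when: network_provider == "nm" or network_state != {}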
44071 1727204693.12961: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204693.13168: in run() - task 127b8e07-fff9-c964-7471-0000000019cb 44071 1727204693.13194: variable 'ansible_search_path' from source: unknown 44071 1727204693.13202: variable 'ansible_search_path' from source: unknown 44071 1727204693.13255: calling self._execute() 44071 1727204693.13381: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204693.13395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204693.13410: variable 'omit' from source: magic vars 44071 1727204693.13863: variable 'ansible_distribution_major_version' from source: facts 44071 1727204693.13901: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204693.14119: variable 'network_provider' from source: set_fact 44071 1727204693.14123: variable 'network_state' from source: role '' defaults 44071 1727204693.14128: Evaluated conditional (network_provider == "nm" or network_state != {}): True 44071 1727204693.14146: variable 'omit' from source: magic vars 44071 1727204693.14271: variable 'omit' from source: magic vars 44071 1727204693.14275: variable 'network_service_name' from source: role '' defaults 44071 1727204693.14350: variable 'network_service_name' from source: role '' defaults 44071 1727204693.14487: variable '__network_provider_setup' from source: role '' defaults 44071 1727204693.14499: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204693.14583: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204693.14597: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204693.14771: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204693.14928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204693.17600: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204693.17693: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204693.17750: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204693.17812: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204693.17872: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204693.17956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204693.17998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204693.18043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204693.18152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 44071 1727204693.18156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204693.18180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204693.18210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204693.18244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204693.18298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204693.18319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204693.18613: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204693.18769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204693.18805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204693.18872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204693.18897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204693.18917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204693.19030: variable 'ansible_python' from source: facts 44071 1727204693.19059: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204693.19164: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204693.19261: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204693.19421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204693.19572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204693.19576: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204693.19579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204693.19581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204693.19622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204693.19664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204693.19705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204693.19757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204693.19782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204693.19961: variable 'network_connections' from source: include params 44071 1727204693.19978: variable 'interface' from source: play vars 44071 1727204693.20078: variable 'interface' from source: play vars 44071 1727204693.20215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204693.20457: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204693.20566: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204693.20582: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204693.20635: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204693.20741: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204693.20788: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204693.20834: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204693.20875: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204693.20941: variable '__network_wireless_connections_defined' from source: 
role '' defaults 44071 1727204693.21330: variable 'network_connections' from source: include params 44071 1727204693.21336: variable 'interface' from source: play vars 44071 1727204693.21384: variable 'interface' from source: play vars 44071 1727204693.21424: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204693.21525: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204693.21870: variable 'network_connections' from source: include params 44071 1727204693.21885: variable 'interface' from source: play vars 44071 1727204693.21964: variable 'interface' from source: play vars 44071 1727204693.22073: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204693.22101: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204693.22443: variable 'network_connections' from source: include params 44071 1727204693.22454: variable 'interface' from source: play vars 44071 1727204693.22540: variable 'interface' from source: play vars 44071 1727204693.22612: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204693.22688: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204693.22700: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204693.22775: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204693.23026: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204693.23628: variable 'network_connections' from source: include params 44071 1727204693.23644: variable 'interface' from source: play vars 44071 1727204693.23727: variable 'interface' from source: play vars 44071 1727204693.23777: variable 'ansible_distribution' from source: facts 44071 1727204693.23780: variable '__network_rh_distros' from source: role '' defaults 44071 1727204693.23782: variable 'ansible_distribution_major_version' from source: facts 44071 1727204693.23785: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204693.23990: variable 'ansible_distribution' from source: facts 44071 1727204693.24000: variable '__network_rh_distros' from source: role '' defaults 44071 1727204693.24010: variable 'ansible_distribution_major_version' from source: facts 44071 1727204693.24072: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204693.24236: variable 'ansible_distribution' from source: facts 44071 1727204693.24246: variable '__network_rh_distros' from source: role '' defaults 44071 1727204693.24255: variable 'ansible_distribution_major_version' from source: facts 44071 1727204693.24304: variable 'network_provider' from source: set_fact 44071 1727204693.24339: variable 'omit' from source: magic vars 44071 1727204693.24403: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204693.24422: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204693.24450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204693.24478: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204693.24512: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204693.24541: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204693.24550: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204693.24570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204693.24686: Set connection var ansible_connection to ssh 44071 1727204693.24728: Set connection var ansible_timeout to 10 44071 1727204693.24732: Set connection var ansible_pipelining to False 44071 1727204693.24737: Set connection var ansible_shell_type to sh 44071 1727204693.24739: Set connection var ansible_shell_executable to /bin/sh 44071 1727204693.24742: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204693.24775: variable 'ansible_shell_executable' from source: unknown 44071 1727204693.24871: variable 'ansible_connection' from source: unknown 44071 1727204693.24874: variable 'ansible_module_compression' from source: unknown 44071 1727204693.24877: variable 'ansible_shell_type' from source: unknown 44071 1727204693.24879: variable 'ansible_shell_executable' from source: unknown 44071 1727204693.24881: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204693.24883: variable 'ansible_pipelining' from source: unknown 44071 1727204693.24885: variable 'ansible_timeout' from source: unknown 44071 1727204693.24887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204693.24958: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204693.24984: variable 'omit' from source: magic vars 44071 1727204693.24998: starting attempt loop 44071 1727204693.25008: running the handler 44071 1727204693.25122: variable 'ansible_facts' from source: unknown 44071 1727204693.26231: _low_level_execute_command(): starting 44071 1727204693.26310: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204693.27035: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204693.27085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204693.27101: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204693.27186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204693.27204: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 
1727204693.27220: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204693.27247: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204693.27363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204693.29147: stdout chunk (state=3): >>>/root <<< 44071 1727204693.29475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204693.29479: stdout chunk (state=3): >>><<< 44071 1727204693.29482: stderr chunk (state=3): >>><<< 44071 1727204693.29485: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204693.29488: _low_level_execute_command(): starting 44071 1727204693.29491: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204693.2939749-50174-220855713323594 `" && echo ansible-tmp-1727204693.2939749-50174-220855713323594="` echo /root/.ansible/tmp/ansible-tmp-1727204693.2939749-50174-220855713323594 `" ) && sleep 0' 44071 1727204693.30118: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204693.30127: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204693.30184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204693.30243: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204693.30263: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
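[Editor's note] The exchange above is the ssh connection plugin's bootstrap for this task: it runs /bin/sh -c 'echo ~ && sleep 0' to resolve the remote user's home directory, reusing the existing OpenSSH ControlMaster socket at /root/.ansible/cp/7ef5e35320, and then creates a per-task temp directory under ~/.ansible/tmp. A minimal Python sketch of issuing a command over an already-established control socket follows; the host and socket path are copied from the trace, and this is an illustration of the idea, not Ansible's actual ssh plugin code.

import subprocess

# Values as they appear in the trace above; treat them as placeholders elsewhere.
host = "root@10.31.47.73"
control_path = "/root/.ansible/cp/7ef5e35320"  # existing ControlMaster socket

# Reuse the multiplexed connection rather than opening a fresh SSH session.
proc = subprocess.run(
    ["ssh", "-o", f"ControlPath={control_path}", "-o", "ControlMaster=no",
     host, "/bin/sh -c 'echo ~ && sleep 0'"],
    capture_output=True, text=True, check=True,
)
print(proc.stdout.strip())  # expected: /root, matching the stdout chunk above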
44071 1727204693.30302: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204693.30381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204693.32368: stdout chunk (state=3): >>>ansible-tmp-1727204693.2939749-50174-220855713323594=/root/.ansible/tmp/ansible-tmp-1727204693.2939749-50174-220855713323594 <<< 44071 1727204693.32575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204693.32615: stderr chunk (state=3): >>><<< 44071 1727204693.32625: stdout chunk (state=3): >>><<< 44071 1727204693.32700: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204693.2939749-50174-220855713323594=/root/.ansible/tmp/ansible-tmp-1727204693.2939749-50174-220855713323594 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204693.32703: variable 'ansible_module_compression' from source: unknown 44071 1727204693.32770: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 44071 1727204693.32870: variable 'ansible_facts' from source: unknown 44071 1727204693.33149: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204693.2939749-50174-220855713323594/AnsiballZ_systemd.py 44071 1727204693.33417: Sending initial data 44071 1727204693.33420: Sent initial data (156 bytes) 44071 1727204693.34119: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204693.34124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204693.34135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204693.34193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204693.34201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204693.34278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204693.35895: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204693.35957: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204693.36024: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp1gto1d99 /root/.ansible/tmp/ansible-tmp-1727204693.2939749-50174-220855713323594/AnsiballZ_systemd.py <<< 44071 1727204693.36032: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204693.2939749-50174-220855713323594/AnsiballZ_systemd.py" <<< 44071 1727204693.36093: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp1gto1d99" to remote "/root/.ansible/tmp/ansible-tmp-1727204693.2939749-50174-220855713323594/AnsiballZ_systemd.py" <<< 44071 1727204693.36097: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204693.2939749-50174-220855713323594/AnsiballZ_systemd.py" <<< 44071 1727204693.37923: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204693.37928: stderr chunk (state=3): >>><<< 44071 1727204693.37930: stdout chunk (state=3): >>><<< 44071 1727204693.37998: done transferring module to remote 44071 1727204693.38002: _low_level_execute_command(): starting 44071 1727204693.38005: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204693.2939749-50174-220855713323594/ /root/.ansible/tmp/ansible-tmp-1727204693.2939749-50174-220855713323594/AnsiballZ_systemd.py && sleep 0' 44071 1727204693.38692: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204693.38754: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204693.38847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204693.38882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204693.38979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204693.40819: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204693.40888: stderr chunk (state=3): >>><<< 44071 1727204693.40892: stdout chunk (state=3): >>><<< 44071 1727204693.40908: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204693.40913: _low_level_execute_command(): starting 44071 1727204693.40916: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204693.2939749-50174-220855713323594/AnsiballZ_systemd.py && sleep 0' 44071 1727204693.41609: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204693.41613: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204693.41617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204693.41619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204693.41716: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204693.41721: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204693.41724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204693.41726: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204693.41740: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204693.41758: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204693.41878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204693.73958: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4595712", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3509596160", "CPUUsageNSec": "1625560000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", 
"IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitC<<< 44071 1727204693.73973: stdout chunk (state=3): >>>ORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", 
"RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", 
"CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 44071 1727204693.75980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204693.76053: stderr chunk (state=3): >>><<< 44071 1727204693.76158: stdout chunk (state=3): >>><<< 44071 1727204693.76164: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4595712", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3509596160", "CPUUsageNSec": "1625560000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": 
"[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", 
"ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", 
"invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204693.76476: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204693.2939749-50174-220855713323594/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204693.76515: _low_level_execute_command(): starting 44071 1727204693.76519: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204693.2939749-50174-220855713323594/ > /dev/null 2>&1 && sleep 0' 44071 1727204693.77255: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204693.77377: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204693.77404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204693.77522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204693.81973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204693.81978: stdout chunk (state=3): >>><<< 44071 1727204693.81982: stderr chunk (state=3): >>><<< 44071 1727204693.81986: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204693.81989: handler run complete 44071 1727204693.81992: attempt loop complete, returning result 44071 1727204693.81994: _execute() done 44071 1727204693.81997: dumping result to json 44071 1727204693.82000: done dumping result, returning 44071 1727204693.82003: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-c964-7471-0000000019cb] 44071 1727204693.82006: sending task result for task 127b8e07-fff9-c964-7471-0000000019cb ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204693.82501: no more pending results, returning what we have 44071 1727204693.82506: results queue empty 44071 1727204693.82507: checking for any_errors_fatal 44071 1727204693.82515: done checking for any_errors_fatal 44071 1727204693.82516: checking for max_fail_percentage 44071 1727204693.82518: done checking for max_fail_percentage 44071 1727204693.82519: checking to see if all hosts have failed and the running result is not ok 44071 1727204693.82520: done checking to see if all hosts have failed 44071 1727204693.82521: getting the remaining hosts for this loop 44071 1727204693.82523: done getting the remaining hosts for this loop 44071 1727204693.82529: getting the next task for host managed-node2 44071 1727204693.82539: done getting next task for host managed-node2 44071 1727204693.82543: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204693.82549: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204693.82771: getting variables 44071 1727204693.82774: in VariableManager get_vars() 44071 1727204693.82831: Calling all_inventory to load vars for managed-node2 44071 1727204693.82835: Calling groups_inventory to load vars for managed-node2 44071 1727204693.82894: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204693.82909: Calling all_plugins_play to load vars for managed-node2 44071 1727204693.82914: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204693.82917: Calling groups_plugins_play to load vars for managed-node2 44071 1727204693.83982: done sending task result for task 127b8e07-fff9-c964-7471-0000000019cb 44071 1727204693.83988: WORKER PROCESS EXITING 44071 1727204693.87563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204693.91214: done with get_vars() 44071 1727204693.91285: done getting variables 44071 1727204693.91376: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:04:53 -0400 (0:00:00.792) 0:01:46.230 ***** 44071 1727204693.91424: entering _queue_task() for managed-node2/service 44071 1727204693.92033: worker is 1 (out of 1 available) 44071 1727204693.92048: exiting _queue_task() for managed-node2/service 44071 1727204693.92063: done queuing things up, now waiting for results queue to drain 44071 1727204693.92064: waiting for pending results... 
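[Editor's note] The AnsiballZ_systemd.py payload executed above prints a single JSON document on stdout, which the action plugin parses into the task result (shown censored in the play output because no_log was set for this task). Below is a small Python sketch of pulling the interesting service fields out of such a result; the JSON literal is a hand-trimmed stand-in for the full property dump captured in the trace.

import json

# Hand-trimmed stand-in for the systemd module output above; the real payload
# also carries the complete unit property dump (limits, cgroup data, etc.).
module_stdout = """
{"name": "NetworkManager", "changed": false, "enabled": true, "state": "started",
 "status": {"ActiveState": "active", "SubState": "running",
            "UnitFileState": "enabled", "MainPID": "3396"}}
"""

result = json.loads(module_stdout)
status = result["status"]
print(f'{result["name"]}: {status["ActiveState"]}/{status["SubState"]}, '
      f'unit file {status["UnitFileState"]}, main PID {status["MainPID"]}')
# -> NetworkManager: active/running, unit file enabled, main PID 3396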
44071 1727204693.92327: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204693.92460: in run() - task 127b8e07-fff9-c964-7471-0000000019cc 44071 1727204693.92476: variable 'ansible_search_path' from source: unknown 44071 1727204693.92480: variable 'ansible_search_path' from source: unknown 44071 1727204693.92514: calling self._execute() 44071 1727204693.92607: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204693.92615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204693.92623: variable 'omit' from source: magic vars 44071 1727204693.92969: variable 'ansible_distribution_major_version' from source: facts 44071 1727204693.92982: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204693.93082: variable 'network_provider' from source: set_fact 44071 1727204693.93087: Evaluated conditional (network_provider == "nm"): True 44071 1727204693.93163: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204693.93234: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204693.93374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204693.96574: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204693.96579: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204693.96582: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204693.96585: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204693.96625: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204693.96734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204693.96780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204693.96809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204693.96875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204693.96891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204693.96961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204693.96994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 44071 1727204693.97020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204693.97084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204693.97100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204693.97151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204693.97193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204693.97221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204693.97279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204693.97301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204693.97502: variable 'network_connections' from source: include params 44071 1727204693.97521: variable 'interface' from source: play vars 44071 1727204693.97617: variable 'interface' from source: play vars 44071 1727204693.97753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204693.97986: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204693.98034: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204693.98100: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204693.98163: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204693.98470: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204693.98474: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204693.98478: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204693.98480: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
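[Editor's note] This block is the worker deciding whether the wpa_supplicant task should run at all: each when-condition is templated against the gathered facts and role variables, and the first one that renders false short-circuits the task into a skip (visible just below, where __network_wpa_supplicant_required evaluates to False). A rough Python sketch of evaluating such expressions with Jinja2, the templating engine Ansible uses, follows; the fact values are illustrative (only network_provider and __network_wpa_supplicant_required are confirmed by this trace), and this is not the role's or Ansible's actual evaluation code.

from jinja2 import Template

# Illustrative variable values; swap in real facts/role vars as needed.
task_vars = {
    "ansible_distribution_major_version": "40",
    "network_provider": "nm",
    "__network_wpa_supplicant_required": False,
}

conditions = [
    "ansible_distribution_major_version != '6'",
    'network_provider == "nm"',
    "__network_wpa_supplicant_required",
]

for cond in conditions:
    rendered = Template("{{ " + cond + " }}").render(**task_vars)
    print(f"Evaluated conditional ({cond}): {rendered}")
# The last expression renders False, which is why the task is skipped below.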
44071 1727204693.98482: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204693.98697: variable 'network_connections' from source: include params 44071 1727204693.98703: variable 'interface' from source: play vars 44071 1727204693.98789: variable 'interface' from source: play vars 44071 1727204693.98833: Evaluated conditional (__network_wpa_supplicant_required): False 44071 1727204693.98839: when evaluation is False, skipping this task 44071 1727204693.98842: _execute() done 44071 1727204693.98845: dumping result to json 44071 1727204693.98870: done dumping result, returning 44071 1727204693.98873: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-c964-7471-0000000019cc] 44071 1727204693.98884: sending task result for task 127b8e07-fff9-c964-7471-0000000019cc skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 44071 1727204693.99281: no more pending results, returning what we have 44071 1727204693.99284: results queue empty 44071 1727204693.99285: checking for any_errors_fatal 44071 1727204693.99303: done checking for any_errors_fatal 44071 1727204693.99304: checking for max_fail_percentage 44071 1727204693.99306: done checking for max_fail_percentage 44071 1727204693.99307: checking to see if all hosts have failed and the running result is not ok 44071 1727204693.99308: done checking to see if all hosts have failed 44071 1727204693.99309: getting the remaining hosts for this loop 44071 1727204693.99310: done getting the remaining hosts for this loop 44071 1727204693.99314: getting the next task for host managed-node2 44071 1727204693.99322: done getting next task for host managed-node2 44071 1727204693.99327: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204693.99332: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204693.99360: getting variables 44071 1727204693.99362: in VariableManager get_vars() 44071 1727204693.99408: Calling all_inventory to load vars for managed-node2 44071 1727204693.99412: Calling groups_inventory to load vars for managed-node2 44071 1727204693.99414: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204693.99425: Calling all_plugins_play to load vars for managed-node2 44071 1727204693.99428: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204693.99431: Calling groups_plugins_play to load vars for managed-node2 44071 1727204693.99956: done sending task result for task 127b8e07-fff9-c964-7471-0000000019cc 44071 1727204693.99962: WORKER PROCESS EXITING 44071 1727204694.01771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204694.04350: done with get_vars() 44071 1727204694.04401: done getting variables 44071 1727204694.04524: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:04:54 -0400 (0:00:00.131) 0:01:46.362 ***** 44071 1727204694.04565: entering _queue_task() for managed-node2/service 44071 1727204694.05404: worker is 1 (out of 1 available) 44071 1727204694.05420: exiting _queue_task() for managed-node2/service 44071 1727204694.05434: done queuing things up, now waiting for results queue to drain 44071 1727204694.05436: waiting for pending results... 
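The skip recorded above for "Enable and start wpa_supplicant" follows the provider-gated pattern visible in the trace: network_provider == "nm" evaluated True, but __network_wpa_supplicant_required evaluated False, so the service is left alone. Below is a minimal sketch of such a task, assuming ansible.builtin.service with typical arguments; the task name and the two conditions come from the trace, the service arguments do not.

# Illustrative sketch only, not the actual task from
# fedora.linux_system_roles.network. The conditions and task name match the
# trace above; the service arguments (state/enabled) are assumptions.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool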
44071 1727204694.05941: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204694.06130: in run() - task 127b8e07-fff9-c964-7471-0000000019cd 44071 1727204694.06156: variable 'ansible_search_path' from source: unknown 44071 1727204694.06165: variable 'ansible_search_path' from source: unknown 44071 1727204694.06216: calling self._execute() 44071 1727204694.06333: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204694.06347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204694.06362: variable 'omit' from source: magic vars 44071 1727204694.06817: variable 'ansible_distribution_major_version' from source: facts 44071 1727204694.06845: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204694.06988: variable 'network_provider' from source: set_fact 44071 1727204694.07055: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204694.07058: when evaluation is False, skipping this task 44071 1727204694.07061: _execute() done 44071 1727204694.07064: dumping result to json 44071 1727204694.07067: done dumping result, returning 44071 1727204694.07071: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-c964-7471-0000000019cd] 44071 1727204694.07073: sending task result for task 127b8e07-fff9-c964-7471-0000000019cd skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204694.07444: no more pending results, returning what we have 44071 1727204694.07451: results queue empty 44071 1727204694.07452: checking for any_errors_fatal 44071 1727204694.07470: done checking for any_errors_fatal 44071 1727204694.07471: checking for max_fail_percentage 44071 1727204694.07474: done checking for max_fail_percentage 44071 1727204694.07475: checking to see if all hosts have failed and the running result is not ok 44071 1727204694.07476: done checking to see if all hosts have failed 44071 1727204694.07477: getting the remaining hosts for this loop 44071 1727204694.07478: done getting the remaining hosts for this loop 44071 1727204694.07487: getting the next task for host managed-node2 44071 1727204694.07500: done getting next task for host managed-node2 44071 1727204694.07505: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204694.07512: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204694.07549: getting variables 44071 1727204694.07552: in VariableManager get_vars() 44071 1727204694.07808: Calling all_inventory to load vars for managed-node2 44071 1727204694.07814: Calling groups_inventory to load vars for managed-node2 44071 1727204694.07817: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204694.07828: Calling all_plugins_play to load vars for managed-node2 44071 1727204694.07832: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204694.07838: Calling groups_plugins_play to load vars for managed-node2 44071 1727204694.08386: done sending task result for task 127b8e07-fff9-c964-7471-0000000019cd 44071 1727204694.08391: WORKER PROCESS EXITING 44071 1727204694.10679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204694.15197: done with get_vars() 44071 1727204694.15249: done getting variables 44071 1727204694.15623: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:04:54 -0400 (0:00:00.110) 0:01:46.473 ***** 44071 1727204694.15669: entering _queue_task() for managed-node2/copy 44071 1727204694.16498: worker is 1 (out of 1 available) 44071 1727204694.16514: exiting _queue_task() for managed-node2/copy 44071 1727204694.16531: done queuing things up, now waiting for results queue to drain 44071 1727204694.16535: waiting for pending results... 
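The "Enable network service" result above is rendered as censored because the task sets no_log: true, so even its skip output is replaced by the standard hidden-result message. A minimal illustration of that effect, using a throwaway debug task as a placeholder body (not the role's own task):

# Sketch of the no_log behaviour seen above: with no_log: true, Ansible
# replaces the task's result (even a skip) with the "censored" message.
- name: Example of a result hidden by no_log
  ansible.builtin.debug:
    msg: output that would be hidden
  no_log: true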
44071 1727204694.17108: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204694.17383: in run() - task 127b8e07-fff9-c964-7471-0000000019ce 44071 1727204694.17412: variable 'ansible_search_path' from source: unknown 44071 1727204694.17431: variable 'ansible_search_path' from source: unknown 44071 1727204694.17484: calling self._execute() 44071 1727204694.17984: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204694.17988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204694.17991: variable 'omit' from source: magic vars 44071 1727204694.18618: variable 'ansible_distribution_major_version' from source: facts 44071 1727204694.18651: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204694.18837: variable 'network_provider' from source: set_fact 44071 1727204694.18842: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204694.18848: when evaluation is False, skipping this task 44071 1727204694.18851: _execute() done 44071 1727204694.18856: dumping result to json 44071 1727204694.18858: done dumping result, returning 44071 1727204694.18874: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-c964-7471-0000000019ce] 44071 1727204694.18877: sending task result for task 127b8e07-fff9-c964-7471-0000000019ce skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 44071 1727204694.19043: no more pending results, returning what we have 44071 1727204694.19048: results queue empty 44071 1727204694.19049: checking for any_errors_fatal 44071 1727204694.19056: done checking for any_errors_fatal 44071 1727204694.19056: checking for max_fail_percentage 44071 1727204694.19058: done checking for max_fail_percentage 44071 1727204694.19059: checking to see if all hosts have failed and the running result is not ok 44071 1727204694.19059: done checking to see if all hosts have failed 44071 1727204694.19060: getting the remaining hosts for this loop 44071 1727204694.19062: done getting the remaining hosts for this loop 44071 1727204694.19068: getting the next task for host managed-node2 44071 1727204694.19078: done getting next task for host managed-node2 44071 1727204694.19083: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204694.19089: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204694.19105: done sending task result for task 127b8e07-fff9-c964-7471-0000000019ce 44071 1727204694.19109: WORKER PROCESS EXITING 44071 1727204694.19127: getting variables 44071 1727204694.19129: in VariableManager get_vars() 44071 1727204694.19176: Calling all_inventory to load vars for managed-node2 44071 1727204694.19179: Calling groups_inventory to load vars for managed-node2 44071 1727204694.19181: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204694.19203: Calling all_plugins_play to load vars for managed-node2 44071 1727204694.19231: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204694.19235: Calling groups_plugins_play to load vars for managed-node2 44071 1727204694.20296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204694.23381: done with get_vars() 44071 1727204694.23426: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:04:54 -0400 (0:00:00.078) 0:01:46.551 ***** 44071 1727204694.23553: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204694.24582: worker is 1 (out of 1 available) 44071 1727204694.24598: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204694.24615: done queuing things up, now waiting for results queue to drain 44071 1727204694.24617: waiting for pending results... 
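The upcoming "Configure networking connection profiles" task resolves network_connections from include params and an interface play variable, as the variable-source lines above and below show. A hedged sketch of the calling pattern this implies; only the role name, the connection name "statebr" and state "up" (seen later in the module arguments) are grounded in the trace, while the include_role form and the variable layout are assumptions.

# Hedged sketch of the caller, not taken from the actual test playbook.
- hosts: managed-node2
  vars:
    interface: statebr          # play var, value assumed from the connection name below
  tasks:
    - name: Bring the connection profile up via the network role
      vars:
        network_connections:    # passed as include params, as reported in the trace
          - name: "{{ interface }}"
            state: up
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.network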
44071 1727204694.25109: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204694.25406: in run() - task 127b8e07-fff9-c964-7471-0000000019cf 44071 1727204694.25416: variable 'ansible_search_path' from source: unknown 44071 1727204694.25419: variable 'ansible_search_path' from source: unknown 44071 1727204694.25448: calling self._execute() 44071 1727204694.25591: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204694.25606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204694.25636: variable 'omit' from source: magic vars 44071 1727204694.26181: variable 'ansible_distribution_major_version' from source: facts 44071 1727204694.26187: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204694.26190: variable 'omit' from source: magic vars 44071 1727204694.26244: variable 'omit' from source: magic vars 44071 1727204694.26455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204694.40793: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204694.40798: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204694.40837: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204694.40869: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204694.40913: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204694.41048: variable 'network_provider' from source: set_fact 44071 1727204694.41212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204694.41217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204694.41273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204694.41277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204694.41291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204694.41392: variable 'omit' from source: magic vars 44071 1727204694.41517: variable 'omit' from source: magic vars 44071 1727204694.41639: variable 'network_connections' from source: include params 44071 1727204694.41650: variable 'interface' from source: play vars 44071 1727204694.41723: variable 'interface' from source: play vars 44071 1727204694.41870: variable 'omit' from source: magic vars 44071 1727204694.41882: variable '__lsr_ansible_managed' from source: task vars 44071 1727204694.41948: variable '__lsr_ansible_managed' from source: 
task vars 44071 1727204694.42159: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 44071 1727204694.42472: Loaded config def from plugin (lookup/template) 44071 1727204694.42476: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 44071 1727204694.42478: File lookup term: get_ansible_managed.j2 44071 1727204694.42481: variable 'ansible_search_path' from source: unknown 44071 1727204694.42484: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 44071 1727204694.42487: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 44071 1727204694.42490: variable 'ansible_search_path' from source: unknown 44071 1727204694.53576: variable 'ansible_managed' from source: unknown 44071 1727204694.53652: variable 'omit' from source: magic vars 44071 1727204694.53690: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204694.53713: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204694.53871: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204694.53874: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204694.53877: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204694.53879: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204694.53882: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204694.53884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204694.53887: Set connection var ansible_connection to ssh 44071 1727204694.53889: Set connection var ansible_timeout to 10 44071 1727204694.53891: Set connection var ansible_pipelining to False 44071 1727204694.53895: Set connection var ansible_shell_type to sh 44071 1727204694.53897: Set connection var ansible_shell_executable to /bin/sh 44071 1727204694.53901: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204694.53926: variable 'ansible_shell_executable' from source: unknown 44071 1727204694.53929: variable 'ansible_connection' from source: unknown 44071 1727204694.53931: variable 'ansible_module_compression' 
from source: unknown 44071 1727204694.53936: variable 'ansible_shell_type' from source: unknown 44071 1727204694.53939: variable 'ansible_shell_executable' from source: unknown 44071 1727204694.53942: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204694.53944: variable 'ansible_pipelining' from source: unknown 44071 1727204694.53946: variable 'ansible_timeout' from source: unknown 44071 1727204694.53951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204694.54132: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204694.54146: variable 'omit' from source: magic vars 44071 1727204694.54149: starting attempt loop 44071 1727204694.54151: running the handler 44071 1727204694.54154: _low_level_execute_command(): starting 44071 1727204694.54156: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204694.54874: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204694.54885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204694.54888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204694.54896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204694.54899: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204694.54902: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204694.54904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204694.54906: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204694.54909: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204694.54911: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44071 1727204694.54913: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204694.54915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204694.54996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204694.55004: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204694.55008: stderr chunk (state=3): >>>debug2: match found <<< 44071 1727204694.55010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204694.55012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204694.55026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204694.55042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204694.55430: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204694.57250: stdout chunk (state=3): >>>/root <<< 44071 1727204694.57288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204694.57364: stderr chunk (state=3): >>><<< 
44071 1727204694.57370: stdout chunk (state=3): >>><<< 44071 1727204694.57394: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204694.57407: _low_level_execute_command(): starting 44071 1727204694.57415: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204694.5739486-50217-131830658853469 `" && echo ansible-tmp-1727204694.5739486-50217-131830658853469="` echo /root/.ansible/tmp/ansible-tmp-1727204694.5739486-50217-131830658853469 `" ) && sleep 0' 44071 1727204694.58715: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204694.58720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204694.58773: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204694.58777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204694.58927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204694.58930: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204694.58988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204694.59088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204694.61110: stdout chunk (state=3): >>>ansible-tmp-1727204694.5739486-50217-131830658853469=/root/.ansible/tmp/ansible-tmp-1727204694.5739486-50217-131830658853469 <<< 44071 1727204694.61209: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 44071 1727204694.61484: stderr chunk (state=3): >>><<< 44071 1727204694.61487: stdout chunk (state=3): >>><<< 44071 1727204694.61508: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204694.5739486-50217-131830658853469=/root/.ansible/tmp/ansible-tmp-1727204694.5739486-50217-131830658853469 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204694.61559: variable 'ansible_module_compression' from source: unknown 44071 1727204694.61623: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 44071 1727204694.61631: variable 'ansible_facts' from source: unknown 44071 1727204694.61924: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204694.5739486-50217-131830658853469/AnsiballZ_network_connections.py 44071 1727204694.62440: Sending initial data 44071 1727204694.62444: Sent initial data (168 bytes) 44071 1727204694.63633: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204694.63661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204694.63975: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204694.63984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204694.64221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 
1727204694.65884: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204694.66095: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204694.66164: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpz1mu22j1 /root/.ansible/tmp/ansible-tmp-1727204694.5739486-50217-131830658853469/AnsiballZ_network_connections.py <<< 44071 1727204694.66187: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204694.5739486-50217-131830658853469/AnsiballZ_network_connections.py" <<< 44071 1727204694.66232: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpz1mu22j1" to remote "/root/.ansible/tmp/ansible-tmp-1727204694.5739486-50217-131830658853469/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204694.5739486-50217-131830658853469/AnsiballZ_network_connections.py" <<< 44071 1727204694.68175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204694.68179: stderr chunk (state=3): >>><<< 44071 1727204694.68182: stdout chunk (state=3): >>><<< 44071 1727204694.68203: done transferring module to remote 44071 1727204694.68228: _low_level_execute_command(): starting 44071 1727204694.68449: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204694.5739486-50217-131830658853469/ /root/.ansible/tmp/ansible-tmp-1727204694.5739486-50217-131830658853469/AnsiballZ_network_connections.py && sleep 0' 44071 1727204694.69204: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204694.69212: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204694.69245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204694.69249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204694.69277: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204694.69285: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204694.69353: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204694.69548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204694.69604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204694.69985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204694.72018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204694.72023: stderr chunk (state=3): >>><<< 44071 1727204694.72025: stdout chunk (state=3): >>><<< 44071 1727204694.72028: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204694.72031: _low_level_execute_command(): starting 44071 1727204694.72034: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204694.5739486-50217-131830658853469/AnsiballZ_network_connections.py && sleep 0' 44071 1727204694.73168: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204694.73495: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204694.73537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204694.73656: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 44071 1727204695.01302: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 2461fed0-dcf1-466d-b59f-3f5d810ecefa skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 44071 1727204695.03222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204695.03328: stderr chunk (state=3): >>>Shared connection to 10.31.47.73 closed. <<< 44071 1727204695.03334: stdout chunk (state=3): >>><<< 44071 1727204695.03449: stderr chunk (state=3): >>><<< 44071 1727204695.03453: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 2461fed0-dcf1-466d-b59f-3f5d810ecefa skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
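The module_args echoed in the result just above fully describe the call that was made. Reconstructed as a task sketch: provider, connections and __header are copied from the trace, while the remaining logged parameters (ignore_errors, force_state_change, __debug_flags) stayed at their defaults and are omitted. This is a sketch of the invocation, not the role's actual task file at main.yml:159, which builds these values from role variables.

# Invocation reconstructed from the module_args in the result above.
- name: Configure networking connection profiles
  fedora.linux_system_roles.network_connections:
    provider: nm
    connections:
      - name: statebr
        state: up
    __header: |
      #
      # Ansible managed
      #
      # system_role:network

As the module's stderr notes ("skipped because already active"), the statebr profile was already up, so the task reports ok with changed: false.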
44071 1727204695.03456: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204694.5739486-50217-131830658853469/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204695.03459: _low_level_execute_command(): starting 44071 1727204695.03462: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204694.5739486-50217-131830658853469/ > /dev/null 2>&1 && sleep 0' 44071 1727204695.04114: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204695.04122: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204695.04133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204695.04153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204695.04167: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204695.04176: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204695.04186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204695.04200: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204695.04231: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204695.04234: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44071 1727204695.04242: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204695.04245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204695.04247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204695.04249: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204695.04338: stderr chunk (state=3): >>>debug2: match found <<< 44071 1727204695.04341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204695.04344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204695.04362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204695.04380: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204695.04491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204695.06604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204695.06608: stdout chunk 
(state=3): >>><<< 44071 1727204695.06773: stderr chunk (state=3): >>><<< 44071 1727204695.06782: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204695.06791: handler run complete 44071 1727204695.06793: attempt loop complete, returning result 44071 1727204695.06795: _execute() done 44071 1727204695.06796: dumping result to json 44071 1727204695.06798: done dumping result, returning 44071 1727204695.06800: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-c964-7471-0000000019cf] 44071 1727204695.06802: sending task result for task 127b8e07-fff9-c964-7471-0000000019cf 44071 1727204695.06884: done sending task result for task 127b8e07-fff9-c964-7471-0000000019cf 44071 1727204695.06888: WORKER PROCESS EXITING ok: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 2461fed0-dcf1-466d-b59f-3f5d810ecefa skipped because already active 44071 1727204695.07097: no more pending results, returning what we have 44071 1727204695.07102: results queue empty 44071 1727204695.07103: checking for any_errors_fatal 44071 1727204695.07108: done checking for any_errors_fatal 44071 1727204695.07109: checking for max_fail_percentage 44071 1727204695.07111: done checking for max_fail_percentage 44071 1727204695.07112: checking to see if all hosts have failed and the running result is not ok 44071 1727204695.07113: done checking to see if all hosts have failed 44071 1727204695.07114: getting the remaining hosts for this loop 44071 1727204695.07115: done getting the remaining hosts for this loop 44071 1727204695.07120: getting the next task for host managed-node2 44071 1727204695.07128: done getting next task for host managed-node2 44071 1727204695.07135: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204695.07140: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204695.07156: getting variables 44071 1727204695.07157: in VariableManager get_vars() 44071 1727204695.07328: Calling all_inventory to load vars for managed-node2 44071 1727204695.07332: Calling groups_inventory to load vars for managed-node2 44071 1727204695.07337: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204695.07349: Calling all_plugins_play to load vars for managed-node2 44071 1727204695.07353: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204695.07356: Calling groups_plugins_play to load vars for managed-node2 44071 1727204695.17938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204695.20247: done with get_vars() 44071 1727204695.20292: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:04:55 -0400 (0:00:00.968) 0:01:47.520 ***** 44071 1727204695.20389: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204695.20837: worker is 1 (out of 1 available) 44071 1727204695.20854: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204695.21083: done queuing things up, now waiting for results queue to drain 44071 1727204695.21086: waiting for pending results... 
44071 1727204695.21237: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204695.21432: in run() - task 127b8e07-fff9-c964-7471-0000000019d0 44071 1727204695.21531: variable 'ansible_search_path' from source: unknown 44071 1727204695.21536: variable 'ansible_search_path' from source: unknown 44071 1727204695.21541: calling self._execute() 44071 1727204695.21644: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204695.21660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204695.21679: variable 'omit' from source: magic vars 44071 1727204695.22144: variable 'ansible_distribution_major_version' from source: facts 44071 1727204695.22164: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204695.22322: variable 'network_state' from source: role '' defaults 44071 1727204695.22338: Evaluated conditional (network_state != {}): False 44071 1727204695.22400: when evaluation is False, skipping this task 44071 1727204695.22404: _execute() done 44071 1727204695.22406: dumping result to json 44071 1727204695.22409: done dumping result, returning 44071 1727204695.22413: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-c964-7471-0000000019d0] 44071 1727204695.22417: sending task result for task 127b8e07-fff9-c964-7471-0000000019d0 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204695.22577: no more pending results, returning what we have 44071 1727204695.22582: results queue empty 44071 1727204695.22584: checking for any_errors_fatal 44071 1727204695.22601: done checking for any_errors_fatal 44071 1727204695.22602: checking for max_fail_percentage 44071 1727204695.22603: done checking for max_fail_percentage 44071 1727204695.22605: checking to see if all hosts have failed and the running result is not ok 44071 1727204695.22605: done checking to see if all hosts have failed 44071 1727204695.22606: getting the remaining hosts for this loop 44071 1727204695.22608: done getting the remaining hosts for this loop 44071 1727204695.22613: getting the next task for host managed-node2 44071 1727204695.22623: done getting next task for host managed-node2 44071 1727204695.22627: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204695.22634: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204695.22668: getting variables 44071 1727204695.22670: in VariableManager get_vars() 44071 1727204695.22722: Calling all_inventory to load vars for managed-node2 44071 1727204695.22726: Calling groups_inventory to load vars for managed-node2 44071 1727204695.22729: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204695.22744: Calling all_plugins_play to load vars for managed-node2 44071 1727204695.22748: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204695.22751: Calling groups_plugins_play to load vars for managed-node2 44071 1727204695.23585: done sending task result for task 127b8e07-fff9-c964-7471-0000000019d0 44071 1727204695.23590: WORKER PROCESS EXITING 44071 1727204695.25246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204695.27685: done with get_vars() 44071 1727204695.27726: done getting variables 44071 1727204695.27804: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:04:55 -0400 (0:00:00.074) 0:01:47.594 ***** 44071 1727204695.27847: entering _queue_task() for managed-node2/debug 44071 1727204695.28298: worker is 1 (out of 1 available) 44071 1727204695.28316: exiting _queue_task() for managed-node2/debug 44071 1727204695.28331: done queuing things up, now waiting for results queue to drain 44071 1727204695.28333: waiting for pending results... 
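The skip recorded above comes from the task's `when: network_state != {}` guard: the log shows `network_state` resolving from the role defaults to an empty dict, so the bare Jinja2 expression evaluates to False and the TaskExecutor returns a skip result before any module is pushed to the host. A minimal sketch of that evaluation, assuming only the values visible in the log (the variable name, its empty-dict default, and the expression text), is:

```python
# Illustrative sketch, not Ansible internals: evaluate the same bare Jinja2
# conditional the log reports, with the role-default value network_state = {}.
from jinja2 import Environment

evaluate = Environment().compile_expression("network_state != {}")

print(evaluate(network_state={}))                   # False -> task is skipped
print(evaluate(network_state={"interfaces": []}))   # True (hypothetical non-empty value)
```

Ansible's real conditional handling goes through its own templar and variable-precedence rules, but for an empty `network_state` the outcome is the same False shown above.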
44071 1727204695.28638: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204695.28824: in run() - task 127b8e07-fff9-c964-7471-0000000019d1 44071 1727204695.28845: variable 'ansible_search_path' from source: unknown 44071 1727204695.28850: variable 'ansible_search_path' from source: unknown 44071 1727204695.28893: calling self._execute() 44071 1727204695.29012: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204695.29019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204695.29040: variable 'omit' from source: magic vars 44071 1727204695.29573: variable 'ansible_distribution_major_version' from source: facts 44071 1727204695.29578: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204695.29581: variable 'omit' from source: magic vars 44071 1727204695.29642: variable 'omit' from source: magic vars 44071 1727204695.29689: variable 'omit' from source: magic vars 44071 1727204695.29743: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204695.29809: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204695.29820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204695.29844: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204695.29858: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204695.29892: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204695.29896: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204695.29917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204695.30038: Set connection var ansible_connection to ssh 44071 1727204695.30044: Set connection var ansible_timeout to 10 44071 1727204695.30051: Set connection var ansible_pipelining to False 44071 1727204695.30057: Set connection var ansible_shell_type to sh 44071 1727204695.30064: Set connection var ansible_shell_executable to /bin/sh 44071 1727204695.30243: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204695.30247: variable 'ansible_shell_executable' from source: unknown 44071 1727204695.30250: variable 'ansible_connection' from source: unknown 44071 1727204695.30253: variable 'ansible_module_compression' from source: unknown 44071 1727204695.30257: variable 'ansible_shell_type' from source: unknown 44071 1727204695.30260: variable 'ansible_shell_executable' from source: unknown 44071 1727204695.30263: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204695.30268: variable 'ansible_pipelining' from source: unknown 44071 1727204695.30272: variable 'ansible_timeout' from source: unknown 44071 1727204695.30274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204695.30312: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 
1727204695.30324: variable 'omit' from source: magic vars 44071 1727204695.30330: starting attempt loop 44071 1727204695.30340: running the handler 44071 1727204695.30504: variable '__network_connections_result' from source: set_fact 44071 1727204695.30577: handler run complete 44071 1727204695.30598: attempt loop complete, returning result 44071 1727204695.30607: _execute() done 44071 1727204695.30611: dumping result to json 44071 1727204695.30614: done dumping result, returning 44071 1727204695.30623: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-c964-7471-0000000019d1] 44071 1727204695.30628: sending task result for task 127b8e07-fff9-c964-7471-0000000019d1 44071 1727204695.30914: done sending task result for task 127b8e07-fff9-c964-7471-0000000019d1 44071 1727204695.30919: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 2461fed0-dcf1-466d-b59f-3f5d810ecefa skipped because already active" ] } 44071 1727204695.30988: no more pending results, returning what we have 44071 1727204695.30992: results queue empty 44071 1727204695.30992: checking for any_errors_fatal 44071 1727204695.30997: done checking for any_errors_fatal 44071 1727204695.30998: checking for max_fail_percentage 44071 1727204695.31000: done checking for max_fail_percentage 44071 1727204695.31001: checking to see if all hosts have failed and the running result is not ok 44071 1727204695.31003: done checking to see if all hosts have failed 44071 1727204695.31005: getting the remaining hosts for this loop 44071 1727204695.31007: done getting the remaining hosts for this loop 44071 1727204695.31010: getting the next task for host managed-node2 44071 1727204695.31018: done getting next task for host managed-node2 44071 1727204695.31022: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204695.31028: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204695.31040: getting variables 44071 1727204695.31041: in VariableManager get_vars() 44071 1727204695.31085: Calling all_inventory to load vars for managed-node2 44071 1727204695.31088: Calling groups_inventory to load vars for managed-node2 44071 1727204695.31091: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204695.31101: Calling all_plugins_play to load vars for managed-node2 44071 1727204695.31104: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204695.31108: Calling groups_plugins_play to load vars for managed-node2 44071 1727204695.32989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204695.35233: done with get_vars() 44071 1727204695.35276: done getting variables 44071 1727204695.35343: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:04:55 -0400 (0:00:00.075) 0:01:47.670 ***** 44071 1727204695.35392: entering _queue_task() for managed-node2/debug 44071 1727204695.35794: worker is 1 (out of 1 available) 44071 1727204695.35808: exiting _queue_task() for managed-node2/debug 44071 1727204695.35823: done queuing things up, now waiting for results queue to drain 44071 1727204695.35824: waiting for pending results... 44071 1727204695.36163: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204695.36401: in run() - task 127b8e07-fff9-c964-7471-0000000019d2 44071 1727204695.36406: variable 'ansible_search_path' from source: unknown 44071 1727204695.36408: variable 'ansible_search_path' from source: unknown 44071 1727204695.36439: calling self._execute() 44071 1727204695.36562: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204695.36581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204695.36599: variable 'omit' from source: magic vars 44071 1727204695.37273: variable 'ansible_distribution_major_version' from source: facts 44071 1727204695.37279: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204695.37282: variable 'omit' from source: magic vars 44071 1727204695.37284: variable 'omit' from source: magic vars 44071 1727204695.37286: variable 'omit' from source: magic vars 44071 1727204695.37288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204695.37311: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204695.37338: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204695.37360: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204695.37379: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204695.37418: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204695.37425: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204695.37432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204695.37541: Set connection var ansible_connection to ssh 44071 1727204695.37555: Set connection var ansible_timeout to 10 44071 1727204695.37569: Set connection var ansible_pipelining to False 44071 1727204695.37580: Set connection var ansible_shell_type to sh 44071 1727204695.37590: Set connection var ansible_shell_executable to /bin/sh 44071 1727204695.37602: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204695.37642: variable 'ansible_shell_executable' from source: unknown 44071 1727204695.37651: variable 'ansible_connection' from source: unknown 44071 1727204695.37659: variable 'ansible_module_compression' from source: unknown 44071 1727204695.37668: variable 'ansible_shell_type' from source: unknown 44071 1727204695.37675: variable 'ansible_shell_executable' from source: unknown 44071 1727204695.37681: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204695.37689: variable 'ansible_pipelining' from source: unknown 44071 1727204695.37696: variable 'ansible_timeout' from source: unknown 44071 1727204695.37705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204695.37890: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204695.37910: variable 'omit' from source: magic vars 44071 1727204695.37921: starting attempt loop 44071 1727204695.37929: running the handler 44071 1727204695.38023: variable '__network_connections_result' from source: set_fact 44071 1727204695.38138: variable '__network_connections_result' from source: set_fact 44071 1727204695.38298: handler run complete 44071 1727204695.38354: attempt loop complete, returning result 44071 1727204695.38362: _execute() done 44071 1727204695.38382: dumping result to json 44071 1727204695.38385: done dumping result, returning 44071 1727204695.38471: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-c964-7471-0000000019d2] 44071 1727204695.38475: sending task result for task 127b8e07-fff9-c964-7471-0000000019d2 44071 1727204695.38773: done sending task result for task 127b8e07-fff9-c964-7471-0000000019d2 44071 1727204695.38777: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 2461fed0-dcf1-466d-b59f-3f5d810ecefa skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 2461fed0-dcf1-466d-b59f-3f5d810ecefa skipped because already active" ] } } 44071 1727204695.38876: no more pending results, returning what we have 44071 1727204695.38880: results queue 
empty 44071 1727204695.38881: checking for any_errors_fatal 44071 1727204695.38887: done checking for any_errors_fatal 44071 1727204695.38888: checking for max_fail_percentage 44071 1727204695.38890: done checking for max_fail_percentage 44071 1727204695.38892: checking to see if all hosts have failed and the running result is not ok 44071 1727204695.38892: done checking to see if all hosts have failed 44071 1727204695.38893: getting the remaining hosts for this loop 44071 1727204695.38895: done getting the remaining hosts for this loop 44071 1727204695.38899: getting the next task for host managed-node2 44071 1727204695.38908: done getting next task for host managed-node2 44071 1727204695.38913: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204695.38918: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204695.38932: getting variables 44071 1727204695.38934: in VariableManager get_vars() 44071 1727204695.39272: Calling all_inventory to load vars for managed-node2 44071 1727204695.39277: Calling groups_inventory to load vars for managed-node2 44071 1727204695.39287: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204695.39297: Calling all_plugins_play to load vars for managed-node2 44071 1727204695.39301: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204695.39304: Calling groups_plugins_play to load vars for managed-node2 44071 1727204695.41255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204695.43651: done with get_vars() 44071 1727204695.43706: done getting variables 44071 1727204695.43779: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:04:55 -0400 (0:00:00.084) 0:01:47.754 ***** 44071 1727204695.43824: entering _queue_task() for managed-node2/debug 44071 1727204695.44493: worker is 1 (out of 1 available) 44071 1727204695.44506: exiting _queue_task() for managed-node2/debug 44071 1727204695.44519: done queuing things up, now waiting for results queue to drain 44071 1727204695.44521: waiting for pending results... 44071 1727204695.44696: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204695.44899: in run() - task 127b8e07-fff9-c964-7471-0000000019d3 44071 1727204695.44915: variable 'ansible_search_path' from source: unknown 44071 1727204695.45083: variable 'ansible_search_path' from source: unknown 44071 1727204695.45088: calling self._execute() 44071 1727204695.45095: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204695.45103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204695.45114: variable 'omit' from source: magic vars 44071 1727204695.45604: variable 'ansible_distribution_major_version' from source: facts 44071 1727204695.45617: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204695.45781: variable 'network_state' from source: role '' defaults 44071 1727204695.45794: Evaluated conditional (network_state != {}): False 44071 1727204695.45797: when evaluation is False, skipping this task 44071 1727204695.45800: _execute() done 44071 1727204695.45803: dumping result to json 44071 1727204695.45806: done dumping result, returning 44071 1727204695.45817: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-c964-7471-0000000019d3] 44071 1727204695.45821: sending task result for task 127b8e07-fff9-c964-7471-0000000019d3 skipping: [managed-node2] => { "false_condition": "network_state != {}" } 44071 1727204695.46152: no more pending results, returning what we have 44071 1727204695.46156: results queue empty 44071 1727204695.46157: checking for any_errors_fatal 44071 1727204695.46173: done checking 
for any_errors_fatal 44071 1727204695.46174: checking for max_fail_percentage 44071 1727204695.46177: done checking for max_fail_percentage 44071 1727204695.46178: checking to see if all hosts have failed and the running result is not ok 44071 1727204695.46179: done checking to see if all hosts have failed 44071 1727204695.46180: getting the remaining hosts for this loop 44071 1727204695.46181: done getting the remaining hosts for this loop 44071 1727204695.46187: getting the next task for host managed-node2 44071 1727204695.46197: done getting next task for host managed-node2 44071 1727204695.46202: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204695.46208: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204695.46237: getting variables 44071 1727204695.46240: in VariableManager get_vars() 44071 1727204695.46394: Calling all_inventory to load vars for managed-node2 44071 1727204695.46398: Calling groups_inventory to load vars for managed-node2 44071 1727204695.46400: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204695.46412: Calling all_plugins_play to load vars for managed-node2 44071 1727204695.46416: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204695.46420: Calling groups_plugins_play to load vars for managed-node2 44071 1727204695.47084: done sending task result for task 127b8e07-fff9-c964-7471-0000000019d3 44071 1727204695.47090: WORKER PROCESS EXITING 44071 1727204695.48475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204695.50746: done with get_vars() 44071 1727204695.50793: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:04:55 -0400 (0:00:00.070) 0:01:47.825 ***** 44071 1727204695.50914: entering _queue_task() for managed-node2/ping 44071 1727204695.51331: worker is 1 (out of 1 available) 44071 1727204695.51345: exiting _queue_task() for managed-node2/ping 44071 1727204695.51359: done queuing things up, now waiting for results queue to drain 44071 1727204695.51360: waiting for pending results... 
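The two `debug` results above expose the same fact twice: first only `__network_connections_result.stderr_lines`, then the whole `__network_connections_result`, whose `stderr_lines` entry is just the raw `stderr` string split on newlines. A small sketch of that relationship, rebuilt from the values printed in the log (the dict layout here is illustrative, not Ansible's result-processing code):

```python
# Sketch only: the module result shown above, rebuilt to show how stderr_lines
# relates to stderr and what a caller could check. Values are copied from the log.
result = {
    "changed": False,
    "failed": False,
    "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection "
              "statebr, 2461fed0-dcf1-466d-b59f-3f5d810ecefa skipped because already active\n",
}
result["stderr_lines"] = result["stderr"].splitlines()

# The activation was a no-op: the 'statebr' connection was already active.
assert result["changed"] is False and result["failed"] is False
print(result["stderr_lines"])
```

In the role these are plain `debug` tasks (tasks/main.yml:177 and :181), so they change nothing on the host; they only surface the provider's stderr messages, and the follow-on `Show debug messages for the network_state` task is skipped by the same `network_state != {}` guard as before.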
44071 1727204695.51792: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204695.51988: in run() - task 127b8e07-fff9-c964-7471-0000000019d4 44071 1727204695.52168: variable 'ansible_search_path' from source: unknown 44071 1727204695.52175: variable 'ansible_search_path' from source: unknown 44071 1727204695.52179: calling self._execute() 44071 1727204695.52224: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204695.52238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204695.52253: variable 'omit' from source: magic vars 44071 1727204695.52753: variable 'ansible_distribution_major_version' from source: facts 44071 1727204695.52782: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204695.52871: variable 'omit' from source: magic vars 44071 1727204695.52892: variable 'omit' from source: magic vars 44071 1727204695.52938: variable 'omit' from source: magic vars 44071 1727204695.52999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204695.53049: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204695.53080: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204695.53120: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204695.53140: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204695.53182: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204695.53192: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204695.53271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204695.53333: Set connection var ansible_connection to ssh 44071 1727204695.53348: Set connection var ansible_timeout to 10 44071 1727204695.53388: Set connection var ansible_pipelining to False 44071 1727204695.53391: Set connection var ansible_shell_type to sh 44071 1727204695.53394: Set connection var ansible_shell_executable to /bin/sh 44071 1727204695.53397: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204695.53430: variable 'ansible_shell_executable' from source: unknown 44071 1727204695.53439: variable 'ansible_connection' from source: unknown 44071 1727204695.53470: variable 'ansible_module_compression' from source: unknown 44071 1727204695.53473: variable 'ansible_shell_type' from source: unknown 44071 1727204695.53476: variable 'ansible_shell_executable' from source: unknown 44071 1727204695.53478: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204695.53481: variable 'ansible_pipelining' from source: unknown 44071 1727204695.53483: variable 'ansible_timeout' from source: unknown 44071 1727204695.53485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204695.53734: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204695.53823: variable 'omit' from source: magic vars 44071 
1727204695.53826: starting attempt loop 44071 1727204695.53829: running the handler 44071 1727204695.53832: _low_level_execute_command(): starting 44071 1727204695.53834: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204695.54606: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204695.54712: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204695.54742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204695.54757: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204695.54784: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204695.54902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204695.56705: stdout chunk (state=3): >>>/root <<< 44071 1727204695.56894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204695.56949: stdout chunk (state=3): >>><<< 44071 1727204695.56953: stderr chunk (state=3): >>><<< 44071 1727204695.56978: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204695.57011: _low_level_execute_command(): starting 44071 1727204695.57017: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204695.5698535-50301-280315053215018 `" && echo 
ansible-tmp-1727204695.5698535-50301-280315053215018="` echo /root/.ansible/tmp/ansible-tmp-1727204695.5698535-50301-280315053215018 `" ) && sleep 0' 44071 1727204695.58158: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204695.58182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204695.58241: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204695.58319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204695.60373: stdout chunk (state=3): >>>ansible-tmp-1727204695.5698535-50301-280315053215018=/root/.ansible/tmp/ansible-tmp-1727204695.5698535-50301-280315053215018 <<< 44071 1727204695.60599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204695.60603: stdout chunk (state=3): >>><<< 44071 1727204695.60606: stderr chunk (state=3): >>><<< 44071 1727204695.60627: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204695.5698535-50301-280315053215018=/root/.ansible/tmp/ansible-tmp-1727204695.5698535-50301-280315053215018 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204695.60774: variable 'ansible_module_compression' from source: unknown 44071 1727204695.60777: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 44071 1727204695.60784: variable 'ansible_facts' from source: 
unknown 44071 1727204695.60855: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204695.5698535-50301-280315053215018/AnsiballZ_ping.py 44071 1727204695.61098: Sending initial data 44071 1727204695.61101: Sent initial data (153 bytes) 44071 1727204695.61891: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204695.61997: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204695.62033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204695.62055: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204695.62084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204695.62210: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204695.63896: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204695.63997: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204695.64091: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpjrfk7p7l /root/.ansible/tmp/ansible-tmp-1727204695.5698535-50301-280315053215018/AnsiballZ_ping.py <<< 44071 1727204695.64094: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204695.5698535-50301-280315053215018/AnsiballZ_ping.py" <<< 44071 1727204695.64146: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpjrfk7p7l" to remote "/root/.ansible/tmp/ansible-tmp-1727204695.5698535-50301-280315053215018/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204695.5698535-50301-280315053215018/AnsiballZ_ping.py" <<< 44071 1727204695.65233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204695.65237: stdout chunk (state=3): >>><<< 44071 1727204695.65240: stderr chunk (state=3): >>><<< 44071 1727204695.65242: done transferring module to remote 44071 1727204695.65245: _low_level_execute_command(): starting 44071 1727204695.65247: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204695.5698535-50301-280315053215018/ /root/.ansible/tmp/ansible-tmp-1727204695.5698535-50301-280315053215018/AnsiballZ_ping.py && sleep 0' 44071 1727204695.66021: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204695.66029: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204695.66032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204695.66055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204695.66113: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204695.66143: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204695.66172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204695.66301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204695.68352: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204695.68356: stdout chunk (state=3): >>><<< 44071 1727204695.68359: stderr chunk (state=3): >>><<< 44071 1727204695.68482: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 
10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204695.68491: _low_level_execute_command(): starting 44071 1727204695.68494: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204695.5698535-50301-280315053215018/AnsiballZ_ping.py && sleep 0' 44071 1727204695.69254: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204695.69324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204695.69359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204695.69391: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204695.69502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204695.86011: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 44071 1727204695.87114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204695.87132: stderr chunk (state=3): >>>Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204695.87242: stderr chunk (state=3): >>><<< 44071 1727204695.87259: stdout chunk (state=3): >>><<< 44071 1727204695.87289: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204695.87322: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204695.5698535-50301-280315053215018/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204695.87338: _low_level_execute_command(): starting 44071 1727204695.87349: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204695.5698535-50301-280315053215018/ > /dev/null 2>&1 && sleep 0' 44071 1727204695.88085: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204695.88102: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204695.88117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204695.88149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204695.88260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204695.88284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204695.88308: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204695.88323: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204695.88432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204695.90449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204695.90471: stdout chunk (state=3): >>><<< 44071 1727204695.90484: stderr chunk (state=3): >>><<< 44071 1727204695.90673: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204695.90677: handler run complete 44071 1727204695.90680: attempt loop complete, returning result 44071 1727204695.90682: _execute() done 44071 1727204695.90685: dumping result to json 44071 1727204695.90687: done dumping result, returning 44071 1727204695.90689: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-c964-7471-0000000019d4] 44071 1727204695.90691: sending task result for task 127b8e07-fff9-c964-7471-0000000019d4 44071 1727204695.90770: done sending task result for task 127b8e07-fff9-c964-7471-0000000019d4 44071 1727204695.90778: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 44071 1727204695.90873: no more pending results, returning what we have 44071 1727204695.90877: results queue empty 44071 1727204695.90878: checking for any_errors_fatal 44071 1727204695.90884: done checking for any_errors_fatal 44071 1727204695.90885: checking for max_fail_percentage 44071 1727204695.90887: done checking for max_fail_percentage 44071 1727204695.90889: checking to see if all hosts have failed and the running result is not ok 44071 1727204695.90890: done checking to see if all hosts have failed 44071 1727204695.90891: getting the remaining hosts for this loop 44071 1727204695.90892: done getting the remaining hosts for this loop 44071 1727204695.90897: getting the next task for host managed-node2 44071 1727204695.90909: done getting next task for host managed-node2 44071 1727204695.90912: ^ task is: TASK: meta (role_complete) 44071 
1727204695.90918: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204695.90932: getting variables 44071 1727204695.90934: in VariableManager get_vars() 44071 1727204695.91119: Calling all_inventory to load vars for managed-node2 44071 1727204695.91123: Calling groups_inventory to load vars for managed-node2 44071 1727204695.91125: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204695.91138: Calling all_plugins_play to load vars for managed-node2 44071 1727204695.91141: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204695.91144: Calling groups_plugins_play to load vars for managed-node2 44071 1727204695.94242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204696.00558: done with get_vars() 44071 1727204696.00612: done getting variables 44071 1727204696.00834: done queuing things up, now waiting for results queue to drain 44071 1727204696.00837: results queue empty 44071 1727204696.00838: checking for any_errors_fatal 44071 1727204696.00842: done checking for any_errors_fatal 44071 1727204696.00843: checking for max_fail_percentage 44071 1727204696.00844: done checking for max_fail_percentage 44071 1727204696.00845: checking to see if all hosts have failed and the running result is not ok 44071 1727204696.00846: done checking to see if all hosts have failed 44071 1727204696.00847: getting the remaining hosts for this loop 44071 1727204696.00962: done getting the remaining hosts for this loop 44071 1727204696.00967: getting the next task for host managed-node2 44071 1727204696.00978: done getting next task for host managed-node2 44071 1727204696.00981: ^ task is: TASK: Include network role 44071 1727204696.00984: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204696.00987: getting variables 44071 1727204696.00988: in VariableManager get_vars() 44071 1727204696.01008: Calling all_inventory to load vars for managed-node2 44071 1727204696.01011: Calling groups_inventory to load vars for managed-node2 44071 1727204696.01014: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204696.01020: Calling all_plugins_play to load vars for managed-node2 44071 1727204696.01023: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204696.01026: Calling groups_plugins_play to load vars for managed-node2 44071 1727204696.03427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204696.06052: done with get_vars() 44071 1727204696.06103: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml:3 Tuesday 24 September 2024 15:04:56 -0400 (0:00:00.552) 0:01:48.378 ***** 44071 1727204696.06202: entering _queue_task() for managed-node2/include_role 44071 1727204696.06738: worker is 1 (out of 1 available) 44071 1727204696.06753: exiting _queue_task() for managed-node2/include_role 44071 1727204696.06770: done queuing things up, now waiting for results queue to drain 44071 1727204696.06772: waiting for pending results... 44071 1727204696.07297: running TaskExecutor() for managed-node2/TASK: Include network role 44071 1727204696.07305: in run() - task 127b8e07-fff9-c964-7471-0000000017d9 44071 1727204696.07309: variable 'ansible_search_path' from source: unknown 44071 1727204696.07312: variable 'ansible_search_path' from source: unknown 44071 1727204696.07315: calling self._execute() 44071 1727204696.07459: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204696.07463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204696.07468: variable 'omit' from source: magic vars 44071 1727204696.07954: variable 'ansible_distribution_major_version' from source: facts 44071 1727204696.07958: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204696.07962: _execute() done 44071 1727204696.07965: dumping result to json 44071 1727204696.07974: done dumping result, returning 44071 1727204696.07977: done running TaskExecutor() for managed-node2/TASK: Include network role [127b8e07-fff9-c964-7471-0000000017d9] 44071 1727204696.07980: sending task result for task 127b8e07-fff9-c964-7471-0000000017d9 44071 1727204696.08229: no more pending results, returning what we have 44071 1727204696.08234: in VariableManager get_vars() 44071 1727204696.08362: Calling all_inventory to load vars for managed-node2 44071 1727204696.08368: Calling groups_inventory to load vars for managed-node2 44071 1727204696.08372: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204696.08381: done sending task result for task 127b8e07-fff9-c964-7471-0000000017d9 44071 1727204696.08388: WORKER PROCESS EXITING 44071 1727204696.08399: Calling all_plugins_play to load vars for managed-node2 44071 1727204696.08406: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204696.08410: Calling groups_plugins_play to load vars 
for managed-node2 44071 1727204696.10514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204696.13056: done with get_vars() 44071 1727204696.13111: variable 'ansible_search_path' from source: unknown 44071 1727204696.13113: variable 'ansible_search_path' from source: unknown 44071 1727204696.13285: variable 'omit' from source: magic vars 44071 1727204696.13321: variable 'omit' from source: magic vars 44071 1727204696.13333: variable 'omit' from source: magic vars 44071 1727204696.13336: we have included files to process 44071 1727204696.13337: generating all_blocks data 44071 1727204696.13338: done generating all_blocks data 44071 1727204696.13343: processing included file: fedora.linux_system_roles.network 44071 1727204696.13358: in VariableManager get_vars() 44071 1727204696.13373: done with get_vars() 44071 1727204696.13395: in VariableManager get_vars() 44071 1727204696.13410: done with get_vars() 44071 1727204696.13454: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 44071 1727204696.13571: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 44071 1727204696.13628: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 44071 1727204696.13986: in VariableManager get_vars() 44071 1727204696.14010: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204696.15802: iterating over new_blocks loaded from include file 44071 1727204696.15805: in VariableManager get_vars() 44071 1727204696.15830: done with get_vars() 44071 1727204696.15832: filtering new block on tags 44071 1727204696.16195: done filtering new block on tags 44071 1727204696.16201: in VariableManager get_vars() 44071 1727204696.16224: done with get_vars() 44071 1727204696.16225: filtering new block on tags 44071 1727204696.16246: done filtering new block on tags 44071 1727204696.16248: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 44071 1727204696.16254: extending task lists for all hosts with included blocks 44071 1727204696.16391: done extending task lists 44071 1727204696.16393: done processing included files 44071 1727204696.16394: results queue empty 44071 1727204696.16394: checking for any_errors_fatal 44071 1727204696.16396: done checking for any_errors_fatal 44071 1727204696.16397: checking for max_fail_percentage 44071 1727204696.16398: done checking for max_fail_percentage 44071 1727204696.16399: checking to see if all hosts have failed and the running result is not ok 44071 1727204696.16400: done checking to see if all hosts have failed 44071 1727204696.16400: getting the remaining hosts for this loop 44071 1727204696.16401: done getting the remaining hosts for this loop 44071 1727204696.16404: getting the next task for host managed-node2 44071 1727204696.16408: done getting next task for host managed-node2 44071 1727204696.16411: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204696.16414: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204696.16445: getting variables 44071 1727204696.16446: in VariableManager get_vars() 44071 1727204696.16463: Calling all_inventory to load vars for managed-node2 44071 1727204696.16469: Calling groups_inventory to load vars for managed-node2 44071 1727204696.16472: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204696.16480: Calling all_plugins_play to load vars for managed-node2 44071 1727204696.16482: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204696.16485: Calling groups_plugins_play to load vars for managed-node2 44071 1727204696.18718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204696.21629: done with get_vars() 44071 1727204696.21685: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:04:56 -0400 (0:00:00.155) 0:01:48.534 ***** 44071 1727204696.21803: entering _queue_task() for managed-node2/include_tasks 44071 1727204696.22349: worker is 1 (out of 1 available) 44071 1727204696.22365: exiting _queue_task() for managed-node2/include_tasks 44071 1727204696.22388: done queuing things up, now waiting for results queue to drain 44071 1727204696.22390: waiting for pending results... 
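The run above resolves TASK [Include network role] from remove_profile.yml:3: the conditional (ansible_distribution_major_version != '6') evaluates to True, the fedora.linux_system_roles.network role is loaded from the collection path, its defaults/main.yml, meta/main.yml and tasks/main.yml are read, and the role's first task is queued. The actual remove_profile.yml is not reproduced in this log, so the sketch below is only an approximation built from the task name and conditional visible above; the network_connections payload is a hypothetical placeholder, and the when clause may in reality be inherited from an enclosing block.

    # Hypothetical reconstruction of remove_profile.yml:3 (file not shown in the log)
    - name: Include network role
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.network
      vars:
        network_connections:                 # placeholder removal payload
          - name: "{{ profile }}"            # hypothetical variable
            persistent_state: absent
      when: ansible_distribution_major_version != '6'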
44071 1727204696.22991: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204696.23179: in run() - task 127b8e07-fff9-c964-7471-000000001b3b 44071 1727204696.23251: variable 'ansible_search_path' from source: unknown 44071 1727204696.23254: variable 'ansible_search_path' from source: unknown 44071 1727204696.23258: calling self._execute() 44071 1727204696.23447: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204696.23452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204696.23455: variable 'omit' from source: magic vars 44071 1727204696.24070: variable 'ansible_distribution_major_version' from source: facts 44071 1727204696.24074: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204696.24077: _execute() done 44071 1727204696.24079: dumping result to json 44071 1727204696.24082: done dumping result, returning 44071 1727204696.24084: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-c964-7471-000000001b3b] 44071 1727204696.24086: sending task result for task 127b8e07-fff9-c964-7471-000000001b3b 44071 1727204696.24399: no more pending results, returning what we have 44071 1727204696.24405: in VariableManager get_vars() 44071 1727204696.24463: Calling all_inventory to load vars for managed-node2 44071 1727204696.24469: Calling groups_inventory to load vars for managed-node2 44071 1727204696.24472: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204696.24481: done sending task result for task 127b8e07-fff9-c964-7471-000000001b3b 44071 1727204696.24485: WORKER PROCESS EXITING 44071 1727204696.24578: Calling all_plugins_play to load vars for managed-node2 44071 1727204696.24583: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204696.24629: Calling groups_plugins_play to load vars for managed-node2 44071 1727204696.26811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204696.29274: done with get_vars() 44071 1727204696.29316: variable 'ansible_search_path' from source: unknown 44071 1727204696.29318: variable 'ansible_search_path' from source: unknown 44071 1727204696.29369: we have included files to process 44071 1727204696.29371: generating all_blocks data 44071 1727204696.29373: done generating all_blocks data 44071 1727204696.29376: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204696.29377: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204696.29380: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204696.30162: done processing included file 44071 1727204696.30169: iterating over new_blocks loaded from include file 44071 1727204696.30171: in VariableManager get_vars() 44071 1727204696.30204: done with get_vars() 44071 1727204696.30207: filtering new block on tags 44071 1727204696.30244: done filtering new block on tags 44071 1727204696.30248: in VariableManager get_vars() 44071 1727204696.30279: done with get_vars() 44071 1727204696.30281: filtering new block on tags 44071 1727204696.30333: done filtering new block on tags 44071 1727204696.30336: in 
VariableManager get_vars() 44071 1727204696.30360: done with get_vars() 44071 1727204696.30362: filtering new block on tags 44071 1727204696.30413: done filtering new block on tags 44071 1727204696.30416: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 44071 1727204696.30423: extending task lists for all hosts with included blocks 44071 1727204696.32517: done extending task lists 44071 1727204696.32520: done processing included files 44071 1727204696.32521: results queue empty 44071 1727204696.32522: checking for any_errors_fatal 44071 1727204696.32524: done checking for any_errors_fatal 44071 1727204696.32525: checking for max_fail_percentage 44071 1727204696.32526: done checking for max_fail_percentage 44071 1727204696.32527: checking to see if all hosts have failed and the running result is not ok 44071 1727204696.32527: done checking to see if all hosts have failed 44071 1727204696.32528: getting the remaining hosts for this loop 44071 1727204696.32529: done getting the remaining hosts for this loop 44071 1727204696.32531: getting the next task for host managed-node2 44071 1727204696.32535: done getting next task for host managed-node2 44071 1727204696.32538: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204696.32541: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204696.32552: getting variables 44071 1727204696.32553: in VariableManager get_vars() 44071 1727204696.32570: Calling all_inventory to load vars for managed-node2 44071 1727204696.32572: Calling groups_inventory to load vars for managed-node2 44071 1727204696.32573: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204696.32578: Calling all_plugins_play to load vars for managed-node2 44071 1727204696.32580: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204696.32582: Calling groups_plugins_play to load vars for managed-node2 44071 1727204696.33503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204696.34882: done with get_vars() 44071 1727204696.34919: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:04:56 -0400 (0:00:00.132) 0:01:48.666 ***** 44071 1727204696.35013: entering _queue_task() for managed-node2/setup 44071 1727204696.35425: worker is 1 (out of 1 available) 44071 1727204696.35440: exiting _queue_task() for managed-node2/setup 44071 1727204696.35454: done queuing things up, now waiting for results queue to drain 44071 1727204696.35457: waiting for pending results... 44071 1727204696.35744: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204696.35871: in run() - task 127b8e07-fff9-c964-7471-000000001b92 44071 1727204696.35883: variable 'ansible_search_path' from source: unknown 44071 1727204696.35888: variable 'ansible_search_path' from source: unknown 44071 1727204696.35921: calling self._execute() 44071 1727204696.36007: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204696.36012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204696.36021: variable 'omit' from source: magic vars 44071 1727204696.36349: variable 'ansible_distribution_major_version' from source: facts 44071 1727204696.36360: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204696.36541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204696.38322: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204696.38383: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204696.38418: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204696.38452: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204696.38477: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204696.38552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204696.38578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 44071 1727204696.38598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204696.38627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204696.38640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204696.38690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204696.38707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204696.38724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204696.38757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204696.38770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204696.38908: variable '__network_required_facts' from source: role '' defaults 44071 1727204696.38917: variable 'ansible_facts' from source: unknown 44071 1727204696.39672: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 44071 1727204696.39678: when evaluation is False, skipping this task 44071 1727204696.39680: _execute() done 44071 1727204696.39683: dumping result to json 44071 1727204696.39686: done dumping result, returning 44071 1727204696.39689: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-c964-7471-000000001b92] 44071 1727204696.39695: sending task result for task 127b8e07-fff9-c964-7471-000000001b92 44071 1727204696.39796: done sending task result for task 127b8e07-fff9-c964-7471-000000001b92 44071 1727204696.39799: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204696.39858: no more pending results, returning what we have 44071 1727204696.39862: results queue empty 44071 1727204696.39863: checking for any_errors_fatal 44071 1727204696.39867: done checking for any_errors_fatal 44071 1727204696.39868: checking for max_fail_percentage 44071 1727204696.39870: done checking for max_fail_percentage 44071 1727204696.39871: checking to see if all hosts have failed and the running result is not ok 44071 1727204696.39871: done checking to see if all hosts have failed 44071 1727204696.39872: getting the remaining hosts for this loop 44071 1727204696.39874: done getting the remaining hosts for 
this loop 44071 1727204696.39878: getting the next task for host managed-node2 44071 1727204696.39892: done getting next task for host managed-node2 44071 1727204696.39895: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204696.39902: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204696.39930: getting variables 44071 1727204696.39932: in VariableManager get_vars() 44071 1727204696.39990: Calling all_inventory to load vars for managed-node2 44071 1727204696.39994: Calling groups_inventory to load vars for managed-node2 44071 1727204696.39996: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204696.40007: Calling all_plugins_play to load vars for managed-node2 44071 1727204696.40009: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204696.40021: Calling groups_plugins_play to load vars for managed-node2 44071 1727204696.41312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204696.43173: done with get_vars() 44071 1727204696.43213: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:04:56 -0400 (0:00:00.083) 0:01:48.749 ***** 44071 1727204696.43323: entering _queue_task() for managed-node2/stat 44071 1727204696.43759: worker is 1 (out of 1 available) 44071 1727204696.43779: exiting _queue_task() for managed-node2/stat 44071 1727204696.43795: done queuing things up, now waiting for results queue to drain 44071 1727204696.43797: waiting for pending results... 
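The skip recorded above shows the guard used at roles/network/tasks/set_facts.yml:3: the role re-runs fact gathering only when some entry of __network_required_facts is missing from ansible_facts, and here the difference is empty, so the setup task is skipped (its result is censored because no_log: true is set). A minimal sketch of such a task, with the module arguments assumed rather than taken from the role source:

    # Sketch only -- task name, module (setup), 'when' expression and no_log
    # are taken from the log; gather_subset is an assumption.
    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset: min
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
      no_log: true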
44071 1727204696.44030: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204696.44161: in run() - task 127b8e07-fff9-c964-7471-000000001b94 44071 1727204696.44179: variable 'ansible_search_path' from source: unknown 44071 1727204696.44183: variable 'ansible_search_path' from source: unknown 44071 1727204696.44215: calling self._execute() 44071 1727204696.44304: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204696.44309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204696.44319: variable 'omit' from source: magic vars 44071 1727204696.44641: variable 'ansible_distribution_major_version' from source: facts 44071 1727204696.44653: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204696.44793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204696.45016: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204696.45059: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204696.45088: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204696.45115: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204696.45190: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204696.45208: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204696.45228: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204696.45252: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204696.45325: variable '__network_is_ostree' from source: set_fact 44071 1727204696.45335: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204696.45338: when evaluation is False, skipping this task 44071 1727204696.45341: _execute() done 44071 1727204696.45346: dumping result to json 44071 1727204696.45349: done dumping result, returning 44071 1727204696.45354: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-c964-7471-000000001b94] 44071 1727204696.45359: sending task result for task 127b8e07-fff9-c964-7471-000000001b94 skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204696.45520: no more pending results, returning what we have 44071 1727204696.45524: results queue empty 44071 1727204696.45525: checking for any_errors_fatal 44071 1727204696.45532: done checking for any_errors_fatal 44071 1727204696.45535: checking for max_fail_percentage 44071 1727204696.45537: done checking for max_fail_percentage 44071 1727204696.45538: checking to see if all hosts have 
failed and the running result is not ok 44071 1727204696.45538: done checking to see if all hosts have failed 44071 1727204696.45539: getting the remaining hosts for this loop 44071 1727204696.45541: done getting the remaining hosts for this loop 44071 1727204696.45546: getting the next task for host managed-node2 44071 1727204696.45556: done getting next task for host managed-node2 44071 1727204696.45560: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204696.45566: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204696.45585: done sending task result for task 127b8e07-fff9-c964-7471-000000001b94 44071 1727204696.45588: WORKER PROCESS EXITING 44071 1727204696.45607: getting variables 44071 1727204696.45609: in VariableManager get_vars() 44071 1727204696.45657: Calling all_inventory to load vars for managed-node2 44071 1727204696.45660: Calling groups_inventory to load vars for managed-node2 44071 1727204696.45662: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204696.45675: Calling all_plugins_play to load vars for managed-node2 44071 1727204696.45685: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204696.45688: Calling groups_plugins_play to load vars for managed-node2 44071 1727204696.46799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204696.48186: done with get_vars() 44071 1727204696.48209: done getting variables 44071 1727204696.48262: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:04:56 -0400 (0:00:00.049) 0:01:48.799 ***** 44071 1727204696.48298: entering _queue_task() for managed-node2/set_fact 44071 1727204696.48614: worker is 1 (out of 1 available) 44071 1727204696.48628: exiting _queue_task() for managed-node2/set_fact 44071 1727204696.48647: done queuing things up, now waiting for results queue to drain 44071 1727204696.48649: waiting for pending results... 
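Both ostree tasks at set_facts.yml:12 and set_facts.yml:17 are skipped in this run because __network_is_ostree already exists as a set_fact from an earlier pass, so not __network_is_ostree is defined evaluates to False. The pair might look roughly like the sketch below; the stat path and the register/set_fact expressions are assumptions, and only the task names, the modules (stat, set_fact) and the when condition appear in the log.

    # Illustrative sketch of the ostree check/flag pair; details are assumed.
    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted            # assumed marker file
      register: __ostree_stat               # hypothetical register name
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_stat.stat.exists }}"
      when: not __network_is_ostree is defined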
44071 1727204696.48860: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204696.48992: in run() - task 127b8e07-fff9-c964-7471-000000001b95 44071 1727204696.49007: variable 'ansible_search_path' from source: unknown 44071 1727204696.49011: variable 'ansible_search_path' from source: unknown 44071 1727204696.49049: calling self._execute() 44071 1727204696.49145: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204696.49149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204696.49159: variable 'omit' from source: magic vars 44071 1727204696.49514: variable 'ansible_distribution_major_version' from source: facts 44071 1727204696.49525: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204696.49690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204696.49930: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204696.49975: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204696.50006: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204696.50033: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204696.50119: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204696.50141: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204696.50161: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204696.50183: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204696.50268: variable '__network_is_ostree' from source: set_fact 44071 1727204696.50276: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204696.50279: when evaluation is False, skipping this task 44071 1727204696.50281: _execute() done 44071 1727204696.50286: dumping result to json 44071 1727204696.50289: done dumping result, returning 44071 1727204696.50298: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-c964-7471-000000001b95] 44071 1727204696.50302: sending task result for task 127b8e07-fff9-c964-7471-000000001b95 44071 1727204696.50406: done sending task result for task 127b8e07-fff9-c964-7471-000000001b95 44071 1727204696.50409: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204696.50464: no more pending results, returning what we have 44071 1727204696.50470: results queue empty 44071 1727204696.50471: checking for any_errors_fatal 44071 1727204696.50480: done checking for any_errors_fatal 44071 
1727204696.50481: checking for max_fail_percentage 44071 1727204696.50482: done checking for max_fail_percentage 44071 1727204696.50483: checking to see if all hosts have failed and the running result is not ok 44071 1727204696.50484: done checking to see if all hosts have failed 44071 1727204696.50485: getting the remaining hosts for this loop 44071 1727204696.50487: done getting the remaining hosts for this loop 44071 1727204696.50492: getting the next task for host managed-node2 44071 1727204696.50504: done getting next task for host managed-node2 44071 1727204696.50508: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204696.50515: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204696.50544: getting variables 44071 1727204696.50546: in VariableManager get_vars() 44071 1727204696.50603: Calling all_inventory to load vars for managed-node2 44071 1727204696.50606: Calling groups_inventory to load vars for managed-node2 44071 1727204696.50608: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204696.50619: Calling all_plugins_play to load vars for managed-node2 44071 1727204696.50622: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204696.50624: Calling groups_plugins_play to load vars for managed-node2 44071 1727204696.51747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204696.53017: done with get_vars() 44071 1727204696.53052: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:04:56 -0400 (0:00:00.048) 0:01:48.847 ***** 44071 1727204696.53140: entering _queue_task() for managed-node2/service_facts 44071 1727204696.53475: worker is 1 (out of 1 available) 44071 1727204696.53491: exiting _queue_task() for managed-node2/service_facts 44071 1727204696.53508: done queuing things up, now waiting for results queue to drain 44071 1727204696.53509: waiting for pending results... 
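The next queued task, Check which services are running at set_facts.yml:21, is handled by the service_facts module (the log enters _queue_task() for managed-node2/service_facts), and the large {"ansible_facts": {"services": {...}}} payload streamed back further down is that module's return value, presumably used by the role to choose a network provider. A minimal sketch, with nothing beyond the task name and module taken from the log:

    # Minimal sketch of set_facts.yml:21 -- only name and module are grounded.
    - name: Check which services are running
      ansible.builtin.service_facts: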
44071 1727204696.53727: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204696.53855: in run() - task 127b8e07-fff9-c964-7471-000000001b97 44071 1727204696.53870: variable 'ansible_search_path' from source: unknown 44071 1727204696.53874: variable 'ansible_search_path' from source: unknown 44071 1727204696.53907: calling self._execute() 44071 1727204696.53997: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204696.54003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204696.54012: variable 'omit' from source: magic vars 44071 1727204696.54350: variable 'ansible_distribution_major_version' from source: facts 44071 1727204696.54362: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204696.54370: variable 'omit' from source: magic vars 44071 1727204696.54450: variable 'omit' from source: magic vars 44071 1727204696.54481: variable 'omit' from source: magic vars 44071 1727204696.54523: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204696.54556: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204696.54577: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204696.54592: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204696.54605: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204696.54634: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204696.54640: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204696.54643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204696.54727: Set connection var ansible_connection to ssh 44071 1727204696.54733: Set connection var ansible_timeout to 10 44071 1727204696.54741: Set connection var ansible_pipelining to False 44071 1727204696.54747: Set connection var ansible_shell_type to sh 44071 1727204696.54752: Set connection var ansible_shell_executable to /bin/sh 44071 1727204696.54759: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204696.54783: variable 'ansible_shell_executable' from source: unknown 44071 1727204696.54786: variable 'ansible_connection' from source: unknown 44071 1727204696.54789: variable 'ansible_module_compression' from source: unknown 44071 1727204696.54792: variable 'ansible_shell_type' from source: unknown 44071 1727204696.54794: variable 'ansible_shell_executable' from source: unknown 44071 1727204696.54796: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204696.54801: variable 'ansible_pipelining' from source: unknown 44071 1727204696.54804: variable 'ansible_timeout' from source: unknown 44071 1727204696.54809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204696.54991: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204696.55001: variable 'omit' from source: magic vars 44071 
1727204696.55006: starting attempt loop 44071 1727204696.55009: running the handler 44071 1727204696.55022: _low_level_execute_command(): starting 44071 1727204696.55029: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204696.55622: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204696.55628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204696.55631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204696.55673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204696.55678: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204696.55696: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204696.55767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204696.57534: stdout chunk (state=3): >>>/root <<< 44071 1727204696.57639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204696.57721: stderr chunk (state=3): >>><<< 44071 1727204696.57726: stdout chunk (state=3): >>><<< 44071 1727204696.57747: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204696.57767: _low_level_execute_command(): starting 44071 1727204696.57771: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204696.5774648-50440-51849575097634 
`" && echo ansible-tmp-1727204696.5774648-50440-51849575097634="` echo /root/.ansible/tmp/ansible-tmp-1727204696.5774648-50440-51849575097634 `" ) && sleep 0' 44071 1727204696.58295: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204696.58299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204696.58304: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204696.58314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204696.58369: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204696.58373: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204696.58377: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204696.58449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204696.60436: stdout chunk (state=3): >>>ansible-tmp-1727204696.5774648-50440-51849575097634=/root/.ansible/tmp/ansible-tmp-1727204696.5774648-50440-51849575097634 <<< 44071 1727204696.60874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204696.60879: stdout chunk (state=3): >>><<< 44071 1727204696.60881: stderr chunk (state=3): >>><<< 44071 1727204696.60885: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204696.5774648-50440-51849575097634=/root/.ansible/tmp/ansible-tmp-1727204696.5774648-50440-51849575097634 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204696.60887: variable 'ansible_module_compression' from source: 
unknown 44071 1727204696.60889: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 44071 1727204696.60891: variable 'ansible_facts' from source: unknown 44071 1727204696.60934: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204696.5774648-50440-51849575097634/AnsiballZ_service_facts.py 44071 1727204696.61140: Sending initial data 44071 1727204696.61151: Sent initial data (161 bytes) 44071 1727204696.61837: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204696.61889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204696.61910: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204696.62010: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204696.62038: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204696.62148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204696.63817: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 44071 1727204696.63856: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204696.63927: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204696.64012: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpf_em2ttu /root/.ansible/tmp/ansible-tmp-1727204696.5774648-50440-51849575097634/AnsiballZ_service_facts.py <<< 44071 1727204696.64019: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204696.5774648-50440-51849575097634/AnsiballZ_service_facts.py" <<< 44071 1727204696.64078: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpf_em2ttu" to remote "/root/.ansible/tmp/ansible-tmp-1727204696.5774648-50440-51849575097634/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204696.5774648-50440-51849575097634/AnsiballZ_service_facts.py" <<< 44071 1727204696.65047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204696.65103: stderr chunk (state=3): >>><<< 44071 1727204696.65107: stdout chunk (state=3): >>><<< 44071 1727204696.65473: done transferring module to remote 44071 1727204696.65478: _low_level_execute_command(): starting 44071 1727204696.65480: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204696.5774648-50440-51849575097634/ /root/.ansible/tmp/ansible-tmp-1727204696.5774648-50440-51849575097634/AnsiballZ_service_facts.py && sleep 0' 44071 1727204696.65932: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204696.65945: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204696.65956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204696.65974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204696.65988: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204696.65995: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204696.66005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204696.66020: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204696.66029: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204696.66088: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204696.66134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204696.66149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204696.66170: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204696.66398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204696.68198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204696.68211: stdout chunk (state=3): >>><<< 44071 1727204696.68231: stderr chunk (state=3): >>><<< 44071 
1727204696.68335: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204696.68339: _low_level_execute_command(): starting 44071 1727204696.68341: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204696.5774648-50440-51849575097634/AnsiballZ_service_facts.py && sleep 0' 44071 1727204696.68938: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204696.68957: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204696.68976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204696.68998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204696.69016: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204696.69029: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204696.69045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204696.69087: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204696.69158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204696.69182: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204696.69210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204696.69311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204698.91581: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "s<<< 44071 1727204698.91640: stdout chunk (state=3): >>>topped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": 
"systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, <<< 44071 1727204698.91664: stdout chunk (state=3): >>>"systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.f<<< 44071 1727204698.91674: stdout chunk (state=3): >>>reedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": 
"disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "<<< 44071 1727204698.91686: stdout chunk (state=3): >>>status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-growfs-root.service": {"name": "systemd-growfs-ro<<< 44071 1727204698.91689: stdout chunk (state=3): >>>ot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": 
"systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.ser<<< 44071 1727204698.91704: stdout chunk (state=3): >>>vice": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 44071 1727204698.93344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204698.93348: stdout chunk (state=3): >>><<< 44071 1727204698.93351: stderr chunk (state=3): >>><<< 44071 1727204698.93382: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": 
"dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": 
"sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204698.94517: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204696.5774648-50440-51849575097634/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204698.94524: _low_level_execute_command(): starting 44071 1727204698.94530: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204696.5774648-50440-51849575097634/ > /dev/null 2>&1 && sleep 0' 44071 1727204698.95023: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204698.95027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204698.95029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44071 1727204698.95032: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204698.95037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204698.95089: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204698.95098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204698.95173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204698.97199: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 44071 1727204698.97203: stdout chunk (state=3): >>><<< 44071 1727204698.97206: stderr chunk (state=3): >>><<< 44071 1727204698.97223: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204698.97278: handler run complete 44071 1727204698.97540: variable 'ansible_facts' from source: unknown 44071 1727204698.97678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204698.98045: variable 'ansible_facts' from source: unknown 44071 1727204698.98143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204698.98301: attempt loop complete, returning result 44071 1727204698.98307: _execute() done 44071 1727204698.98309: dumping result to json 44071 1727204698.98354: done dumping result, returning 44071 1727204698.98365: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-c964-7471-000000001b97] 44071 1727204698.98370: sending task result for task 127b8e07-fff9-c964-7471-000000001b97 44071 1727204698.99275: done sending task result for task 127b8e07-fff9-c964-7471-000000001b97 44071 1727204698.99279: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204698.99363: no more pending results, returning what we have 44071 1727204698.99367: results queue empty 44071 1727204698.99368: checking for any_errors_fatal 44071 1727204698.99371: done checking for any_errors_fatal 44071 1727204698.99371: checking for max_fail_percentage 44071 1727204698.99372: done checking for max_fail_percentage 44071 1727204698.99373: checking to see if all hosts have failed and the running result is not ok 44071 1727204698.99373: done checking to see if all hosts have failed 44071 1727204698.99374: getting the remaining hosts for this loop 44071 1727204698.99375: done getting the remaining hosts for this loop 44071 1727204698.99378: getting the next task for host managed-node2 44071 1727204698.99382: done getting next task for host managed-node2 44071 1727204698.99389: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204698.99394: ^ state is: HOST STATE: block=7, 
task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204698.99407: getting variables 44071 1727204698.99409: in VariableManager get_vars() 44071 1727204698.99446: Calling all_inventory to load vars for managed-node2 44071 1727204698.99448: Calling groups_inventory to load vars for managed-node2 44071 1727204698.99450: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204698.99458: Calling all_plugins_play to load vars for managed-node2 44071 1727204698.99463: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204698.99468: Calling groups_plugins_play to load vars for managed-node2 44071 1727204699.00946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204699.02442: done with get_vars() 44071 1727204699.02474: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:04:59 -0400 (0:00:02.494) 0:01:51.342 ***** 44071 1727204699.02559: entering _queue_task() for managed-node2/package_facts 44071 1727204699.02869: worker is 1 (out of 1 available) 44071 1727204699.02885: exiting _queue_task() for managed-node2/package_facts 44071 1727204699.02901: done queuing things up, now waiting for results queue to drain 44071 1727204699.02903: waiting for pending results... 
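(For context, the two fact-gathering steps traced in this part of the log, the "Check which services are running" task whose result was censored by no_log and the "Check which packages are installed" task queued here, correspond to tasks of roughly the following shape. This is a minimal sketch for readers following the log, not the role's actual set_facts.yml; the no_log setting on the second task and the manager value are assumptions.)

- name: Check which services are running    # populates ansible_facts.services
  ansible.builtin.service_facts:
  no_log: true                              # matches the censored result shown above

- name: Check which packages are installed  # populates ansible_facts.packages
  ansible.builtin.package_facts:
    manager: auto                           # assumed; auto-detects rpm on Fedora
  no_log: true                              # assumed to mirror the service task

Both modules return their data under ansible_facts (the services and packages dictionaries visible in the stdout chunks), so later tasks in the role can reference them as ansible_facts.services and ansible_facts.packages.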
44071 1727204699.03108: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204699.03220: in run() - task 127b8e07-fff9-c964-7471-000000001b98 44071 1727204699.03242: variable 'ansible_search_path' from source: unknown 44071 1727204699.03248: variable 'ansible_search_path' from source: unknown 44071 1727204699.03281: calling self._execute() 44071 1727204699.03371: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204699.03378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204699.03387: variable 'omit' from source: magic vars 44071 1727204699.03887: variable 'ansible_distribution_major_version' from source: facts 44071 1727204699.03892: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204699.03897: variable 'omit' from source: magic vars 44071 1727204699.03960: variable 'omit' from source: magic vars 44071 1727204699.04017: variable 'omit' from source: magic vars 44071 1727204699.04104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204699.04126: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204699.04156: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204699.04182: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204699.04213: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204699.04322: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204699.04326: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204699.04328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204699.04400: Set connection var ansible_connection to ssh 44071 1727204699.04413: Set connection var ansible_timeout to 10 44071 1727204699.04436: Set connection var ansible_pipelining to False 44071 1727204699.04451: Set connection var ansible_shell_type to sh 44071 1727204699.04464: Set connection var ansible_shell_executable to /bin/sh 44071 1727204699.04481: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204699.04512: variable 'ansible_shell_executable' from source: unknown 44071 1727204699.04521: variable 'ansible_connection' from source: unknown 44071 1727204699.04528: variable 'ansible_module_compression' from source: unknown 44071 1727204699.04545: variable 'ansible_shell_type' from source: unknown 44071 1727204699.04552: variable 'ansible_shell_executable' from source: unknown 44071 1727204699.04650: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204699.04654: variable 'ansible_pipelining' from source: unknown 44071 1727204699.04657: variable 'ansible_timeout' from source: unknown 44071 1727204699.04660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204699.04822: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204699.04831: variable 'omit' from source: magic vars 44071 
1727204699.04838: starting attempt loop 44071 1727204699.04841: running the handler 44071 1727204699.04855: _low_level_execute_command(): starting 44071 1727204699.04867: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204699.05437: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204699.05442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204699.05447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204699.05501: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204699.05504: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204699.05508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204699.05584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204699.07272: stdout chunk (state=3): >>>/root <<< 44071 1727204699.07374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204699.07449: stderr chunk (state=3): >>><<< 44071 1727204699.07452: stdout chunk (state=3): >>><<< 44071 1727204699.07473: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204699.07486: _low_level_execute_command(): starting 44071 1727204699.07497: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204699.0746922-50579-144793548505183 
`" && echo ansible-tmp-1727204699.0746922-50579-144793548505183="` echo /root/.ansible/tmp/ansible-tmp-1727204699.0746922-50579-144793548505183 `" ) && sleep 0' 44071 1727204699.08008: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204699.08012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204699.08016: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204699.08028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204699.08032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204699.08080: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204699.08084: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204699.08088: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204699.08161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204699.10116: stdout chunk (state=3): >>>ansible-tmp-1727204699.0746922-50579-144793548505183=/root/.ansible/tmp/ansible-tmp-1727204699.0746922-50579-144793548505183 <<< 44071 1727204699.10230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204699.10296: stderr chunk (state=3): >>><<< 44071 1727204699.10299: stdout chunk (state=3): >>><<< 44071 1727204699.10316: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204699.0746922-50579-144793548505183=/root/.ansible/tmp/ansible-tmp-1727204699.0746922-50579-144793548505183 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 
1727204699.10363: variable 'ansible_module_compression' from source: unknown 44071 1727204699.10405: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 44071 1727204699.10464: variable 'ansible_facts' from source: unknown 44071 1727204699.10590: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204699.0746922-50579-144793548505183/AnsiballZ_package_facts.py 44071 1727204699.10712: Sending initial data 44071 1727204699.10716: Sent initial data (162 bytes) 44071 1727204699.11223: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204699.11227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204699.11229: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204699.11232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204699.11291: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204699.11297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204699.11300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204699.11375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204699.12972: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204699.13042: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204699.13110: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp6c_jm2k7 /root/.ansible/tmp/ansible-tmp-1727204699.0746922-50579-144793548505183/AnsiballZ_package_facts.py <<< 44071 1727204699.13113: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204699.0746922-50579-144793548505183/AnsiballZ_package_facts.py" <<< 44071 1727204699.13181: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp6c_jm2k7" to remote "/root/.ansible/tmp/ansible-tmp-1727204699.0746922-50579-144793548505183/AnsiballZ_package_facts.py" <<< 44071 1727204699.13185: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204699.0746922-50579-144793548505183/AnsiballZ_package_facts.py" <<< 44071 1727204699.14510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204699.14585: stderr chunk (state=3): >>><<< 44071 1727204699.14589: stdout chunk (state=3): >>><<< 44071 1727204699.14608: done transferring module to remote 44071 1727204699.14623: _low_level_execute_command(): starting 44071 1727204699.14626: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204699.0746922-50579-144793548505183/ /root/.ansible/tmp/ansible-tmp-1727204699.0746922-50579-144793548505183/AnsiballZ_package_facts.py && sleep 0' 44071 1727204699.15134: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204699.15138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204699.15141: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204699.15143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204699.15196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204699.15200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204699.15286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204699.17274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204699.17279: stdout chunk (state=3): >>><<< 44071 1727204699.17281: stderr chunk (state=3): >>><<< 44071 1727204699.17284: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204699.17287: _low_level_execute_command(): starting 44071 1727204699.17289: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204699.0746922-50579-144793548505183/AnsiballZ_package_facts.py && sleep 0' 44071 1727204699.17926: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204699.17941: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204699.17953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204699.17970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204699.18037: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204699.18095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204699.18108: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204699.18137: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204699.18247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204699.81442: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 44071 1727204699.81518: stdout chunk (state=3): >>>systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": 
"6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", 
"version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": 
"40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": 
"chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", 
"release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": 
"502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version"<<< 44071 1727204699.81543: stdout chunk (state=3): >>>: "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 44071 1727204699.83226: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204699.83300: stderr chunk (state=3): >>><<< 44071 1727204699.83303: stdout chunk (state=3): >>><<< 44071 1727204699.83335: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": 
"libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", 
"release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": 
[{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": 
"3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", 
"version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", 
"version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": 
"1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": 
"wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": 
"python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204699.95491: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204699.0746922-50579-144793548505183/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204699.95496: _low_level_execute_command(): starting 44071 1727204699.95498: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204699.0746922-50579-144793548505183/ > /dev/null 2>&1 && sleep 0' 44071 1727204699.96154: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204699.96186: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204699.96200: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204699.96208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204699.96215: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204699.96282: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204699.96289: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204699.96364: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204699.98391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204699.98433: stderr chunk (state=3): >>><<< 44071 1727204699.98572: stdout chunk (state=3): >>><<< 44071 1727204699.98576: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204699.98579: handler run complete 44071 1727204699.99414: variable 'ansible_facts' from source: unknown 44071 1727204699.99822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204700.01314: variable 'ansible_facts' from source: unknown 44071 1727204700.01648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204700.02210: attempt loop complete, returning result 44071 1727204700.02225: _execute() done 44071 1727204700.02228: dumping result to json 44071 1727204700.02389: done dumping result, returning 44071 1727204700.02396: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-c964-7471-000000001b98] 44071 1727204700.02399: sending task result for task 127b8e07-fff9-c964-7471-000000001b98 44071 1727204700.09894: done sending task result for task 127b8e07-fff9-c964-7471-000000001b98 44071 1727204700.09898: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204700.09958: no more pending results, returning what we have 44071 1727204700.09961: results queue empty 44071 1727204700.09961: checking for any_errors_fatal 44071 1727204700.09964: done checking for any_errors_fatal 44071 1727204700.09966: checking for max_fail_percentage 44071 1727204700.09967: done checking for max_fail_percentage 44071 1727204700.09968: checking to see if all hosts have failed and the 
running result is not ok 44071 1727204700.09969: done checking to see if all hosts have failed 44071 1727204700.09970: getting the remaining hosts for this loop 44071 1727204700.09971: done getting the remaining hosts for this loop 44071 1727204700.09973: getting the next task for host managed-node2 44071 1727204700.09978: done getting next task for host managed-node2 44071 1727204700.09980: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204700.09988: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204700.09996: getting variables 44071 1727204700.09997: in VariableManager get_vars() 44071 1727204700.10013: Calling all_inventory to load vars for managed-node2 44071 1727204700.10016: Calling groups_inventory to load vars for managed-node2 44071 1727204700.10018: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204700.10023: Calling all_plugins_play to load vars for managed-node2 44071 1727204700.10025: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204700.10026: Calling groups_plugins_play to load vars for managed-node2 44071 1727204700.10911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204700.12121: done with get_vars() 44071 1727204700.12152: done getting variables 44071 1727204700.12201: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:05:00 -0400 (0:00:01.096) 0:01:52.438 ***** 44071 1727204700.12225: entering _queue_task() for managed-node2/debug 44071 1727204700.12527: worker is 1 (out of 1 available) 44071 1727204700.12543: exiting _queue_task() for managed-node2/debug 44071 1727204700.12557: done queuing things up, now waiting for results queue to drain 44071 1727204700.12561: waiting for pending results... 
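[annotation] The censored result above comes from the role's "Check which packages are installed" step: a package_facts run whose very large return value was hidden because no_log was set, followed by cleanup of the remote temp directory over the multiplexed SSH connection. As a rough, hypothetical reconstruction of such a task (written as a task-file entry; only the module name, the recorded module_args manager: ["auto"] / strategy: "first", and the no_log behaviour are taken from the log, everything else is an assumption):

- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto      # the invocation above records manager: ["auto"]
    strategy: first    # and strategy: "first"
  no_log: true         # matches the "output has been hidden" message in the result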
44071 1727204700.12775: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204700.12906: in run() - task 127b8e07-fff9-c964-7471-000000001b3c 44071 1727204700.12915: variable 'ansible_search_path' from source: unknown 44071 1727204700.12919: variable 'ansible_search_path' from source: unknown 44071 1727204700.12953: calling self._execute() 44071 1727204700.13043: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204700.13049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204700.13058: variable 'omit' from source: magic vars 44071 1727204700.13390: variable 'ansible_distribution_major_version' from source: facts 44071 1727204700.13402: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204700.13408: variable 'omit' from source: magic vars 44071 1727204700.13464: variable 'omit' from source: magic vars 44071 1727204700.13546: variable 'network_provider' from source: set_fact 44071 1727204700.13564: variable 'omit' from source: magic vars 44071 1727204700.13603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204700.13635: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204700.13655: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204700.13676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204700.13687: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204700.13712: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204700.13716: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204700.13719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204700.13800: Set connection var ansible_connection to ssh 44071 1727204700.13807: Set connection var ansible_timeout to 10 44071 1727204700.13813: Set connection var ansible_pipelining to False 44071 1727204700.13818: Set connection var ansible_shell_type to sh 44071 1727204700.13824: Set connection var ansible_shell_executable to /bin/sh 44071 1727204700.13831: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204700.13852: variable 'ansible_shell_executable' from source: unknown 44071 1727204700.13855: variable 'ansible_connection' from source: unknown 44071 1727204700.13858: variable 'ansible_module_compression' from source: unknown 44071 1727204700.13860: variable 'ansible_shell_type' from source: unknown 44071 1727204700.13863: variable 'ansible_shell_executable' from source: unknown 44071 1727204700.13867: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204700.13871: variable 'ansible_pipelining' from source: unknown 44071 1727204700.13874: variable 'ansible_timeout' from source: unknown 44071 1727204700.13884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204700.13998: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 44071 1727204700.14008: variable 'omit' from source: magic vars 44071 1727204700.14012: starting attempt loop 44071 1727204700.14016: running the handler 44071 1727204700.14060: handler run complete 44071 1727204700.14074: attempt loop complete, returning result 44071 1727204700.14077: _execute() done 44071 1727204700.14079: dumping result to json 44071 1727204700.14082: done dumping result, returning 44071 1727204700.14090: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-c964-7471-000000001b3c] 44071 1727204700.14100: sending task result for task 127b8e07-fff9-c964-7471-000000001b3c 44071 1727204700.14193: done sending task result for task 127b8e07-fff9-c964-7471-000000001b3c 44071 1727204700.14196: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 44071 1727204700.14282: no more pending results, returning what we have 44071 1727204700.14287: results queue empty 44071 1727204700.14288: checking for any_errors_fatal 44071 1727204700.14302: done checking for any_errors_fatal 44071 1727204700.14303: checking for max_fail_percentage 44071 1727204700.14304: done checking for max_fail_percentage 44071 1727204700.14305: checking to see if all hosts have failed and the running result is not ok 44071 1727204700.14306: done checking to see if all hosts have failed 44071 1727204700.14307: getting the remaining hosts for this loop 44071 1727204700.14308: done getting the remaining hosts for this loop 44071 1727204700.14313: getting the next task for host managed-node2 44071 1727204700.14321: done getting next task for host managed-node2 44071 1727204700.14325: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204700.14330: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204700.14343: getting variables 44071 1727204700.14344: in VariableManager get_vars() 44071 1727204700.14397: Calling all_inventory to load vars for managed-node2 44071 1727204700.14400: Calling groups_inventory to load vars for managed-node2 44071 1727204700.14403: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204700.14413: Calling all_plugins_play to load vars for managed-node2 44071 1727204700.14417: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204700.14419: Calling groups_plugins_play to load vars for managed-node2 44071 1727204700.15534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204700.16755: done with get_vars() 44071 1727204700.16787: done getting variables 44071 1727204700.16838: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:05:00 -0400 (0:00:00.046) 0:01:52.485 ***** 44071 1727204700.16878: entering _queue_task() for managed-node2/fail 44071 1727204700.17174: worker is 1 (out of 1 available) 44071 1727204700.17189: exiting _queue_task() for managed-node2/fail 44071 1727204700.17203: done queuing things up, now waiting for results queue to drain 44071 1727204700.17205: waiting for pending results... 
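[annotation] Just above, the "Print network provider" debug task from tasks/main.yml:7 completes with the message "Using network provider: nm", pulling network_provider from an earlier set_fact. A minimal sketch of a debug task that would print that message, assuming nothing beyond the module and variable name shown in the log (the role's actual task may be written differently):

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"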
44071 1727204700.17428: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204700.17555: in run() - task 127b8e07-fff9-c964-7471-000000001b3d 44071 1727204700.17563: variable 'ansible_search_path' from source: unknown 44071 1727204700.17571: variable 'ansible_search_path' from source: unknown 44071 1727204700.17604: calling self._execute() 44071 1727204700.17696: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204700.17701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204700.17710: variable 'omit' from source: magic vars 44071 1727204700.18051: variable 'ansible_distribution_major_version' from source: facts 44071 1727204700.18064: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204700.18162: variable 'network_state' from source: role '' defaults 44071 1727204700.18174: Evaluated conditional (network_state != {}): False 44071 1727204700.18177: when evaluation is False, skipping this task 44071 1727204700.18180: _execute() done 44071 1727204700.18183: dumping result to json 44071 1727204700.18185: done dumping result, returning 44071 1727204700.18196: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-c964-7471-000000001b3d] 44071 1727204700.18199: sending task result for task 127b8e07-fff9-c964-7471-000000001b3d 44071 1727204700.18305: done sending task result for task 127b8e07-fff9-c964-7471-000000001b3d 44071 1727204700.18311: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204700.18374: no more pending results, returning what we have 44071 1727204700.18378: results queue empty 44071 1727204700.18379: checking for any_errors_fatal 44071 1727204700.18389: done checking for any_errors_fatal 44071 1727204700.18389: checking for max_fail_percentage 44071 1727204700.18391: done checking for max_fail_percentage 44071 1727204700.18392: checking to see if all hosts have failed and the running result is not ok 44071 1727204700.18393: done checking to see if all hosts have failed 44071 1727204700.18393: getting the remaining hosts for this loop 44071 1727204700.18395: done getting the remaining hosts for this loop 44071 1727204700.18400: getting the next task for host managed-node2 44071 1727204700.18411: done getting next task for host managed-node2 44071 1727204700.18416: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204700.18424: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204700.18452: getting variables 44071 1727204700.18454: in VariableManager get_vars() 44071 1727204700.18499: Calling all_inventory to load vars for managed-node2 44071 1727204700.18503: Calling groups_inventory to load vars for managed-node2 44071 1727204700.18505: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204700.18516: Calling all_plugins_play to load vars for managed-node2 44071 1727204700.18519: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204700.18521: Calling groups_plugins_play to load vars for managed-node2 44071 1727204700.19694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204700.20941: done with get_vars() 44071 1727204700.20976: done getting variables 44071 1727204700.21025: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:05:00 -0400 (0:00:00.041) 0:01:52.527 ***** 44071 1727204700.21055: entering _queue_task() for managed-node2/fail 44071 1727204700.21356: worker is 1 (out of 1 available) 44071 1727204700.21373: exiting _queue_task() for managed-node2/fail 44071 1727204700.21388: done queuing things up, now waiting for results queue to drain 44071 1727204700.21389: waiting for pending results... 
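[annotation] The skip recorded above for the initscripts abort task shows a fail action gated by the condition network_state != {}, which evaluated False because network_state comes from the role defaults as an empty dict. A hedged sketch of such a guarded task; the condition is copied from the false_condition field, while the message and any additional guards (for example, a check on the active provider) are assumptions not visible in this log:

- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Cannot apply network_state with the initscripts provider  # assumed wording
  when: network_state != {}  # only this condition appears in the log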
44071 1727204700.21612: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204700.21721: in run() - task 127b8e07-fff9-c964-7471-000000001b3e 44071 1727204700.21736: variable 'ansible_search_path' from source: unknown 44071 1727204700.21741: variable 'ansible_search_path' from source: unknown 44071 1727204700.21779: calling self._execute() 44071 1727204700.21872: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204700.21883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204700.21925: variable 'omit' from source: magic vars 44071 1727204700.22472: variable 'ansible_distribution_major_version' from source: facts 44071 1727204700.22476: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204700.22553: variable 'network_state' from source: role '' defaults 44071 1727204700.22594: Evaluated conditional (network_state != {}): False 44071 1727204700.22772: when evaluation is False, skipping this task 44071 1727204700.22776: _execute() done 44071 1727204700.22779: dumping result to json 44071 1727204700.22781: done dumping result, returning 44071 1727204700.22785: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-c964-7471-000000001b3e] 44071 1727204700.22789: sending task result for task 127b8e07-fff9-c964-7471-000000001b3e 44071 1727204700.23372: done sending task result for task 127b8e07-fff9-c964-7471-000000001b3e 44071 1727204700.23377: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204700.23437: no more pending results, returning what we have 44071 1727204700.23441: results queue empty 44071 1727204700.23442: checking for any_errors_fatal 44071 1727204700.23449: done checking for any_errors_fatal 44071 1727204700.23450: checking for max_fail_percentage 44071 1727204700.23452: done checking for max_fail_percentage 44071 1727204700.23453: checking to see if all hosts have failed and the running result is not ok 44071 1727204700.23454: done checking to see if all hosts have failed 44071 1727204700.23455: getting the remaining hosts for this loop 44071 1727204700.23456: done getting the remaining hosts for this loop 44071 1727204700.23461: getting the next task for host managed-node2 44071 1727204700.23474: done getting next task for host managed-node2 44071 1727204700.23479: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204700.23486: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204700.23515: getting variables 44071 1727204700.23517: in VariableManager get_vars() 44071 1727204700.23596: Calling all_inventory to load vars for managed-node2 44071 1727204700.23600: Calling groups_inventory to load vars for managed-node2 44071 1727204700.23604: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204700.23620: Calling all_plugins_play to load vars for managed-node2 44071 1727204700.23624: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204700.23628: Calling groups_plugins_play to load vars for managed-node2 44071 1727204700.25811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204700.29541: done with get_vars() 44071 1727204700.29588: done getting variables 44071 1727204700.29661: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:05:00 -0400 (0:00:00.086) 0:01:52.613 ***** 44071 1727204700.29710: entering _queue_task() for managed-node2/fail 44071 1727204700.30130: worker is 1 (out of 1 available) 44071 1727204700.30144: exiting _queue_task() for managed-node2/fail 44071 1727204700.30162: done queuing things up, now waiting for results queue to drain 44071 1727204700.30164: waiting for pending results... 
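[annotation] The same pattern repeats for the "system version ... below 8" abort at tasks/main.yml:18: network_state != {} is again reported as the false_condition and the task is skipped, which is consistent with Ansible AND-ing a when list and stopping at the first entry that evaluates false. A hypothetical sketch; the version comparison itself is an assumption, since it is never evaluated in this run:

- name: Abort applying the network state configuration if the system version of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying the network state configuration requires EL 8 or later  # assumed wording
  when:
    - network_state != {}                           # reported as the false_condition above
    - ansible_distribution_major_version | int < 8  # assumed; not evaluated in this run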
44071 1727204700.30504: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204700.30686: in run() - task 127b8e07-fff9-c964-7471-000000001b3f 44071 1727204700.30715: variable 'ansible_search_path' from source: unknown 44071 1727204700.30723: variable 'ansible_search_path' from source: unknown 44071 1727204700.30772: calling self._execute() 44071 1727204700.30899: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204700.30917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204700.30935: variable 'omit' from source: magic vars 44071 1727204700.31393: variable 'ansible_distribution_major_version' from source: facts 44071 1727204700.31415: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204700.31632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204700.34875: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204700.34913: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204700.34959: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204700.35011: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204700.35104: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204700.35145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204700.35185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204700.35226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204700.35276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204700.35296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204700.35418: variable 'ansible_distribution_major_version' from source: facts 44071 1727204700.35448: Evaluated conditional (ansible_distribution_major_version | int > 9): True 44071 1727204700.35593: variable 'ansible_distribution' from source: facts 44071 1727204700.35646: variable '__network_rh_distros' from source: role '' defaults 44071 1727204700.35649: Evaluated conditional (ansible_distribution in __network_rh_distros): False 44071 1727204700.35652: when evaluation is False, skipping this task 44071 1727204700.35654: _execute() done 44071 1727204700.35656: dumping result to json 44071 1727204700.35659: done dumping result, returning 44071 1727204700.35661: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-c964-7471-000000001b3f] 44071 1727204700.35664: sending task result for task 127b8e07-fff9-c964-7471-000000001b3f 44071 1727204700.35833: done sending task result for task 127b8e07-fff9-c964-7471-000000001b3f 44071 1727204700.35837: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 44071 1727204700.35891: no more pending results, returning what we have 44071 1727204700.35895: results queue empty 44071 1727204700.35896: checking for any_errors_fatal 44071 1727204700.35903: done checking for any_errors_fatal 44071 1727204700.35903: checking for max_fail_percentage 44071 1727204700.35905: done checking for max_fail_percentage 44071 1727204700.35906: checking to see if all hosts have failed and the running result is not ok 44071 1727204700.35907: done checking to see if all hosts have failed 44071 1727204700.35908: getting the remaining hosts for this loop 44071 1727204700.35910: done getting the remaining hosts for this loop 44071 1727204700.35914: getting the next task for host managed-node2 44071 1727204700.35924: done getting next task for host managed-node2 44071 1727204700.35928: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204700.35934: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204700.35967: getting variables 44071 1727204700.35969: in VariableManager get_vars() 44071 1727204700.36018: Calling all_inventory to load vars for managed-node2 44071 1727204700.36021: Calling groups_inventory to load vars for managed-node2 44071 1727204700.36024: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204700.36036: Calling all_plugins_play to load vars for managed-node2 44071 1727204700.36040: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204700.36043: Calling groups_plugins_play to load vars for managed-node2 44071 1727204700.38397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204700.40820: done with get_vars() 44071 1727204700.40875: done getting variables 44071 1727204700.40948: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:05:00 -0400 (0:00:00.112) 0:01:52.726 ***** 44071 1727204700.40993: entering _queue_task() for managed-node2/dnf 44071 1727204700.41439: worker is 1 (out of 1 available) 44071 1727204700.41457: exiting _queue_task() for managed-node2/dnf 44071 1727204700.41474: done queuing things up, now waiting for results queue to drain 44071 1727204700.41476: waiting for pending results... 
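[annotation] For the teaming abort at tasks/main.yml:25 the log shows two conditions actually being evaluated: ansible_distribution_major_version | int > 9 (True on this node) and ansible_distribution in __network_rh_distros (False), so the task is skipped. A sketch reconstructed from just those two evaluated conditions; the real task likely carries further guards (such as whether any team connections are defined) that are not visible here, and the message is an assumption:

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later  # assumed wording
  when:
    - ansible_distribution_major_version | int > 9    # evaluated True in the log
    - ansible_distribution in __network_rh_distros    # evaluated False, so the task is skipped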
44071 1727204700.41809: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204700.41958: in run() - task 127b8e07-fff9-c964-7471-000000001b40 44071 1727204700.41972: variable 'ansible_search_path' from source: unknown 44071 1727204700.41976: variable 'ansible_search_path' from source: unknown 44071 1727204700.42013: calling self._execute() 44071 1727204700.42111: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204700.42116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204700.42125: variable 'omit' from source: magic vars 44071 1727204700.42458: variable 'ansible_distribution_major_version' from source: facts 44071 1727204700.42471: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204700.42640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204700.44728: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204700.44898: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204700.44903: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204700.44905: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204700.44908: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204700.44997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204700.45044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204700.45075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204700.45116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204700.45143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204700.45243: variable 'ansible_distribution' from source: facts 44071 1727204700.45248: variable 'ansible_distribution_major_version' from source: facts 44071 1727204700.45255: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 44071 1727204700.45351: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204700.45450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204700.45472: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204700.45490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204700.45518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204700.45531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204700.45568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204700.45588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204700.45606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204700.45637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204700.45647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204700.45682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204700.45698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204700.45716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204700.45743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204700.45755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204700.45876: variable 'network_connections' from source: include params 44071 1727204700.45889: variable 'interface' from source: play vars 44071 1727204700.45939: variable 'interface' from source: play vars 44071 1727204700.46001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204700.46132: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204700.46165: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204700.46191: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204700.46216: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204700.46252: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204700.46270: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204700.46292: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204700.46316: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204700.46358: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204700.46702: variable 'network_connections' from source: include params 44071 1727204700.46706: variable 'interface' from source: play vars 44071 1727204700.46759: variable 'interface' from source: play vars 44071 1727204700.46780: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204700.46783: when evaluation is False, skipping this task 44071 1727204700.46786: _execute() done 44071 1727204700.46788: dumping result to json 44071 1727204700.46793: done dumping result, returning 44071 1727204700.46800: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000001b40] 44071 1727204700.46805: sending task result for task 127b8e07-fff9-c964-7471-000000001b40 44071 1727204700.46914: done sending task result for task 127b8e07-fff9-c964-7471-000000001b40 44071 1727204700.46917: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204700.46979: no more pending results, returning what we have 44071 1727204700.46983: results queue empty 44071 1727204700.46983: checking for any_errors_fatal 44071 1727204700.46990: done checking for any_errors_fatal 44071 1727204700.46990: checking for max_fail_percentage 44071 1727204700.46992: done checking for max_fail_percentage 44071 1727204700.46993: checking to see if all hosts have failed and the running result is not ok 44071 1727204700.46994: done checking to see if all hosts have failed 44071 1727204700.46995: getting the remaining hosts for this loop 44071 1727204700.46996: done getting the remaining hosts for this loop 44071 1727204700.47001: getting the next task for host managed-node2 44071 1727204700.47010: done getting next task for host managed-node2 44071 1727204700.47014: ^ task is: TASK: fedora.linux_system_roles.network : Check if 
updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204700.47019: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204700.47057: getting variables 44071 1727204700.47059: in VariableManager get_vars() 44071 1727204700.47104: Calling all_inventory to load vars for managed-node2 44071 1727204700.47107: Calling groups_inventory to load vars for managed-node2 44071 1727204700.47109: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204700.47120: Calling all_plugins_play to load vars for managed-node2 44071 1727204700.47123: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204700.47126: Calling groups_plugins_play to load vars for managed-node2 44071 1727204700.48985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204700.50261: done with get_vars() 44071 1727204700.50293: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204700.50360: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:05:00 -0400 (0:00:00.093) 0:01:52.820 ***** 44071 1727204700.50389: entering _queue_task() for managed-node2/yum 44071 1727204700.50697: worker is 1 (out of 1 available) 44071 1727204700.50713: exiting _queue_task() for managed-node2/yum 44071 1727204700.50728: done queuing things up, now waiting for results queue to drain 44071 1727204700.50730: waiting for pending results... 
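For orientation, here is a hedged sketch of roughly what the skipped task at roles/network/tasks/main.yml:48 could look like, reconstructed only from the module reported above (yum, redirected by Ansible to dnf) and the conditionals this log evaluates; the package list, the check_mode usage, and the order of the when clauses are assumptions, not the role's actual source:

  - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
    ansible.builtin.yum:
      name: "{{ network_packages }}"   # assumed: network_packages is a role default referenced later in this log
      state: latest
    check_mode: true                   # assumption: report available updates without changing the host
    when:
      - ansible_distribution_major_version | int < 8
      - __network_wireless_connections_defined or __network_team_connections_defined

With check_mode true the module would only report whether newer versions of the listed packages exist, which matches the "check if updates are available" intent; in this run the task is skipped because the distribution major version is not below 8, as the false_condition below shows.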
44071 1727204700.50963: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204700.51135: in run() - task 127b8e07-fff9-c964-7471-000000001b41 44071 1727204700.51140: variable 'ansible_search_path' from source: unknown 44071 1727204700.51143: variable 'ansible_search_path' from source: unknown 44071 1727204700.51190: calling self._execute() 44071 1727204700.51476: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204700.51480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204700.51483: variable 'omit' from source: magic vars 44071 1727204700.51745: variable 'ansible_distribution_major_version' from source: facts 44071 1727204700.51758: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204700.51942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204700.54099: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204700.54157: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204700.54189: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204700.54220: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204700.54244: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204700.54312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204700.54350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204700.54370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204700.54401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204700.54412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204700.54502: variable 'ansible_distribution_major_version' from source: facts 44071 1727204700.54516: Evaluated conditional (ansible_distribution_major_version | int < 8): False 44071 1727204700.54519: when evaluation is False, skipping this task 44071 1727204700.54523: _execute() done 44071 1727204700.54525: dumping result to json 44071 1727204700.54528: done dumping result, returning 44071 1727204700.54540: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000001b41] 44071 
1727204700.54543: sending task result for task 127b8e07-fff9-c964-7471-000000001b41 44071 1727204700.54654: done sending task result for task 127b8e07-fff9-c964-7471-000000001b41 44071 1727204700.54657: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 44071 1727204700.54728: no more pending results, returning what we have 44071 1727204700.54732: results queue empty 44071 1727204700.54735: checking for any_errors_fatal 44071 1727204700.54745: done checking for any_errors_fatal 44071 1727204700.54745: checking for max_fail_percentage 44071 1727204700.54747: done checking for max_fail_percentage 44071 1727204700.54748: checking to see if all hosts have failed and the running result is not ok 44071 1727204700.54749: done checking to see if all hosts have failed 44071 1727204700.54749: getting the remaining hosts for this loop 44071 1727204700.54751: done getting the remaining hosts for this loop 44071 1727204700.54756: getting the next task for host managed-node2 44071 1727204700.54767: done getting next task for host managed-node2 44071 1727204700.54772: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204700.54777: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204700.54805: getting variables 44071 1727204700.54806: in VariableManager get_vars() 44071 1727204700.54853: Calling all_inventory to load vars for managed-node2 44071 1727204700.54856: Calling groups_inventory to load vars for managed-node2 44071 1727204700.54858: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204700.54876: Calling all_plugins_play to load vars for managed-node2 44071 1727204700.54879: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204700.54882: Calling groups_plugins_play to load vars for managed-node2 44071 1727204700.56485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204700.57784: done with get_vars() 44071 1727204700.57817: done getting variables 44071 1727204700.57875: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:05:00 -0400 (0:00:00.075) 0:01:52.895 ***** 44071 1727204700.57905: entering _queue_task() for managed-node2/fail 44071 1727204700.58212: worker is 1 (out of 1 available) 44071 1727204700.58229: exiting _queue_task() for managed-node2/fail 44071 1727204700.58244: done queuing things up, now waiting for results queue to drain 44071 1727204700.58246: waiting for pending results... 
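A hedged guess at the shape of the consent task at main.yml:60, based only on the fail action module loaded above and the conditional reported below; the message wording is invented for illustration and is not the role's actual text:

  - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
    ansible.builtin.fail:
      # msg text below is an assumption, not the role's wording
      msg: >-
        The requested configuration includes wireless or team interfaces, which
        require restarting NetworkManager. Re-run the role with its restart
        option enabled to confirm.
    when: __network_wireless_connections_defined or __network_team_connections_defined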
44071 1727204700.58461: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204700.58583: in run() - task 127b8e07-fff9-c964-7471-000000001b42 44071 1727204700.58598: variable 'ansible_search_path' from source: unknown 44071 1727204700.58602: variable 'ansible_search_path' from source: unknown 44071 1727204700.58635: calling self._execute() 44071 1727204700.58724: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204700.58730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204700.58743: variable 'omit' from source: magic vars 44071 1727204700.59066: variable 'ansible_distribution_major_version' from source: facts 44071 1727204700.59078: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204700.59175: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204700.59327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204700.61393: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204700.61451: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204700.61482: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204700.61509: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204700.61536: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204700.61601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204700.61624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204700.61650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204700.61681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204700.61692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204700.61730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204700.61752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204700.61774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204700.61800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204700.61812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204700.61845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204700.61865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204700.61885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204700.61911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204700.61922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204700.62057: variable 'network_connections' from source: include params 44071 1727204700.62071: variable 'interface' from source: play vars 44071 1727204700.62128: variable 'interface' from source: play vars 44071 1727204700.62190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204700.62333: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204700.62364: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204700.62391: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204700.62417: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204700.62455: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204700.62473: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204700.62492: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204700.62511: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204700.62558: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204700.62730: variable 'network_connections' 
from source: include params 44071 1727204700.62741: variable 'interface' from source: play vars 44071 1727204700.62789: variable 'interface' from source: play vars 44071 1727204700.62809: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204700.62813: when evaluation is False, skipping this task 44071 1727204700.62817: _execute() done 44071 1727204700.62819: dumping result to json 44071 1727204700.62822: done dumping result, returning 44071 1727204700.62831: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000001b42] 44071 1727204700.62838: sending task result for task 127b8e07-fff9-c964-7471-000000001b42 44071 1727204700.62941: done sending task result for task 127b8e07-fff9-c964-7471-000000001b42 44071 1727204700.62944: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204700.63000: no more pending results, returning what we have 44071 1727204700.63004: results queue empty 44071 1727204700.63005: checking for any_errors_fatal 44071 1727204700.63011: done checking for any_errors_fatal 44071 1727204700.63012: checking for max_fail_percentage 44071 1727204700.63013: done checking for max_fail_percentage 44071 1727204700.63014: checking to see if all hosts have failed and the running result is not ok 44071 1727204700.63015: done checking to see if all hosts have failed 44071 1727204700.63015: getting the remaining hosts for this loop 44071 1727204700.63017: done getting the remaining hosts for this loop 44071 1727204700.63022: getting the next task for host managed-node2 44071 1727204700.63032: done getting next task for host managed-node2 44071 1727204700.63036: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 44071 1727204700.63042: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204700.63073: getting variables 44071 1727204700.63074: in VariableManager get_vars() 44071 1727204700.63121: Calling all_inventory to load vars for managed-node2 44071 1727204700.63124: Calling groups_inventory to load vars for managed-node2 44071 1727204700.63126: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204700.63136: Calling all_plugins_play to load vars for managed-node2 44071 1727204700.63139: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204700.63142: Calling groups_plugins_play to load vars for managed-node2 44071 1727204700.64348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204700.65560: done with get_vars() 44071 1727204700.65594: done getting variables 44071 1727204700.65647: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:05:00 -0400 (0:00:00.077) 0:01:52.973 ***** 44071 1727204700.65680: entering _queue_task() for managed-node2/package 44071 1727204700.65988: worker is 1 (out of 1 available) 44071 1727204700.66003: exiting _queue_task() for managed-node2/package 44071 1727204700.66021: done queuing things up, now waiting for results queue to drain 44071 1727204700.66023: waiting for pending results... 44071 1727204700.66245: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 44071 1727204700.66371: in run() - task 127b8e07-fff9-c964-7471-000000001b43 44071 1727204700.66383: variable 'ansible_search_path' from source: unknown 44071 1727204700.66386: variable 'ansible_search_path' from source: unknown 44071 1727204700.66423: calling self._execute() 44071 1727204700.66511: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204700.66515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204700.66525: variable 'omit' from source: magic vars 44071 1727204700.66854: variable 'ansible_distribution_major_version' from source: facts 44071 1727204700.66867: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204700.67035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204700.67258: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204700.67299: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204700.67325: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204700.67401: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204700.67500: variable 'network_packages' from source: role '' defaults 44071 1727204700.67589: variable '__network_provider_setup' from source: role '' defaults 44071 1727204700.67600: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204700.67651: variable 
'__network_service_name_default_nm' from source: role '' defaults 44071 1727204700.67659: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204700.67709: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204700.67846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204700.69396: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204700.69448: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204700.69479: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204700.69505: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204700.69526: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204700.69599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204700.69621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204700.69646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204700.69677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204700.69688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204700.69723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204700.69743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204700.69762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204700.69793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204700.69804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204700.69977: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204700.70063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204700.70085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204700.70103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204700.70130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204700.70142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204700.70218: variable 'ansible_python' from source: facts 44071 1727204700.70236: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204700.70301: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204700.70362: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204700.70460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204700.70480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204700.70498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204700.70529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204700.70541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204700.70581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204700.70601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204700.70622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204700.70650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204700.70661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204700.70770: variable 'network_connections' from source: include params 44071 1727204700.70777: variable 'interface' from source: play vars 44071 1727204700.70857: variable 'interface' from source: play vars 44071 1727204700.71095: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204700.71115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204700.71139: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204700.71162: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204700.71207: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204700.71414: variable 'network_connections' from source: include params 44071 1727204700.71417: variable 'interface' from source: play vars 44071 1727204700.71497: variable 'interface' from source: play vars 44071 1727204700.71522: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204700.71584: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204700.71804: variable 'network_connections' from source: include params 44071 1727204700.71808: variable 'interface' from source: play vars 44071 1727204700.71859: variable 'interface' from source: play vars 44071 1727204700.71879: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204700.71938: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204700.72154: variable 'network_connections' from source: include params 44071 1727204700.72159: variable 'interface' from source: play vars 44071 1727204700.72210: variable 'interface' from source: play vars 44071 1727204700.72251: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204700.72298: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204700.72304: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204700.72348: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204700.72498: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204700.72871: variable 'network_connections' from source: include params 44071 1727204700.72875: variable 'interface' from source: play vars 44071 1727204700.72888: variable 'interface' from source: play vars 44071 1727204700.72896: variable 'ansible_distribution' from source: facts 44071 1727204700.72899: variable '__network_rh_distros' from source: role '' defaults 44071 1727204700.72906: variable 'ansible_distribution_major_version' from source: facts 44071 1727204700.72918: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204700.73041: variable 'ansible_distribution' from source: 
facts 44071 1727204700.73046: variable '__network_rh_distros' from source: role '' defaults 44071 1727204700.73049: variable 'ansible_distribution_major_version' from source: facts 44071 1727204700.73052: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204700.73168: variable 'ansible_distribution' from source: facts 44071 1727204700.73174: variable '__network_rh_distros' from source: role '' defaults 44071 1727204700.73177: variable 'ansible_distribution_major_version' from source: facts 44071 1727204700.73204: variable 'network_provider' from source: set_fact 44071 1727204700.73218: variable 'ansible_facts' from source: unknown 44071 1727204700.73821: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 44071 1727204700.73825: when evaluation is False, skipping this task 44071 1727204700.73828: _execute() done 44071 1727204700.73830: dumping result to json 44071 1727204700.73832: done dumping result, returning 44071 1727204700.73841: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-c964-7471-000000001b43] 44071 1727204700.73845: sending task result for task 127b8e07-fff9-c964-7471-000000001b43 44071 1727204700.73951: done sending task result for task 127b8e07-fff9-c964-7471-000000001b43 44071 1727204700.73954: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 44071 1727204700.74010: no more pending results, returning what we have 44071 1727204700.74014: results queue empty 44071 1727204700.74016: checking for any_errors_fatal 44071 1727204700.74022: done checking for any_errors_fatal 44071 1727204700.74023: checking for max_fail_percentage 44071 1727204700.74025: done checking for max_fail_percentage 44071 1727204700.74026: checking to see if all hosts have failed and the running result is not ok 44071 1727204700.74026: done checking to see if all hosts have failed 44071 1727204700.74027: getting the remaining hosts for this loop 44071 1727204700.74029: done getting the remaining hosts for this loop 44071 1727204700.74036: getting the next task for host managed-node2 44071 1727204700.74044: done getting next task for host managed-node2 44071 1727204700.74048: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204700.74054: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204700.74088: getting variables 44071 1727204700.74090: in VariableManager get_vars() 44071 1727204700.74143: Calling all_inventory to load vars for managed-node2 44071 1727204700.74145: Calling groups_inventory to load vars for managed-node2 44071 1727204700.74147: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204700.74158: Calling all_plugins_play to load vars for managed-node2 44071 1727204700.74161: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204700.74163: Calling groups_plugins_play to load vars for managed-node2 44071 1727204700.75388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204700.76651: done with get_vars() 44071 1727204700.76695: done getting variables 44071 1727204700.76750: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:05:00 -0400 (0:00:00.111) 0:01:53.084 ***** 44071 1727204700.76786: entering _queue_task() for managed-node2/package 44071 1727204700.77095: worker is 1 (out of 1 available) 44071 1727204700.77109: exiting _queue_task() for managed-node2/package 44071 1727204700.77124: done queuing things up, now waiting for results queue to drain 44071 1727204700.77126: waiting for pending results... 
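A hedged sketch of what the task at main.yml:85 might look like, using the package action module loaded above and the network_state != {} condition reported below; the exact package names are inferred from the task title and are assumptions:

  - name: Install NetworkManager and nmstate when using network_state variable
    ansible.builtin.package:
      name:
        - NetworkManager   # inferred from the task title, not confirmed by this log
        - nmstate
      state: present
    when: network_state != {}

Using ansible.builtin.package rather than dnf/yum directly lets the role stay package-manager agnostic; here the task is skipped because network_state is empty.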
44071 1727204700.77341: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204700.77448: in run() - task 127b8e07-fff9-c964-7471-000000001b44 44071 1727204700.77463: variable 'ansible_search_path' from source: unknown 44071 1727204700.77468: variable 'ansible_search_path' from source: unknown 44071 1727204700.77504: calling self._execute() 44071 1727204700.77595: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204700.77600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204700.77609: variable 'omit' from source: magic vars 44071 1727204700.77930: variable 'ansible_distribution_major_version' from source: facts 44071 1727204700.77942: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204700.78043: variable 'network_state' from source: role '' defaults 44071 1727204700.78053: Evaluated conditional (network_state != {}): False 44071 1727204700.78056: when evaluation is False, skipping this task 44071 1727204700.78059: _execute() done 44071 1727204700.78062: dumping result to json 44071 1727204700.78068: done dumping result, returning 44071 1727204700.78076: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-c964-7471-000000001b44] 44071 1727204700.78081: sending task result for task 127b8e07-fff9-c964-7471-000000001b44 44071 1727204700.78193: done sending task result for task 127b8e07-fff9-c964-7471-000000001b44 44071 1727204700.78196: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204700.78249: no more pending results, returning what we have 44071 1727204700.78253: results queue empty 44071 1727204700.78254: checking for any_errors_fatal 44071 1727204700.78264: done checking for any_errors_fatal 44071 1727204700.78265: checking for max_fail_percentage 44071 1727204700.78268: done checking for max_fail_percentage 44071 1727204700.78269: checking to see if all hosts have failed and the running result is not ok 44071 1727204700.78270: done checking to see if all hosts have failed 44071 1727204700.78271: getting the remaining hosts for this loop 44071 1727204700.78272: done getting the remaining hosts for this loop 44071 1727204700.78277: getting the next task for host managed-node2 44071 1727204700.78287: done getting next task for host managed-node2 44071 1727204700.78292: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204700.78298: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204700.78329: getting variables 44071 1727204700.78330: in VariableManager get_vars() 44071 1727204700.78387: Calling all_inventory to load vars for managed-node2 44071 1727204700.78390: Calling groups_inventory to load vars for managed-node2 44071 1727204700.78392: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204700.78403: Calling all_plugins_play to load vars for managed-node2 44071 1727204700.78406: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204700.78409: Calling groups_plugins_play to load vars for managed-node2 44071 1727204700.80149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204700.81411: done with get_vars() 44071 1727204700.81449: done getting variables 44071 1727204700.81501: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:05:00 -0400 (0:00:00.047) 0:01:53.131 ***** 44071 1727204700.81534: entering _queue_task() for managed-node2/package 44071 1727204700.81895: worker is 1 (out of 1 available) 44071 1727204700.81910: exiting _queue_task() for managed-node2/package 44071 1727204700.81925: done queuing things up, now waiting for results queue to drain 44071 1727204700.81927: waiting for pending results... 
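Similarly, a hedged sketch of the task at main.yml:96; the package name comes straight from the task title, everything else (module parameters, state) is assumed:

  - name: Install python3-libnmstate when using network_state variable
    ansible.builtin.package:
      name: python3-libnmstate
      state: present
    when: network_state != {}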
44071 1727204700.82498: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204700.82504: in run() - task 127b8e07-fff9-c964-7471-000000001b45 44071 1727204700.82508: variable 'ansible_search_path' from source: unknown 44071 1727204700.82511: variable 'ansible_search_path' from source: unknown 44071 1727204700.82527: calling self._execute() 44071 1727204700.82659: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204700.82674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204700.82687: variable 'omit' from source: magic vars 44071 1727204700.83131: variable 'ansible_distribution_major_version' from source: facts 44071 1727204700.83250: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204700.83307: variable 'network_state' from source: role '' defaults 44071 1727204700.83325: Evaluated conditional (network_state != {}): False 44071 1727204700.83336: when evaluation is False, skipping this task 44071 1727204700.83344: _execute() done 44071 1727204700.83352: dumping result to json 44071 1727204700.83362: done dumping result, returning 44071 1727204700.83377: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-c964-7471-000000001b45] 44071 1727204700.83387: sending task result for task 127b8e07-fff9-c964-7471-000000001b45 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204700.83688: no more pending results, returning what we have 44071 1727204700.83693: results queue empty 44071 1727204700.83695: checking for any_errors_fatal 44071 1727204700.83707: done checking for any_errors_fatal 44071 1727204700.83708: checking for max_fail_percentage 44071 1727204700.83710: done checking for max_fail_percentage 44071 1727204700.83711: checking to see if all hosts have failed and the running result is not ok 44071 1727204700.83712: done checking to see if all hosts have failed 44071 1727204700.83713: getting the remaining hosts for this loop 44071 1727204700.83715: done getting the remaining hosts for this loop 44071 1727204700.83722: getting the next task for host managed-node2 44071 1727204700.83737: done getting next task for host managed-node2 44071 1727204700.83742: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204700.83749: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204700.83789: getting variables 44071 1727204700.83791: in VariableManager get_vars() 44071 1727204700.83847: Calling all_inventory to load vars for managed-node2 44071 1727204700.83851: Calling groups_inventory to load vars for managed-node2 44071 1727204700.83853: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204700.84071: Calling all_plugins_play to load vars for managed-node2 44071 1727204700.84076: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204700.84084: done sending task result for task 127b8e07-fff9-c964-7471-000000001b45 44071 1727204700.84087: WORKER PROCESS EXITING 44071 1727204700.84091: Calling groups_plugins_play to load vars for managed-node2 44071 1727204700.86129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204700.88430: done with get_vars() 44071 1727204700.88484: done getting variables 44071 1727204700.88558: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:05:00 -0400 (0:00:00.070) 0:01:53.202 ***** 44071 1727204700.88603: entering _queue_task() for managed-node2/service 44071 1727204700.89032: worker is 1 (out of 1 available) 44071 1727204700.89051: exiting _queue_task() for managed-node2/service 44071 1727204700.89068: done queuing things up, now waiting for results queue to drain 44071 1727204700.89070: waiting for pending results... 
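A hedged sketch of the restart task at main.yml:109, built from the service action module loaded above and the wireless/team conditional reported below; the service name is inferred from the task title rather than confirmed by this log:

  - name: Restart NetworkManager due to wireless or team interfaces
    ansible.builtin.service:
      name: NetworkManager   # inferred from the task title
      state: restarted
    when: __network_wireless_connections_defined or __network_team_connections_defined

As with the earlier wireless/team checks, the conditional evaluates to False in this run, so no restart is attempted.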
44071 1727204700.89418: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204700.89623: in run() - task 127b8e07-fff9-c964-7471-000000001b46 44071 1727204700.89658: variable 'ansible_search_path' from source: unknown 44071 1727204700.89671: variable 'ansible_search_path' from source: unknown 44071 1727204700.89719: calling self._execute() 44071 1727204700.89850: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204700.89869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204700.89888: variable 'omit' from source: magic vars 44071 1727204700.90339: variable 'ansible_distribution_major_version' from source: facts 44071 1727204700.90362: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204700.90493: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204700.90717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204700.93378: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204700.93464: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204700.93514: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204700.93567: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204700.93658: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204700.93705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204700.93759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204700.93797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204700.93848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204700.93872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204700.93982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204700.93986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204700.93989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 44071 1727204700.94039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204700.94057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204700.94115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204700.94149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204700.94183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204700.94240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204700.94260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204700.94498: variable 'network_connections' from source: include params 44071 1727204700.94632: variable 'interface' from source: play vars 44071 1727204700.94639: variable 'interface' from source: play vars 44071 1727204700.94714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204700.94946: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204700.95004: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204700.95048: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204700.95093: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204700.95148: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204700.95178: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204700.95212: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204700.95247: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204700.95313: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204700.95594: variable 'network_connections' from source: include params 44071 1727204700.95605: variable 'interface' 
from source: play vars 44071 1727204700.95682: variable 'interface' from source: play vars 44071 1727204700.95711: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204700.95720: when evaluation is False, skipping this task 44071 1727204700.95735: _execute() done 44071 1727204700.95870: dumping result to json 44071 1727204700.95874: done dumping result, returning 44071 1727204700.95876: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000001b46] 44071 1727204700.95878: sending task result for task 127b8e07-fff9-c964-7471-000000001b46 44071 1727204700.95962: done sending task result for task 127b8e07-fff9-c964-7471-000000001b46 44071 1727204700.95976: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204700.96036: no more pending results, returning what we have 44071 1727204700.96040: results queue empty 44071 1727204700.96042: checking for any_errors_fatal 44071 1727204700.96050: done checking for any_errors_fatal 44071 1727204700.96050: checking for max_fail_percentage 44071 1727204700.96052: done checking for max_fail_percentage 44071 1727204700.96053: checking to see if all hosts have failed and the running result is not ok 44071 1727204700.96054: done checking to see if all hosts have failed 44071 1727204700.96055: getting the remaining hosts for this loop 44071 1727204700.96056: done getting the remaining hosts for this loop 44071 1727204700.96061: getting the next task for host managed-node2 44071 1727204700.96074: done getting next task for host managed-node2 44071 1727204700.96079: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204700.96085: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204700.96118: getting variables 44071 1727204700.96120: in VariableManager get_vars() 44071 1727204700.96378: Calling all_inventory to load vars for managed-node2 44071 1727204700.96382: Calling groups_inventory to load vars for managed-node2 44071 1727204700.96384: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204700.96395: Calling all_plugins_play to load vars for managed-node2 44071 1727204700.96398: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204700.96401: Calling groups_plugins_play to load vars for managed-node2 44071 1727204700.98581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204701.01013: done with get_vars() 44071 1727204701.01072: done getting variables 44071 1727204701.01147: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.125) 0:01:53.328 ***** 44071 1727204701.01199: entering _queue_task() for managed-node2/service 44071 1727204701.01674: worker is 1 (out of 1 available) 44071 1727204701.01690: exiting _queue_task() for managed-node2/service 44071 1727204701.01707: done queuing things up, now waiting for results queue to drain 44071 1727204701.01709: waiting for pending results... 44071 1727204701.02086: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204701.02272: in run() - task 127b8e07-fff9-c964-7471-000000001b47 44071 1727204701.02276: variable 'ansible_search_path' from source: unknown 44071 1727204701.02279: variable 'ansible_search_path' from source: unknown 44071 1727204701.02287: calling self._execute() 44071 1727204701.02404: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204701.02418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204701.02437: variable 'omit' from source: magic vars 44071 1727204701.02875: variable 'ansible_distribution_major_version' from source: facts 44071 1727204701.02893: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204701.03275: variable 'network_provider' from source: set_fact 44071 1727204701.03279: variable 'network_state' from source: role '' defaults 44071 1727204701.03283: Evaluated conditional (network_provider == "nm" or network_state != {}): True 44071 1727204701.03285: variable 'omit' from source: magic vars 44071 1727204701.03287: variable 'omit' from source: magic vars 44071 1727204701.03289: variable 'network_service_name' from source: role '' defaults 44071 1727204701.03303: variable 'network_service_name' from source: role '' defaults 44071 1727204701.03427: variable '__network_provider_setup' from source: role '' defaults 44071 1727204701.03441: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204701.03512: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204701.03525: variable '__network_packages_default_nm' from source: role '' defaults 
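For the "Enable and start NetworkManager" task the provider condition (network_provider == "nm" or network_state != {}) evaluates True, so the records below resolve the role's service and package defaults, set the SSH connection variables, and then execute the systemd module remotely: resolve the remote home with echo ~, create a per-task temp directory under ~/.ansible/tmp, transfer the AnsiballZ_systemd.py payload, chmod it, run it with /usr/bin/python3.12, and remove the temp directory afterwards. A rough local approximation of that cycle, assuming a hypothetical run_over_ssh helper in place of Ansible's multiplexed SSH connection plugin, scp in place of the sftp transfer seen in the log, and an AnsiballZ_systemd.py file already present locally:

    # Rough sketch of the _low_level_execute_command() cycle visible in the
    # records below. `run_over_ssh` is a hypothetical helper; the real run
    # reuses one SSH ControlMaster connection and pushes the file over sftp.
    import subprocess
    import time

    HOST = "10.31.47.73"  # target address as it appears in the log

    def run_over_ssh(command: str) -> str:
        """Run a shell command on the target over ssh and return its stdout."""
        proc = subprocess.run(["ssh", HOST, command],
                              capture_output=True, text=True, check=True)
        return proc.stdout.strip()

    home = run_over_ssh("echo ~ && sleep 0")                         # -> /root in this run
    tmpdir = f"{home}/.ansible/tmp/ansible-tmp-{time.time()}-sketch"
    run_over_ssh(f'( umask 77 && mkdir -p "{tmpdir}" ) && sleep 0')  # tight-permission temp dir
    # AnsiballZ_systemd.py is the self-contained payload Ansible builds locally;
    # it is assumed to already exist in the current directory for this sketch.
    subprocess.run(["scp", "AnsiballZ_systemd.py", f"{HOST}:{tmpdir}/"], check=True)
    run_over_ssh(f"chmod u+x {tmpdir}/ {tmpdir}/AnsiballZ_systemd.py && sleep 0")
    print(run_over_ssh(f"/usr/bin/python3.12 {tmpdir}/AnsiballZ_systemd.py && sleep 0"))
    run_over_ssh(f"rm -f -r {tmpdir}/ > /dev/null 2>&1 && sleep 0")  # clean up the temp dir

The large JSON block the module prints back is essentially the key/value property set that systemctl show NetworkManager.service reports, plus the module's own fields; since the unit is already active and enabled, the task returns "changed": false with "enabled": true and "state": "started".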
44071 1727204701.03602: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204701.03864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204701.06379: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204701.06464: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204701.06512: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204701.06578: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204701.06611: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204701.06705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204701.06744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204701.06778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204701.06827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204701.06848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204701.06905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204701.06935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204701.06967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204701.07014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204701.07032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204701.07297: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204701.07438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204701.07470: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204701.07502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204701.07551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204701.07671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204701.07678: variable 'ansible_python' from source: facts 44071 1727204701.07702: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204701.07799: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204701.07892: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204701.08017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204701.08047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204701.08078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204701.08118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204701.08138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204701.08273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204701.08285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204701.08310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204701.08362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204701.08387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204701.08555: variable 'network_connections' from 
source: include params 44071 1727204701.08577: variable 'interface' from source: play vars 44071 1727204701.08668: variable 'interface' from source: play vars 44071 1727204701.08800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204701.09027: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204701.09100: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204701.09155: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204701.09209: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204701.09370: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204701.09375: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204701.09395: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204701.09440: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204701.09501: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204701.09840: variable 'network_connections' from source: include params 44071 1727204701.09851: variable 'interface' from source: play vars 44071 1727204701.09932: variable 'interface' from source: play vars 44071 1727204701.09983: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204701.10079: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204701.10393: variable 'network_connections' from source: include params 44071 1727204701.10403: variable 'interface' from source: play vars 44071 1727204701.10576: variable 'interface' from source: play vars 44071 1727204701.10580: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204701.10599: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204701.10930: variable 'network_connections' from source: include params 44071 1727204701.10943: variable 'interface' from source: play vars 44071 1727204701.11021: variable 'interface' from source: play vars 44071 1727204701.11090: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204701.11160: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204701.11176: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204701.11246: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204701.11499: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204701.12075: variable 'network_connections' from source: include params 44071 1727204701.12087: variable 'interface' from source: play vars 44071 1727204701.12161: variable 'interface' from 
source: play vars 44071 1727204701.12184: variable 'ansible_distribution' from source: facts 44071 1727204701.12192: variable '__network_rh_distros' from source: role '' defaults 44071 1727204701.12272: variable 'ansible_distribution_major_version' from source: facts 44071 1727204701.12276: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204701.12427: variable 'ansible_distribution' from source: facts 44071 1727204701.12440: variable '__network_rh_distros' from source: role '' defaults 44071 1727204701.12450: variable 'ansible_distribution_major_version' from source: facts 44071 1727204701.12461: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204701.12656: variable 'ansible_distribution' from source: facts 44071 1727204701.12667: variable '__network_rh_distros' from source: role '' defaults 44071 1727204701.12677: variable 'ansible_distribution_major_version' from source: facts 44071 1727204701.12719: variable 'network_provider' from source: set_fact 44071 1727204701.12751: variable 'omit' from source: magic vars 44071 1727204701.12792: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204701.12826: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204701.12854: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204701.12880: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204701.12970: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204701.12973: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204701.12976: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204701.12978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204701.13057: Set connection var ansible_connection to ssh 44071 1727204701.13072: Set connection var ansible_timeout to 10 44071 1727204701.13082: Set connection var ansible_pipelining to False 44071 1727204701.13092: Set connection var ansible_shell_type to sh 44071 1727204701.13102: Set connection var ansible_shell_executable to /bin/sh 44071 1727204701.13114: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204701.13147: variable 'ansible_shell_executable' from source: unknown 44071 1727204701.13154: variable 'ansible_connection' from source: unknown 44071 1727204701.13161: variable 'ansible_module_compression' from source: unknown 44071 1727204701.13169: variable 'ansible_shell_type' from source: unknown 44071 1727204701.13175: variable 'ansible_shell_executable' from source: unknown 44071 1727204701.13182: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204701.13189: variable 'ansible_pipelining' from source: unknown 44071 1727204701.13196: variable 'ansible_timeout' from source: unknown 44071 1727204701.13205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204701.13371: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204701.13379: variable 'omit' from source: magic vars 44071 1727204701.13382: starting attempt loop 44071 1727204701.13384: running the handler 44071 1727204701.13469: variable 'ansible_facts' from source: unknown 44071 1727204701.14543: _low_level_execute_command(): starting 44071 1727204701.14557: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204701.15321: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204701.15382: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204701.15387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204701.15459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204701.15491: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204701.15659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204701.17384: stdout chunk (state=3): >>>/root <<< 44071 1727204701.17599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204701.17603: stdout chunk (state=3): >>><<< 44071 1727204701.17606: stderr chunk (state=3): >>><<< 44071 1727204701.17740: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 
1727204701.17744: _low_level_execute_command(): starting 44071 1727204701.17748: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204701.1763215-50649-236777911496975 `" && echo ansible-tmp-1727204701.1763215-50649-236777911496975="` echo /root/.ansible/tmp/ansible-tmp-1727204701.1763215-50649-236777911496975 `" ) && sleep 0' 44071 1727204701.18395: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204701.18436: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204701.18448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204701.18551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204701.18569: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204701.18594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204701.18607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204701.18714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204701.20770: stdout chunk (state=3): >>>ansible-tmp-1727204701.1763215-50649-236777911496975=/root/.ansible/tmp/ansible-tmp-1727204701.1763215-50649-236777911496975 <<< 44071 1727204701.20979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204701.21006: stdout chunk (state=3): >>><<< 44071 1727204701.21019: stderr chunk (state=3): >>><<< 44071 1727204701.21173: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204701.1763215-50649-236777911496975=/root/.ansible/tmp/ansible-tmp-1727204701.1763215-50649-236777911496975 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204701.21177: variable 'ansible_module_compression' from source: unknown 44071 1727204701.21180: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 44071 1727204701.21241: variable 'ansible_facts' from source: unknown 44071 1727204701.21518: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204701.1763215-50649-236777911496975/AnsiballZ_systemd.py 44071 1727204701.21926: Sending initial data 44071 1727204701.21929: Sent initial data (156 bytes) 44071 1727204701.22590: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204701.22683: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204701.22720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204701.22724: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204701.22752: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204701.22860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204701.24622: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 44071 1727204701.24660: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204701.24727: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204701.24818: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp8_sztuql /root/.ansible/tmp/ansible-tmp-1727204701.1763215-50649-236777911496975/AnsiballZ_systemd.py <<< 44071 1727204701.24822: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204701.1763215-50649-236777911496975/AnsiballZ_systemd.py" <<< 44071 1727204701.24890: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp8_sztuql" to remote "/root/.ansible/tmp/ansible-tmp-1727204701.1763215-50649-236777911496975/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204701.1763215-50649-236777911496975/AnsiballZ_systemd.py" <<< 44071 1727204701.26769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204701.26838: stderr chunk (state=3): >>><<< 44071 1727204701.26842: stdout chunk (state=3): >>><<< 44071 1727204701.26946: done transferring module to remote 44071 1727204701.26950: _low_level_execute_command(): starting 44071 1727204701.26953: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204701.1763215-50649-236777911496975/ /root/.ansible/tmp/ansible-tmp-1727204701.1763215-50649-236777911496975/AnsiballZ_systemd.py && sleep 0' 44071 1727204701.27777: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204701.27782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204701.27785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204701.27803: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204701.28103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204701.30306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204701.30311: stdout chunk (state=3): >>><<< 44071 1727204701.30314: stderr chunk (state=3): >>><<< 44071 1727204701.30317: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204701.30320: _low_level_execute_command(): starting 44071 1727204701.30322: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204701.1763215-50649-236777911496975/AnsiballZ_systemd.py && sleep 0' 44071 1727204701.31362: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204701.31370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204701.31374: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204701.31377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204701.31925: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204701.32010: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204701.64201: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 
14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4595712", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3509665792", "CPUUsageNSec": "1633723000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitC<<< 44071 1727204701.64217: stdout chunk (state=3): >>>ORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", 
"LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", 
"Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext":<<< 44071 1727204701.64221: stdout chunk (state=3): >>> "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 44071 1727204701.66272: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204701.66321: stderr chunk (state=3): >>><<< 44071 1727204701.66325: stdout chunk (state=3): >>><<< 44071 1727204701.66350: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4595712", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3509665792", "CPUUsageNSec": "1633723000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": 
"infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204701.66489: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204701.1763215-50649-236777911496975/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204701.66504: _low_level_execute_command(): starting 44071 1727204701.66509: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204701.1763215-50649-236777911496975/ > /dev/null 2>&1 && sleep 0' 44071 1727204701.66972: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204701.66976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204701.66979: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204701.66991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204701.67050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204701.67057: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204701.67060: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 44071 1727204701.67128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204701.69092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204701.69130: stderr chunk (state=3): >>><<< 44071 1727204701.69137: stdout chunk (state=3): >>><<< 44071 1727204701.69148: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204701.69156: handler run complete 44071 1727204701.69202: attempt loop complete, returning result 44071 1727204701.69205: _execute() done 44071 1727204701.69208: dumping result to json 44071 1727204701.69227: done dumping result, returning 44071 1727204701.69238: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-c964-7471-000000001b47] 44071 1727204701.69241: sending task result for task 127b8e07-fff9-c964-7471-000000001b47 44071 1727204701.69471: done sending task result for task 127b8e07-fff9-c964-7471-000000001b47 44071 1727204701.69474: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204701.69540: no more pending results, returning what we have 44071 1727204701.69544: results queue empty 44071 1727204701.69544: checking for any_errors_fatal 44071 1727204701.69551: done checking for any_errors_fatal 44071 1727204701.69552: checking for max_fail_percentage 44071 1727204701.69554: done checking for max_fail_percentage 44071 1727204701.69555: checking to see if all hosts have failed and the running result is not ok 44071 1727204701.69555: done checking to see if all hosts have failed 44071 1727204701.69556: getting the remaining hosts for this loop 44071 1727204701.69558: done getting the remaining hosts for this loop 44071 1727204701.69562: getting the next task for host managed-node2 44071 1727204701.69572: done getting next task for host managed-node2 44071 1727204701.69576: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204701.69581: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204701.69596: getting variables 44071 1727204701.69598: in VariableManager get_vars() 44071 1727204701.69642: Calling all_inventory to load vars for managed-node2 44071 1727204701.69644: Calling groups_inventory to load vars for managed-node2 44071 1727204701.69703: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204701.69714: Calling all_plugins_play to load vars for managed-node2 44071 1727204701.69717: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204701.69720: Calling groups_plugins_play to load vars for managed-node2 44071 1727204701.70753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204701.72108: done with get_vars() 44071 1727204701.72129: done getting variables 44071 1727204701.72188: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.710) 0:01:54.038 ***** 44071 1727204701.72221: entering _queue_task() for managed-node2/service 44071 1727204701.72525: worker is 1 (out of 1 available) 44071 1727204701.72540: exiting _queue_task() for managed-node2/service 44071 1727204701.72555: done queuing things up, now waiting for results queue to drain 44071 1727204701.72557: waiting for pending results... 
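
Editor's note for readers following the trace: the _execute_module call recorded above shows the arguments the role ultimately handed to ansible.legacy.systemd for the "Enable and start NetworkManager" task (name=NetworkManager, state=started, enabled=True) with '_ansible_no_log': True in effect, which is why the task result prints only the "censored" placeholder. The following is a minimal sketch reconstructed from those resolved arguments, not the role's actual task source (which is not part of this excerpt and uses its own variables):

    # sketch.yml -- illustrative reconstruction from the module args in the trace
    - hosts: managed-node2
      tasks:
        - name: Enable and start NetworkManager
          ansible.builtin.systemd:
            name: NetworkManager   # value seen in the resolved module args above
            state: started
            enabled: true
          no_log: true             # matches '_ansible_no_log': True in the trace
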
44071 1727204701.72769: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204701.72883: in run() - task 127b8e07-fff9-c964-7471-000000001b48 44071 1727204701.72898: variable 'ansible_search_path' from source: unknown 44071 1727204701.72903: variable 'ansible_search_path' from source: unknown 44071 1727204701.72938: calling self._execute() 44071 1727204701.73023: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204701.73029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204701.73040: variable 'omit' from source: magic vars 44071 1727204701.73358: variable 'ansible_distribution_major_version' from source: facts 44071 1727204701.73372: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204701.73462: variable 'network_provider' from source: set_fact 44071 1727204701.73467: Evaluated conditional (network_provider == "nm"): True 44071 1727204701.73539: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204701.73608: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204701.73743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204701.75412: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204701.75462: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204701.75492: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204701.75523: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204701.75545: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204701.75626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204701.75650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204701.75669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204701.75698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204701.75709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204701.75751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204701.75769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 44071 1727204701.75788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204701.75815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204701.75826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204701.75862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204701.75881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204701.75899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204701.75925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204701.75937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204701.76056: variable 'network_connections' from source: include params 44071 1727204701.76069: variable 'interface' from source: play vars 44071 1727204701.76128: variable 'interface' from source: play vars 44071 1727204701.76189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204701.76315: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204701.76346: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204701.76371: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204701.76397: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204701.76437: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204701.76452: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204701.76472: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204701.76491: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
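
Editor's note: the variables being resolved in this stretch of the trace (__network_wpa_supplicant_required, __network_ieee802_1x_connections_defined, network_connections, interface) feed the when: condition of the "Enable and start wpa_supplicant" task; a few entries further down the conditional evaluates to False and the task is skipped. A hedged sketch of that kind of guarded task is shown below; the role's real task, the unit name it manages, and the defaults behind __network_wpa_supplicant_required are assumptions here, as they are not visible in this excerpt:

    # sketch of a conditionally skipped service task (illustrative only)
    - hosts: managed-node2
      vars:
        network_provider: nm                      # value from set_fact in the trace
        __network_wpa_supplicant_required: false  # evaluates to False below, so the task skips
      tasks:
        - name: Enable and start wpa_supplicant
          ansible.builtin.service:
            name: wpa_supplicant                  # assumed unit name, not shown in the log
            state: started
            enabled: true
          when:
            - network_provider == "nm"
            - __network_wpa_supplicant_required | bool
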
44071 1727204701.76538: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204701.76727: variable 'network_connections' from source: include params 44071 1727204701.76731: variable 'interface' from source: play vars 44071 1727204701.76783: variable 'interface' from source: play vars 44071 1727204701.76808: Evaluated conditional (__network_wpa_supplicant_required): False 44071 1727204701.76811: when evaluation is False, skipping this task 44071 1727204701.76814: _execute() done 44071 1727204701.76818: dumping result to json 44071 1727204701.76822: done dumping result, returning 44071 1727204701.76831: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-c964-7471-000000001b48] 44071 1727204701.76845: sending task result for task 127b8e07-fff9-c964-7471-000000001b48 44071 1727204701.76940: done sending task result for task 127b8e07-fff9-c964-7471-000000001b48 44071 1727204701.76944: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 44071 1727204701.76996: no more pending results, returning what we have 44071 1727204701.77000: results queue empty 44071 1727204701.77001: checking for any_errors_fatal 44071 1727204701.77018: done checking for any_errors_fatal 44071 1727204701.77019: checking for max_fail_percentage 44071 1727204701.77021: done checking for max_fail_percentage 44071 1727204701.77022: checking to see if all hosts have failed and the running result is not ok 44071 1727204701.77023: done checking to see if all hosts have failed 44071 1727204701.77023: getting the remaining hosts for this loop 44071 1727204701.77025: done getting the remaining hosts for this loop 44071 1727204701.77030: getting the next task for host managed-node2 44071 1727204701.77042: done getting next task for host managed-node2 44071 1727204701.77046: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204701.77051: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204701.77087: getting variables 44071 1727204701.77089: in VariableManager get_vars() 44071 1727204701.77136: Calling all_inventory to load vars for managed-node2 44071 1727204701.77139: Calling groups_inventory to load vars for managed-node2 44071 1727204701.77141: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204701.77152: Calling all_plugins_play to load vars for managed-node2 44071 1727204701.77155: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204701.77158: Calling groups_plugins_play to load vars for managed-node2 44071 1727204701.78229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204701.79477: done with get_vars() 44071 1727204701.79510: done getting variables 44071 1727204701.79563: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.073) 0:01:54.112 ***** 44071 1727204701.79596: entering _queue_task() for managed-node2/service 44071 1727204701.79909: worker is 1 (out of 1 available) 44071 1727204701.79927: exiting _queue_task() for managed-node2/service 44071 1727204701.79945: done queuing things up, now waiting for results queue to drain 44071 1727204701.79947: waiting for pending results... 44071 1727204701.80155: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204701.80260: in run() - task 127b8e07-fff9-c964-7471-000000001b49 44071 1727204701.80275: variable 'ansible_search_path' from source: unknown 44071 1727204701.80281: variable 'ansible_search_path' from source: unknown 44071 1727204701.80316: calling self._execute() 44071 1727204701.80402: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204701.80410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204701.80419: variable 'omit' from source: magic vars 44071 1727204701.80740: variable 'ansible_distribution_major_version' from source: facts 44071 1727204701.80752: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204701.80848: variable 'network_provider' from source: set_fact 44071 1727204701.80854: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204701.80857: when evaluation is False, skipping this task 44071 1727204701.80860: _execute() done 44071 1727204701.80867: dumping result to json 44071 1727204701.80870: done dumping result, returning 44071 1727204701.80878: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-c964-7471-000000001b49] 44071 1727204701.80883: sending task result for task 127b8e07-fff9-c964-7471-000000001b49 44071 1727204701.80988: done sending task result for task 127b8e07-fff9-c964-7471-000000001b49 44071 1727204701.80991: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 
1727204701.81041: no more pending results, returning what we have 44071 1727204701.81045: results queue empty 44071 1727204701.81046: checking for any_errors_fatal 44071 1727204701.81056: done checking for any_errors_fatal 44071 1727204701.81056: checking for max_fail_percentage 44071 1727204701.81058: done checking for max_fail_percentage 44071 1727204701.81059: checking to see if all hosts have failed and the running result is not ok 44071 1727204701.81060: done checking to see if all hosts have failed 44071 1727204701.81061: getting the remaining hosts for this loop 44071 1727204701.81062: done getting the remaining hosts for this loop 44071 1727204701.81069: getting the next task for host managed-node2 44071 1727204701.81079: done getting next task for host managed-node2 44071 1727204701.81084: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204701.81090: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204701.81120: getting variables 44071 1727204701.81122: in VariableManager get_vars() 44071 1727204701.81176: Calling all_inventory to load vars for managed-node2 44071 1727204701.81180: Calling groups_inventory to load vars for managed-node2 44071 1727204701.81182: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204701.81193: Calling all_plugins_play to load vars for managed-node2 44071 1727204701.81195: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204701.81198: Calling groups_plugins_play to load vars for managed-node2 44071 1727204701.88327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204701.90119: done with get_vars() 44071 1727204701.90154: done getting variables 44071 1727204701.90198: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.106) 0:01:54.218 ***** 44071 1727204701.90225: entering _queue_task() for managed-node2/copy 44071 1727204701.90540: worker is 1 (out of 1 available) 44071 1727204701.90555: exiting _queue_task() for managed-node2/copy 44071 1727204701.90571: done queuing things up, now waiting for results queue to drain 44071 1727204701.90574: waiting for pending results... 44071 1727204701.90797: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204701.90926: in run() - task 127b8e07-fff9-c964-7471-000000001b4a 44071 1727204701.90944: variable 'ansible_search_path' from source: unknown 44071 1727204701.90948: variable 'ansible_search_path' from source: unknown 44071 1727204701.90985: calling self._execute() 44071 1727204701.91081: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204701.91087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204701.91097: variable 'omit' from source: magic vars 44071 1727204701.91430: variable 'ansible_distribution_major_version' from source: facts 44071 1727204701.91444: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204701.91542: variable 'network_provider' from source: set_fact 44071 1727204701.91548: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204701.91552: when evaluation is False, skipping this task 44071 1727204701.91555: _execute() done 44071 1727204701.91559: dumping result to json 44071 1727204701.91562: done dumping result, returning 44071 1727204701.91574: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-c964-7471-000000001b4a] 44071 1727204701.91587: sending task result for task 127b8e07-fff9-c964-7471-000000001b4a 44071 1727204701.91696: done sending task result for task 127b8e07-fff9-c964-7471-000000001b4a 44071 1727204701.91699: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 44071 1727204701.91761: no more pending results, returning what we have 44071 1727204701.91767: results queue empty 44071 1727204701.91768: checking for any_errors_fatal 44071 1727204701.91776: done checking for any_errors_fatal 44071 1727204701.91777: checking for max_fail_percentage 44071 1727204701.91778: done checking for max_fail_percentage 44071 1727204701.91779: checking to see if all hosts have failed and the running result is not ok 44071 1727204701.91780: done checking to see if all hosts have failed 44071 1727204701.91781: getting the remaining hosts for this loop 44071 1727204701.91782: done getting the remaining hosts for this loop 44071 1727204701.91788: getting the next task for host managed-node2 44071 1727204701.91797: done getting next task for host managed-node2 44071 1727204701.91801: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204701.91806: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204701.91838: getting variables 44071 1727204701.91840: in VariableManager get_vars() 44071 1727204701.91907: Calling all_inventory to load vars for managed-node2 44071 1727204701.91910: Calling groups_inventory to load vars for managed-node2 44071 1727204701.91912: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204701.91926: Calling all_plugins_play to load vars for managed-node2 44071 1727204701.91929: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204701.91932: Calling groups_plugins_play to load vars for managed-node2 44071 1727204701.93719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204701.95959: done with get_vars() 44071 1727204701.96008: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.058) 0:01:54.277 ***** 44071 1727204701.96114: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204701.96521: worker is 1 (out of 1 available) 44071 1727204701.96536: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204701.96551: done queuing things up, now waiting for results queue to drain 44071 1727204701.96553: waiting for pending results... 44071 1727204701.96989: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204701.97071: in run() - task 127b8e07-fff9-c964-7471-000000001b4b 44071 1727204701.97108: variable 'ansible_search_path' from source: unknown 44071 1727204701.97111: variable 'ansible_search_path' from source: unknown 44071 1727204701.97170: calling self._execute() 44071 1727204701.97272: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204701.97286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204701.97300: variable 'omit' from source: magic vars 44071 1727204701.97761: variable 'ansible_distribution_major_version' from source: facts 44071 1727204701.97764: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204701.97773: variable 'omit' from source: magic vars 44071 1727204701.97842: variable 'omit' from source: magic vars 44071 1727204701.98085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204702.01015: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204702.01102: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204702.01158: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204702.01241: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204702.01252: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204702.01361: variable 'network_provider' from source: set_fact 44071 1727204702.01521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 44071 1727204702.01570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204702.01771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204702.01775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204702.01778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204702.01781: variable 'omit' from source: magic vars 44071 1727204702.01893: variable 'omit' from source: magic vars 44071 1727204702.02023: variable 'network_connections' from source: include params 44071 1727204702.02045: variable 'interface' from source: play vars 44071 1727204702.02123: variable 'interface' from source: play vars 44071 1727204702.02290: variable 'omit' from source: magic vars 44071 1727204702.02336: variable '__lsr_ansible_managed' from source: task vars 44071 1727204702.02378: variable '__lsr_ansible_managed' from source: task vars 44071 1727204702.02594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 44071 1727204702.02856: Loaded config def from plugin (lookup/template) 44071 1727204702.02868: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 44071 1727204702.02970: File lookup term: get_ansible_managed.j2 44071 1727204702.02974: variable 'ansible_search_path' from source: unknown 44071 1727204702.02977: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 44071 1727204702.02981: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 44071 1727204702.02986: variable 'ansible_search_path' from source: unknown 44071 1727204702.10087: variable 'ansible_managed' from source: unknown 44071 1727204702.10213: variable 'omit' from source: magic vars 44071 1727204702.10240: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204702.10264: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204702.10282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204702.10300: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204702.10309: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204702.10332: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204702.10338: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204702.10341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204702.10417: Set connection var ansible_connection to ssh 44071 1727204702.10424: Set connection var ansible_timeout to 10 44071 1727204702.10430: Set connection var ansible_pipelining to False 44071 1727204702.10437: Set connection var ansible_shell_type to sh 44071 1727204702.10440: Set connection var ansible_shell_executable to /bin/sh 44071 1727204702.10447: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204702.10468: variable 'ansible_shell_executable' from source: unknown 44071 1727204702.10472: variable 'ansible_connection' from source: unknown 44071 1727204702.10474: variable 'ansible_module_compression' from source: unknown 44071 1727204702.10477: variable 'ansible_shell_type' from source: unknown 44071 1727204702.10480: variable 'ansible_shell_executable' from source: unknown 44071 1727204702.10483: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204702.10488: variable 'ansible_pipelining' from source: unknown 44071 1727204702.10490: variable 'ansible_timeout' from source: unknown 44071 1727204702.10492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204702.10619: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204702.10632: variable 'omit' from source: magic vars 44071 1727204702.10641: starting attempt loop 44071 1727204702.10644: running the handler 44071 1727204702.10683: _low_level_execute_command(): starting 44071 1727204702.10687: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204702.11414: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204702.11419: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204702.11422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204702.11472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204702.11476: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204702.11480: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204702.11483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204702.11486: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204702.11488: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204702.11490: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44071 1727204702.11499: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204702.11505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204702.11519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204702.11525: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204702.11668: stderr chunk (state=3): >>>debug2: match found <<< 44071 1727204702.11672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204702.11678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204702.11681: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204702.11684: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204702.11802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204702.13563: stdout chunk (state=3): >>>/root <<< 44071 1727204702.13688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204702.13746: stderr chunk (state=3): >>><<< 44071 1727204702.13749: stdout chunk (state=3): >>><<< 44071 1727204702.13763: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204702.13783: _low_level_execute_command(): starting 44071 1727204702.13786: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204702.1377037-50683-3642535310043 `" && echo ansible-tmp-1727204702.1377037-50683-3642535310043="` echo /root/.ansible/tmp/ansible-tmp-1727204702.1377037-50683-3642535310043 `" ) && sleep 0' 44071 1727204702.14378: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204702.14472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204702.14534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204702.16551: stdout chunk (state=3): >>>ansible-tmp-1727204702.1377037-50683-3642535310043=/root/.ansible/tmp/ansible-tmp-1727204702.1377037-50683-3642535310043 <<< 44071 1727204702.16656: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204702.16724: stderr chunk (state=3): >>><<< 44071 1727204702.16727: stdout chunk (state=3): >>><<< 44071 1727204702.16746: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204702.1377037-50683-3642535310043=/root/.ansible/tmp/ansible-tmp-1727204702.1377037-50683-3642535310043 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204702.16795: variable 'ansible_module_compression' from source: unknown 44071 1727204702.16835: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 44071 1727204702.16867: variable 'ansible_facts' from source: unknown 44071 1727204702.16935: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204702.1377037-50683-3642535310043/AnsiballZ_network_connections.py 44071 1727204702.17057: Sending initial data 44071 1727204702.17064: Sent initial data (166 bytes) 44071 1727204702.17796: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204702.17817: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204702.17850: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204702.17863: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204702.17968: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204702.19556: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204702.19619: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204702.19691: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp1xek4sig /root/.ansible/tmp/ansible-tmp-1727204702.1377037-50683-3642535310043/AnsiballZ_network_connections.py <<< 44071 1727204702.19700: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204702.1377037-50683-3642535310043/AnsiballZ_network_connections.py" <<< 44071 1727204702.19762: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp1xek4sig" to remote "/root/.ansible/tmp/ansible-tmp-1727204702.1377037-50683-3642535310043/AnsiballZ_network_connections.py" <<< 44071 1727204702.19766: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204702.1377037-50683-3642535310043/AnsiballZ_network_connections.py" <<< 44071 1727204702.20627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204702.20709: stderr chunk (state=3): >>><<< 44071 1727204702.20713: stdout chunk (state=3): >>><<< 44071 1727204702.20732: done transferring module to remote 44071 1727204702.20749: _low_level_execute_command(): starting 44071 1727204702.20753: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204702.1377037-50683-3642535310043/ /root/.ansible/tmp/ansible-tmp-1727204702.1377037-50683-3642535310043/AnsiballZ_network_connections.py && sleep 0' 44071 1727204702.21495: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204702.21553: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204702.23415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204702.23486: stderr chunk (state=3): >>><<< 44071 1727204702.23490: stdout chunk (state=3): >>><<< 44071 1727204702.23503: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204702.23506: _low_level_execute_command(): starting 44071 1727204702.23512: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204702.1377037-50683-3642535310043/AnsiballZ_network_connections.py && sleep 0' 44071 1727204702.24031: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204702.24040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204702.24043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204702.24099: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204702.24107: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204702.24114: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204702.24186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204702.53730: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_36ifxrls/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_36ifxrls/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/2461fed0-dcf1-466d-b59f-3f5d810ecefa: error=unknown <<< 44071 1727204702.53858: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# 
system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 44071 1727204702.55796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204702.55824: stderr chunk (state=3): >>><<< 44071 1727204702.55827: stdout chunk (state=3): >>><<< 44071 1727204702.55973: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_36ifxrls/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_36ifxrls/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/2461fed0-dcf1-466d-b59f-3f5d810ecefa: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
44071 1727204702.55977: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204702.1377037-50683-3642535310043/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204702.55979: _low_level_execute_command(): starting 44071 1727204702.55982: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204702.1377037-50683-3642535310043/ > /dev/null 2>&1 && sleep 0' 44071 1727204702.56587: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204702.56604: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204702.56623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204702.56643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204702.56659: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204702.56730: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204702.56764: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204702.56786: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204702.56812: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204702.56914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204702.58897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204702.58954: stderr chunk (state=3): >>><<< 44071 1727204702.58958: stdout chunk (state=3): >>><<< 44071 1727204702.58984: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204702.58992: handler run complete 44071 1727204702.59071: attempt loop complete, returning result 44071 1727204702.59075: _execute() done 44071 1727204702.59077: dumping result to json 44071 1727204702.59079: done dumping result, returning 44071 1727204702.59082: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-c964-7471-000000001b4b] 44071 1727204702.59084: sending task result for task 127b8e07-fff9-c964-7471-000000001b4b 44071 1727204702.59272: done sending task result for task 127b8e07-fff9-c964-7471-000000001b4b 44071 1727204702.59276: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 44071 1727204702.59409: no more pending results, returning what we have 44071 1727204702.59413: results queue empty 44071 1727204702.59414: checking for any_errors_fatal 44071 1727204702.59421: done checking for any_errors_fatal 44071 1727204702.59421: checking for max_fail_percentage 44071 1727204702.59423: done checking for max_fail_percentage 44071 1727204702.59424: checking to see if all hosts have failed and the running result is not ok 44071 1727204702.59425: done checking to see if all hosts have failed 44071 1727204702.59426: getting the remaining hosts for this loop 44071 1727204702.59428: done getting the remaining hosts for this loop 44071 1727204702.59434: getting the next task for host managed-node2 44071 1727204702.59583: done getting next task for host managed-node2 44071 1727204702.59588: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204702.59594: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204702.59611: getting variables 44071 1727204702.59613: in VariableManager get_vars() 44071 1727204702.59660: Calling all_inventory to load vars for managed-node2 44071 1727204702.59663: Calling groups_inventory to load vars for managed-node2 44071 1727204702.59668: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204702.59792: Calling all_plugins_play to load vars for managed-node2 44071 1727204702.59798: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204702.59803: Calling groups_plugins_play to load vars for managed-node2 44071 1727204702.64406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204702.69284: done with get_vars() 44071 1727204702.69330: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:05:02 -0400 (0:00:00.735) 0:01:55.012 ***** 44071 1727204702.69641: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204702.70681: worker is 1 (out of 1 available) 44071 1727204702.70697: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204702.70711: done queuing things up, now waiting for results queue to drain 44071 1727204702.70713: waiting for pending results... 
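The task queued next, "Configure networking state" (tasks/main.yml:171), is skipped in the entries that follow: network_state comes from the role defaults as an empty dict, so the conditional network_state != {} evaluates to False. For contrast, the sketch below shows roughly how a caller would make that branch run by supplying a non-empty, nmstate-style network_state; the bridge definition is a placeholder for illustration, not anything taken from this run.

---
# Illustrative sketch: a non-empty network_state would flip the
# "network_state != {}" conditional to True, so the state-based path
# would be applied instead of being skipped. Placeholder content only.
- name: Apply a declarative network state through the role
  hosts: managed-node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_state:
          interfaces:
            - name: statebr
              type: linux-bridge
              state: up
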
44071 1727204702.71187: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204702.71486: in run() - task 127b8e07-fff9-c964-7471-000000001b4c 44071 1727204702.71772: variable 'ansible_search_path' from source: unknown 44071 1727204702.71776: variable 'ansible_search_path' from source: unknown 44071 1727204702.71780: calling self._execute() 44071 1727204702.71883: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204702.72171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204702.72175: variable 'omit' from source: magic vars 44071 1727204702.72731: variable 'ansible_distribution_major_version' from source: facts 44071 1727204702.73172: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204702.73175: variable 'network_state' from source: role '' defaults 44071 1727204702.73178: Evaluated conditional (network_state != {}): False 44071 1727204702.73180: when evaluation is False, skipping this task 44071 1727204702.73183: _execute() done 44071 1727204702.73185: dumping result to json 44071 1727204702.73187: done dumping result, returning 44071 1727204702.73189: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-c964-7471-000000001b4c] 44071 1727204702.73191: sending task result for task 127b8e07-fff9-c964-7471-000000001b4c skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204702.73552: no more pending results, returning what we have 44071 1727204702.73556: results queue empty 44071 1727204702.73558: checking for any_errors_fatal 44071 1727204702.73575: done checking for any_errors_fatal 44071 1727204702.73576: checking for max_fail_percentage 44071 1727204702.73578: done checking for max_fail_percentage 44071 1727204702.73579: checking to see if all hosts have failed and the running result is not ok 44071 1727204702.73579: done checking to see if all hosts have failed 44071 1727204702.73580: getting the remaining hosts for this loop 44071 1727204702.73582: done getting the remaining hosts for this loop 44071 1727204702.73586: getting the next task for host managed-node2 44071 1727204702.73596: done getting next task for host managed-node2 44071 1727204702.73600: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204702.73607: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204702.73643: getting variables 44071 1727204702.73645: in VariableManager get_vars() 44071 1727204702.73997: Calling all_inventory to load vars for managed-node2 44071 1727204702.74001: Calling groups_inventory to load vars for managed-node2 44071 1727204702.74003: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204702.74010: done sending task result for task 127b8e07-fff9-c964-7471-000000001b4c 44071 1727204702.74013: WORKER PROCESS EXITING 44071 1727204702.74025: Calling all_plugins_play to load vars for managed-node2 44071 1727204702.74027: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204702.74030: Calling groups_plugins_play to load vars for managed-node2 44071 1727204702.78085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204702.81514: done with get_vars() 44071 1727204702.81564: done getting variables 44071 1727204702.81640: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:05:02 -0400 (0:00:00.120) 0:01:55.133 ***** 44071 1727204702.81687: entering _queue_task() for managed-node2/debug 44071 1727204702.82145: worker is 1 (out of 1 available) 44071 1727204702.82161: exiting _queue_task() for managed-node2/debug 44071 1727204702.82283: done queuing things up, now waiting for results queue to drain 44071 1727204702.82286: waiting for pending results... 
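The two tasks that run next (tasks/main.yml:177 and :181) only print variables registered by the earlier network_connections call, and their "ok:" output matches what ansible.builtin.debug emits for a var argument. A rough sketch of their shape, for orientation only (the real task bodies live inside the collection and are not reproduced verbatim here):

# Approximation of the role's post-run debug tasks; the variable names
# come from the log, the task bodies are an assumption.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result
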
44071 1727204702.82546: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204702.82742: in run() - task 127b8e07-fff9-c964-7471-000000001b4d 44071 1727204702.82770: variable 'ansible_search_path' from source: unknown 44071 1727204702.82780: variable 'ansible_search_path' from source: unknown 44071 1727204702.82837: calling self._execute() 44071 1727204702.82961: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204702.82978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204702.82995: variable 'omit' from source: magic vars 44071 1727204702.83461: variable 'ansible_distribution_major_version' from source: facts 44071 1727204702.83492: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204702.83510: variable 'omit' from source: magic vars 44071 1727204702.83774: variable 'omit' from source: magic vars 44071 1727204702.83821: variable 'omit' from source: magic vars 44071 1727204702.83873: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204702.83955: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204702.84042: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204702.84371: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204702.84375: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204702.84377: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204702.84381: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204702.84386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204702.84389: Set connection var ansible_connection to ssh 44071 1727204702.84525: Set connection var ansible_timeout to 10 44071 1727204702.84542: Set connection var ansible_pipelining to False 44071 1727204702.84622: Set connection var ansible_shell_type to sh 44071 1727204702.84636: Set connection var ansible_shell_executable to /bin/sh 44071 1727204702.84650: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204702.84685: variable 'ansible_shell_executable' from source: unknown 44071 1727204702.84941: variable 'ansible_connection' from source: unknown 44071 1727204702.84945: variable 'ansible_module_compression' from source: unknown 44071 1727204702.84948: variable 'ansible_shell_type' from source: unknown 44071 1727204702.84950: variable 'ansible_shell_executable' from source: unknown 44071 1727204702.84953: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204702.84955: variable 'ansible_pipelining' from source: unknown 44071 1727204702.84958: variable 'ansible_timeout' from source: unknown 44071 1727204702.84960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204702.85145: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 
1727204702.85286: variable 'omit' from source: magic vars 44071 1727204702.85297: starting attempt loop 44071 1727204702.85304: running the handler 44071 1727204702.85572: variable '__network_connections_result' from source: set_fact 44071 1727204702.85849: handler run complete 44071 1727204702.85953: attempt loop complete, returning result 44071 1727204702.85961: _execute() done 44071 1727204702.85971: dumping result to json 44071 1727204702.85978: done dumping result, returning 44071 1727204702.85990: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-c964-7471-000000001b4d] 44071 1727204702.86072: sending task result for task 127b8e07-fff9-c964-7471-000000001b4d 44071 1727204702.86436: done sending task result for task 127b8e07-fff9-c964-7471-000000001b4d 44071 1727204702.86440: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 44071 1727204702.86532: no more pending results, returning what we have 44071 1727204702.86539: results queue empty 44071 1727204702.86540: checking for any_errors_fatal 44071 1727204702.86547: done checking for any_errors_fatal 44071 1727204702.86548: checking for max_fail_percentage 44071 1727204702.86550: done checking for max_fail_percentage 44071 1727204702.86551: checking to see if all hosts have failed and the running result is not ok 44071 1727204702.86551: done checking to see if all hosts have failed 44071 1727204702.86552: getting the remaining hosts for this loop 44071 1727204702.86554: done getting the remaining hosts for this loop 44071 1727204702.86559: getting the next task for host managed-node2 44071 1727204702.86571: done getting next task for host managed-node2 44071 1727204702.86576: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204702.86582: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204702.86598: getting variables 44071 1727204702.86600: in VariableManager get_vars() 44071 1727204702.86653: Calling all_inventory to load vars for managed-node2 44071 1727204702.86656: Calling groups_inventory to load vars for managed-node2 44071 1727204702.86659: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204702.86777: Calling all_plugins_play to load vars for managed-node2 44071 1727204702.86782: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204702.86786: Calling groups_plugins_play to load vars for managed-node2 44071 1727204702.90546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204702.95816: done with get_vars() 44071 1727204702.95978: done getting variables 44071 1727204702.96048: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:05:02 -0400 (0:00:00.145) 0:01:55.278 ***** 44071 1727204702.96221: entering _queue_task() for managed-node2/debug 44071 1727204702.97046: worker is 1 (out of 1 available) 44071 1727204702.97061: exiting _queue_task() for managed-node2/debug 44071 1727204702.97179: done queuing things up, now waiting for results queue to drain 44071 1727204702.97182: waiting for pending results... 44071 1727204702.97884: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204702.97998: in run() - task 127b8e07-fff9-c964-7471-000000001b4e 44071 1727204702.98272: variable 'ansible_search_path' from source: unknown 44071 1727204702.98277: variable 'ansible_search_path' from source: unknown 44071 1727204702.98286: calling self._execute() 44071 1727204702.98364: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204702.98672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204702.98676: variable 'omit' from source: magic vars 44071 1727204702.99245: variable 'ansible_distribution_major_version' from source: facts 44071 1727204702.99672: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204702.99676: variable 'omit' from source: magic vars 44071 1727204702.99680: variable 'omit' from source: magic vars 44071 1727204702.99682: variable 'omit' from source: magic vars 44071 1727204702.99684: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204702.99899: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204702.99930: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204702.99956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204702.99978: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204703.00016: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204703.00372: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204703.00376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204703.00416: Set connection var ansible_connection to ssh 44071 1727204703.00432: Set connection var ansible_timeout to 10 44071 1727204703.00445: Set connection var ansible_pipelining to False 44071 1727204703.00455: Set connection var ansible_shell_type to sh 44071 1727204703.00465: Set connection var ansible_shell_executable to /bin/sh 44071 1727204703.00481: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204703.00514: variable 'ansible_shell_executable' from source: unknown 44071 1727204703.00521: variable 'ansible_connection' from source: unknown 44071 1727204703.00529: variable 'ansible_module_compression' from source: unknown 44071 1727204703.00536: variable 'ansible_shell_type' from source: unknown 44071 1727204703.00544: variable 'ansible_shell_executable' from source: unknown 44071 1727204703.00550: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204703.00559: variable 'ansible_pipelining' from source: unknown 44071 1727204703.00571: variable 'ansible_timeout' from source: unknown 44071 1727204703.00772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204703.01272: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204703.01276: variable 'omit' from source: magic vars 44071 1727204703.01279: starting attempt loop 44071 1727204703.01281: running the handler 44071 1727204703.01283: variable '__network_connections_result' from source: set_fact 44071 1727204703.01438: variable '__network_connections_result' from source: set_fact 44071 1727204703.01555: handler run complete 44071 1727204703.01726: attempt loop complete, returning result 44071 1727204703.01764: _execute() done 44071 1727204703.01789: dumping result to json 44071 1727204703.01799: done dumping result, returning 44071 1727204703.01836: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-c964-7471-000000001b4e] 44071 1727204703.01858: sending task result for task 127b8e07-fff9-c964-7471-000000001b4e ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 44071 1727204703.02130: no more pending results, returning what we have 44071 1727204703.02136: results queue empty 44071 1727204703.02137: checking for any_errors_fatal 44071 1727204703.02145: done checking for any_errors_fatal 44071 1727204703.02145: checking for max_fail_percentage 44071 1727204703.02147: done checking for max_fail_percentage 44071 1727204703.02148: checking to see if all hosts have failed and the running result is not ok 44071 1727204703.02149: done checking to see if all hosts have failed 44071 
1727204703.02149: getting the remaining hosts for this loop 44071 1727204703.02151: done getting the remaining hosts for this loop 44071 1727204703.02155: getting the next task for host managed-node2 44071 1727204703.02164: done getting next task for host managed-node2 44071 1727204703.02170: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204703.02178: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204703.02196: getting variables 44071 1727204703.02198: in VariableManager get_vars() 44071 1727204703.02255: Calling all_inventory to load vars for managed-node2 44071 1727204703.02258: Calling groups_inventory to load vars for managed-node2 44071 1727204703.02260: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204703.02577: Calling all_plugins_play to load vars for managed-node2 44071 1727204703.02581: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204703.02588: done sending task result for task 127b8e07-fff9-c964-7471-000000001b4e 44071 1727204703.02591: WORKER PROCESS EXITING 44071 1727204703.02596: Calling groups_plugins_play to load vars for managed-node2 44071 1727204703.06262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204703.10930: done with get_vars() 44071 1727204703.10979: done getting variables 44071 1727204703.11050: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:05:03 -0400 (0:00:00.150) 0:01:55.429 ***** 44071 1727204703.11299: entering _queue_task() for managed-node2/debug 44071 1727204703.12047: worker is 1 (out of 1 available) 44071 1727204703.12063: exiting _queue_task() for managed-node2/debug 44071 1727204703.12179: done queuing things up, now waiting for results queue to drain 44071 1727204703.12181: waiting for pending results... 
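After the network_state debug task is skipped on the same network_state != {} condition, the role finishes with "Re-test connectivity" (tasks/main.yml:192). The entries below show that as a full AnsiballZ round trip for the ping module: create a remote temp dir, sftp the wrapped module over, chmod it, run it with /usr/bin/python3.12 (returning {"ping": "pong"}), then delete the temp dir. A standalone equivalent of that final task is simply the sketch below; note that ansible.builtin.ping checks that a usable Python is reachable on the managed host rather than sending ICMP.

# Hedged sketch of the role's closing connectivity check (main.yml:192).
- name: Re-test connectivity
  ansible.builtin.ping:
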
44071 1727204703.12353: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204703.12538: in run() - task 127b8e07-fff9-c964-7471-000000001b4f 44071 1727204703.12571: variable 'ansible_search_path' from source: unknown 44071 1727204703.12581: variable 'ansible_search_path' from source: unknown 44071 1727204703.12638: calling self._execute() 44071 1727204703.12774: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204703.12787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204703.12802: variable 'omit' from source: magic vars 44071 1727204703.13266: variable 'ansible_distribution_major_version' from source: facts 44071 1727204703.13293: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204703.13452: variable 'network_state' from source: role '' defaults 44071 1727204703.13470: Evaluated conditional (network_state != {}): False 44071 1727204703.13478: when evaluation is False, skipping this task 44071 1727204703.13486: _execute() done 44071 1727204703.13495: dumping result to json 44071 1727204703.13505: done dumping result, returning 44071 1727204703.13517: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-c964-7471-000000001b4f] 44071 1727204703.13531: sending task result for task 127b8e07-fff9-c964-7471-000000001b4f 44071 1727204703.13769: done sending task result for task 127b8e07-fff9-c964-7471-000000001b4f 44071 1727204703.13773: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 44071 1727204703.13828: no more pending results, returning what we have 44071 1727204703.13837: results queue empty 44071 1727204703.13838: checking for any_errors_fatal 44071 1727204703.13856: done checking for any_errors_fatal 44071 1727204703.13858: checking for max_fail_percentage 44071 1727204703.13859: done checking for max_fail_percentage 44071 1727204703.13861: checking to see if all hosts have failed and the running result is not ok 44071 1727204703.13862: done checking to see if all hosts have failed 44071 1727204703.13862: getting the remaining hosts for this loop 44071 1727204703.13864: done getting the remaining hosts for this loop 44071 1727204703.13872: getting the next task for host managed-node2 44071 1727204703.13884: done getting next task for host managed-node2 44071 1727204703.13889: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204703.13896: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204703.13936: getting variables 44071 1727204703.13939: in VariableManager get_vars() 44071 1727204703.14109: Calling all_inventory to load vars for managed-node2 44071 1727204703.14112: Calling groups_inventory to load vars for managed-node2 44071 1727204703.14115: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204703.14130: Calling all_plugins_play to load vars for managed-node2 44071 1727204703.14135: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204703.14139: Calling groups_plugins_play to load vars for managed-node2 44071 1727204703.16405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204703.20969: done with get_vars() 44071 1727204703.21017: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:05:03 -0400 (0:00:00.102) 0:01:55.532 ***** 44071 1727204703.21560: entering _queue_task() for managed-node2/ping 44071 1727204703.22417: worker is 1 (out of 1 available) 44071 1727204703.22436: exiting _queue_task() for managed-node2/ping 44071 1727204703.22451: done queuing things up, now waiting for results queue to drain 44071 1727204703.22453: waiting for pending results... 44071 1727204703.23218: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204703.23529: in run() - task 127b8e07-fff9-c964-7471-000000001b50 44071 1727204703.23536: variable 'ansible_search_path' from source: unknown 44071 1727204703.23539: variable 'ansible_search_path' from source: unknown 44071 1727204703.23672: calling self._execute() 44071 1727204703.23972: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204703.23976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204703.23979: variable 'omit' from source: magic vars 44071 1727204703.24890: variable 'ansible_distribution_major_version' from source: facts 44071 1727204703.24912: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204703.24924: variable 'omit' from source: magic vars 44071 1727204703.25172: variable 'omit' from source: magic vars 44071 1727204703.25176: variable 'omit' from source: magic vars 44071 1727204703.25320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204703.25370: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204703.25442: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204703.25672: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204703.25676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204703.25679: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204703.25681: variable 'ansible_host' from source: host vars for 
'managed-node2' 44071 1727204703.25684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204703.25780: Set connection var ansible_connection to ssh 44071 1727204703.25907: Set connection var ansible_timeout to 10 44071 1727204703.25918: Set connection var ansible_pipelining to False 44071 1727204703.25927: Set connection var ansible_shell_type to sh 44071 1727204703.25939: Set connection var ansible_shell_executable to /bin/sh 44071 1727204703.26071: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204703.26074: variable 'ansible_shell_executable' from source: unknown 44071 1727204703.26077: variable 'ansible_connection' from source: unknown 44071 1727204703.26079: variable 'ansible_module_compression' from source: unknown 44071 1727204703.26081: variable 'ansible_shell_type' from source: unknown 44071 1727204703.26084: variable 'ansible_shell_executable' from source: unknown 44071 1727204703.26086: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204703.26088: variable 'ansible_pipelining' from source: unknown 44071 1727204703.26090: variable 'ansible_timeout' from source: unknown 44071 1727204703.26113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204703.26772: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204703.26777: variable 'omit' from source: magic vars 44071 1727204703.26779: starting attempt loop 44071 1727204703.26782: running the handler 44071 1727204703.26784: _low_level_execute_command(): starting 44071 1727204703.26786: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204703.28395: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204703.28700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204703.28724: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204703.28745: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204703.28777: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204703.29023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204703.30804: stdout chunk (state=3): >>>/root <<< 44071 1727204703.30971: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204703.31117: stderr chunk (state=3): 
>>><<< 44071 1727204703.31190: stdout chunk (state=3): >>><<< 44071 1727204703.31213: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204703.31240: _low_level_execute_command(): starting 44071 1727204703.31282: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204703.3122303-50724-36640861156269 `" && echo ansible-tmp-1727204703.3122303-50724-36640861156269="` echo /root/.ansible/tmp/ansible-tmp-1727204703.3122303-50724-36640861156269 `" ) && sleep 0' 44071 1727204703.32590: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204703.32868: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204703.32873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204703.33006: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204703.33089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204703.35093: stdout chunk (state=3): >>>ansible-tmp-1727204703.3122303-50724-36640861156269=/root/.ansible/tmp/ansible-tmp-1727204703.3122303-50724-36640861156269 <<< 44071 1727204703.35357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204703.35364: stderr chunk (state=3): >>><<< 44071 1727204703.35370: stdout chunk (state=3): >>><<< 44071 
1727204703.35373: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204703.3122303-50724-36640861156269=/root/.ansible/tmp/ansible-tmp-1727204703.3122303-50724-36640861156269 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204703.35674: variable 'ansible_module_compression' from source: unknown 44071 1727204703.35678: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 44071 1727204703.35680: variable 'ansible_facts' from source: unknown 44071 1727204703.35767: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204703.3122303-50724-36640861156269/AnsiballZ_ping.py 44071 1727204703.36203: Sending initial data 44071 1727204703.36282: Sent initial data (152 bytes) 44071 1727204703.37640: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204703.37785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204703.37808: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204703.37940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204703.39699: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server 
supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204703.39785: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204703.39894: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmperibc60m /root/.ansible/tmp/ansible-tmp-1727204703.3122303-50724-36640861156269/AnsiballZ_ping.py <<< 44071 1727204703.39906: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204703.3122303-50724-36640861156269/AnsiballZ_ping.py" <<< 44071 1727204703.40006: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmperibc60m" to remote "/root/.ansible/tmp/ansible-tmp-1727204703.3122303-50724-36640861156269/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204703.3122303-50724-36640861156269/AnsiballZ_ping.py" <<< 44071 1727204703.42171: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204703.42176: stdout chunk (state=3): >>><<< 44071 1727204703.42179: stderr chunk (state=3): >>><<< 44071 1727204703.42181: done transferring module to remote 44071 1727204703.42183: _low_level_execute_command(): starting 44071 1727204703.42186: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204703.3122303-50724-36640861156269/ /root/.ansible/tmp/ansible-tmp-1727204703.3122303-50724-36640861156269/AnsiballZ_ping.py && sleep 0' 44071 1727204703.43709: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204703.43727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204703.43937: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204703.43941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204703.45976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204703.45980: stdout chunk (state=3): >>><<< 44071 1727204703.45983: stderr chunk (state=3): 
>>><<< 44071 1727204703.46193: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204703.46204: _low_level_execute_command(): starting 44071 1727204703.46207: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204703.3122303-50724-36640861156269/AnsiballZ_ping.py && sleep 0' 44071 1727204703.47356: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204703.47687: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204703.47711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204703.47975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204703.64336: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 44071 1727204703.65828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204703.65836: stdout chunk (state=3): >>><<< 44071 1727204703.65839: stderr chunk (state=3): >>><<< 44071 1727204703.65859: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204703.65889: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204703.3122303-50724-36640861156269/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204703.65898: _low_level_execute_command(): starting 44071 1727204703.65904: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204703.3122303-50724-36640861156269/ > /dev/null 2>&1 && sleep 0' 44071 1727204703.67384: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204703.67489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204703.67501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204703.67524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204703.67539: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204703.67543: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204703.67545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204703.67644: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204703.67647: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204703.67649: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44071 
1727204703.67651: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204703.68096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204703.69875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204703.69949: stderr chunk (state=3): >>><<< 44071 1727204703.69959: stdout chunk (state=3): >>><<< 44071 1727204703.69986: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204703.69998: handler run complete 44071 1727204703.70020: attempt loop complete, returning result 44071 1727204703.70056: _execute() done 44071 1727204703.70064: dumping result to json 44071 1727204703.70274: done dumping result, returning 44071 1727204703.70277: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-c964-7471-000000001b50] 44071 1727204703.70280: sending task result for task 127b8e07-fff9-c964-7471-000000001b50 44071 1727204703.70360: done sending task result for task 127b8e07-fff9-c964-7471-000000001b50 44071 1727204703.70370: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 44071 1727204703.70467: no more pending results, returning what we have 44071 1727204703.70472: results queue empty 44071 1727204703.70473: checking for any_errors_fatal 44071 1727204703.70481: done checking for any_errors_fatal 44071 1727204703.70482: checking for max_fail_percentage 44071 1727204703.70483: done checking for max_fail_percentage 44071 1727204703.70484: checking to see if all hosts have failed and the running result is not ok 44071 1727204703.70485: done checking to see if all hosts have failed 44071 1727204703.70486: getting the remaining hosts for this loop 44071 1727204703.70487: done getting the remaining hosts for this loop 44071 1727204703.70492: getting the next task for 
host managed-node2 44071 1727204703.70506: done getting next task for host managed-node2 44071 1727204703.70510: ^ task is: TASK: meta (role_complete) 44071 1727204703.70516: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204703.70535: getting variables 44071 1727204703.70537: in VariableManager get_vars() 44071 1727204703.70996: Calling all_inventory to load vars for managed-node2 44071 1727204703.71000: Calling groups_inventory to load vars for managed-node2 44071 1727204703.71002: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204703.71015: Calling all_plugins_play to load vars for managed-node2 44071 1727204703.71019: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204703.71022: Calling groups_plugins_play to load vars for managed-node2 44071 1727204703.74872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204703.78994: done with get_vars() 44071 1727204703.79028: done getting variables 44071 1727204703.79139: done queuing things up, now waiting for results queue to drain 44071 1727204703.79141: results queue empty 44071 1727204703.79142: checking for any_errors_fatal 44071 1727204703.79145: done checking for any_errors_fatal 44071 1727204703.79146: checking for max_fail_percentage 44071 1727204703.79147: done checking for max_fail_percentage 44071 1727204703.79148: checking to see if all hosts have failed and the running result is not ok 44071 1727204703.79149: done checking to see if all hosts have failed 44071 1727204703.79150: getting the remaining hosts for this loop 44071 1727204703.79151: done getting the remaining hosts for this loop 44071 1727204703.79153: getting the next task for host managed-node2 44071 1727204703.79160: done getting next task for host managed-node2 44071 1727204703.79162: ^ task is: TASK: Test 44071 1727204703.79165: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 44071 1727204703.79171: getting variables 44071 1727204703.79172: in VariableManager get_vars() 44071 1727204703.79186: Calling all_inventory to load vars for managed-node2 44071 1727204703.79188: Calling groups_inventory to load vars for managed-node2 44071 1727204703.79191: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204703.79197: Calling all_plugins_play to load vars for managed-node2 44071 1727204703.79199: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204703.79206: Calling groups_plugins_play to load vars for managed-node2 44071 1727204703.81012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204703.84278: done with get_vars() 44071 1727204703.84325: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Tuesday 24 September 2024 15:05:03 -0400 (0:00:00.628) 0:01:56.160 ***** 44071 1727204703.84432: entering _queue_task() for managed-node2/include_tasks 44071 1727204703.84878: worker is 1 (out of 1 available) 44071 1727204703.84898: exiting _queue_task() for managed-node2/include_tasks 44071 1727204703.84915: done queuing things up, now waiting for results queue to drain 44071 1727204703.84917: waiting for pending results... 44071 1727204703.85191: running TaskExecutor() for managed-node2/TASK: Test 44071 1727204703.85343: in run() - task 127b8e07-fff9-c964-7471-000000001748 44071 1727204703.85397: variable 'ansible_search_path' from source: unknown 44071 1727204703.85401: variable 'ansible_search_path' from source: unknown 44071 1727204703.85435: variable 'lsr_test' from source: include params 44071 1727204703.86040: variable 'lsr_test' from source: include params 44071 1727204703.86046: variable 'omit' from source: magic vars 44071 1727204703.86210: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204703.86229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204703.86247: variable 'omit' from source: magic vars 44071 1727204703.86548: variable 'ansible_distribution_major_version' from source: facts 44071 1727204703.86569: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204703.86582: variable 'item' from source: unknown 44071 1727204703.86659: variable 'item' from source: unknown 44071 1727204703.86703: variable 'item' from source: unknown 44071 1727204703.86779: variable 'item' from source: unknown 44071 1727204703.87075: dumping result to json 44071 1727204703.87078: done dumping result, returning 44071 1727204703.87081: done running TaskExecutor() for managed-node2/TASK: Test [127b8e07-fff9-c964-7471-000000001748] 44071 1727204703.87083: sending task result for task 127b8e07-fff9-c964-7471-000000001748 44071 1727204703.87135: done sending task result for task 127b8e07-fff9-c964-7471-000000001748 44071 1727204703.87138: WORKER PROCESS EXITING 44071 1727204703.87204: no more pending results, returning what we have 44071 1727204703.87211: in VariableManager get_vars() 44071 1727204703.87272: Calling all_inventory to load vars for managed-node2 44071 1727204703.87276: Calling groups_inventory to load vars for managed-node2 44071 1727204703.87280: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204703.87298: Calling all_plugins_play to load 
vars for managed-node2 44071 1727204703.87302: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204703.87305: Calling groups_plugins_play to load vars for managed-node2 44071 1727204703.89698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204703.93267: done with get_vars() 44071 1727204703.93315: variable 'ansible_search_path' from source: unknown 44071 1727204703.93317: variable 'ansible_search_path' from source: unknown 44071 1727204703.93370: we have included files to process 44071 1727204703.93371: generating all_blocks data 44071 1727204703.93374: done generating all_blocks data 44071 1727204703.93380: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 44071 1727204703.93382: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 44071 1727204703.93385: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 44071 1727204703.93631: done processing included file 44071 1727204703.93633: iterating over new_blocks loaded from include file 44071 1727204703.93635: in VariableManager get_vars() 44071 1727204703.93657: done with get_vars() 44071 1727204703.93660: filtering new block on tags 44071 1727204703.93692: done filtering new block on tags 44071 1727204703.93695: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml for managed-node2 => (item=tasks/remove+down_profile.yml) 44071 1727204703.93701: extending task lists for all hosts with included blocks 44071 1727204703.95455: done extending task lists 44071 1727204703.95457: done processing included files 44071 1727204703.95458: results queue empty 44071 1727204703.95459: checking for any_errors_fatal 44071 1727204703.95461: done checking for any_errors_fatal 44071 1727204703.95461: checking for max_fail_percentage 44071 1727204703.95463: done checking for max_fail_percentage 44071 1727204703.95464: checking to see if all hosts have failed and the running result is not ok 44071 1727204703.95464: done checking to see if all hosts have failed 44071 1727204703.95468: getting the remaining hosts for this loop 44071 1727204703.95469: done getting the remaining hosts for this loop 44071 1727204703.95472: getting the next task for host managed-node2 44071 1727204703.95478: done getting next task for host managed-node2 44071 1727204703.95480: ^ task is: TASK: Include network role 44071 1727204703.95484: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204703.95486: getting variables 44071 1727204703.95488: in VariableManager get_vars() 44071 1727204703.95506: Calling all_inventory to load vars for managed-node2 44071 1727204703.95508: Calling groups_inventory to load vars for managed-node2 44071 1727204703.95511: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204703.95518: Calling all_plugins_play to load vars for managed-node2 44071 1727204703.95521: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204703.95524: Calling groups_plugins_play to load vars for managed-node2 44071 1727204703.97453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204703.99371: done with get_vars() 44071 1727204703.99403: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml:3 Tuesday 24 September 2024 15:05:03 -0400 (0:00:00.150) 0:01:56.311 ***** 44071 1727204703.99508: entering _queue_task() for managed-node2/include_role 44071 1727204703.99933: worker is 1 (out of 1 available) 44071 1727204703.99946: exiting _queue_task() for managed-node2/include_role 44071 1727204703.99962: done queuing things up, now waiting for results queue to drain 44071 1727204703.99964: waiting for pending results... 44071 1727204704.00388: running TaskExecutor() for managed-node2/TASK: Include network role 44071 1727204704.00454: in run() - task 127b8e07-fff9-c964-7471-000000001ca9 44071 1727204704.00483: variable 'ansible_search_path' from source: unknown 44071 1727204704.00492: variable 'ansible_search_path' from source: unknown 44071 1727204704.00546: calling self._execute() 44071 1727204704.00663: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204704.00682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204704.00692: variable 'omit' from source: magic vars 44071 1727204704.01049: variable 'ansible_distribution_major_version' from source: facts 44071 1727204704.01053: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204704.01057: _execute() done 44071 1727204704.01063: dumping result to json 44071 1727204704.01068: done dumping result, returning 44071 1727204704.01074: done running TaskExecutor() for managed-node2/TASK: Include network role [127b8e07-fff9-c964-7471-000000001ca9] 44071 1727204704.01079: sending task result for task 127b8e07-fff9-c964-7471-000000001ca9 44071 1727204704.01228: done sending task result for task 127b8e07-fff9-c964-7471-000000001ca9 44071 1727204704.01232: WORKER PROCESS EXITING 44071 1727204704.01304: no more pending results, returning what we have 44071 1727204704.01309: in VariableManager get_vars() 44071 1727204704.01360: Calling all_inventory to load vars for managed-node2 44071 1727204704.01363: Calling groups_inventory to load vars for managed-node2 44071 1727204704.01369: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204704.01383: Calling all_plugins_play to load vars for managed-node2 44071 1727204704.01385: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204704.01388: Calling groups_plugins_play to load vars for managed-node2 44071 1727204704.03229: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204704.04474: done with get_vars() 44071 1727204704.04507: variable 'ansible_search_path' from source: unknown 44071 1727204704.04508: variable 'ansible_search_path' from source: unknown 44071 1727204704.04613: variable 'omit' from source: magic vars 44071 1727204704.04645: variable 'omit' from source: magic vars 44071 1727204704.04656: variable 'omit' from source: magic vars 44071 1727204704.04658: we have included files to process 44071 1727204704.04659: generating all_blocks data 44071 1727204704.04660: done generating all_blocks data 44071 1727204704.04662: processing included file: fedora.linux_system_roles.network 44071 1727204704.04679: in VariableManager get_vars() 44071 1727204704.04697: done with get_vars() 44071 1727204704.04726: in VariableManager get_vars() 44071 1727204704.04750: done with get_vars() 44071 1727204704.04794: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 44071 1727204704.04916: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 44071 1727204704.05004: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 44071 1727204704.05540: in VariableManager get_vars() 44071 1727204704.05562: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204704.07443: iterating over new_blocks loaded from include file 44071 1727204704.07445: in VariableManager get_vars() 44071 1727204704.07462: done with get_vars() 44071 1727204704.07463: filtering new block on tags 44071 1727204704.07712: done filtering new block on tags 44071 1727204704.07716: in VariableManager get_vars() 44071 1727204704.07728: done with get_vars() 44071 1727204704.07729: filtering new block on tags 44071 1727204704.07742: done filtering new block on tags 44071 1727204704.07744: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 44071 1727204704.07748: extending task lists for all hosts with included blocks 44071 1727204704.07825: done extending task lists 44071 1727204704.07826: done processing included files 44071 1727204704.07827: results queue empty 44071 1727204704.07827: checking for any_errors_fatal 44071 1727204704.07830: done checking for any_errors_fatal 44071 1727204704.07831: checking for max_fail_percentage 44071 1727204704.07831: done checking for max_fail_percentage 44071 1727204704.07832: checking to see if all hosts have failed and the running result is not ok 44071 1727204704.07833: done checking to see if all hosts have failed 44071 1727204704.07833: getting the remaining hosts for this loop 44071 1727204704.07835: done getting the remaining hosts for this loop 44071 1727204704.07837: getting the next task for host managed-node2 44071 1727204704.07840: done getting next task for host managed-node2 44071 1727204704.07842: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204704.07844: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204704.07853: getting variables 44071 1727204704.07854: in VariableManager get_vars() 44071 1727204704.07864: Calling all_inventory to load vars for managed-node2 44071 1727204704.07868: Calling groups_inventory to load vars for managed-node2 44071 1727204704.07869: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204704.07874: Calling all_plugins_play to load vars for managed-node2 44071 1727204704.07876: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204704.07877: Calling groups_plugins_play to load vars for managed-node2 44071 1727204704.08790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204704.10568: done with get_vars() 44071 1727204704.10601: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:05:04 -0400 (0:00:00.111) 0:01:56.423 ***** 44071 1727204704.10673: entering _queue_task() for managed-node2/include_tasks 44071 1727204704.10983: worker is 1 (out of 1 available) 44071 1727204704.11000: exiting _queue_task() for managed-node2/include_tasks 44071 1727204704.11015: done queuing things up, now waiting for results queue to drain 44071 1727204704.11016: waiting for pending results... 
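[Editor's note] The trace above repeatedly logs "Evaluated conditional (ansible_distribution_major_version != '6'): True" before each task is allowed to proceed. As a rough, hedged illustration of what such an evaluation amounts to (this is not Ansible's actual conditional code path), the sketch below renders the same bare Jinja2 expression against a made-up facts dictionary; the fact value "9" is an assumption for the example only.

```python
# Hedged sketch: evaluating a `when:`-style expression with plain Jinja2.
# NOT Ansible's real implementation, just the general idea behind the
# "Evaluated conditional (...)" lines in this log.
import jinja2

facts = {"ansible_distribution_major_version": "9"}  # hypothetical fact value

def evaluate_conditional(expr: str, variables: dict) -> bool:
    # Wrap the bare expression in {{ ... }}, render it, and read the
    # rendered text back as a boolean.
    env = jinja2.Environment()
    rendered = env.from_string("{{ " + expr + " }}").render(**variables)
    return rendered == "True"

print(evaluate_conditional("ansible_distribution_major_version != '6'", facts))
# True -> the task is not skipped, matching the log above
```

With a fact value of "6" the same call would return False and the task would be skipped, which is the other branch visible elsewhere in this trace.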
44071 1727204704.11233: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204704.11343: in run() - task 127b8e07-fff9-c964-7471-000000001d2b 44071 1727204704.11358: variable 'ansible_search_path' from source: unknown 44071 1727204704.11363: variable 'ansible_search_path' from source: unknown 44071 1727204704.11403: calling self._execute() 44071 1727204704.11498: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204704.11504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204704.11513: variable 'omit' from source: magic vars 44071 1727204704.11857: variable 'ansible_distribution_major_version' from source: facts 44071 1727204704.11869: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204704.11877: _execute() done 44071 1727204704.11881: dumping result to json 44071 1727204704.11884: done dumping result, returning 44071 1727204704.11893: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-c964-7471-000000001d2b] 44071 1727204704.11897: sending task result for task 127b8e07-fff9-c964-7471-000000001d2b 44071 1727204704.12004: done sending task result for task 127b8e07-fff9-c964-7471-000000001d2b 44071 1727204704.12009: WORKER PROCESS EXITING 44071 1727204704.12075: no more pending results, returning what we have 44071 1727204704.12082: in VariableManager get_vars() 44071 1727204704.12140: Calling all_inventory to load vars for managed-node2 44071 1727204704.12143: Calling groups_inventory to load vars for managed-node2 44071 1727204704.12146: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204704.12159: Calling all_plugins_play to load vars for managed-node2 44071 1727204704.12162: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204704.12172: Calling groups_plugins_play to load vars for managed-node2 44071 1727204704.13403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204704.14625: done with get_vars() 44071 1727204704.14656: variable 'ansible_search_path' from source: unknown 44071 1727204704.14657: variable 'ansible_search_path' from source: unknown 44071 1727204704.14694: we have included files to process 44071 1727204704.14695: generating all_blocks data 44071 1727204704.14696: done generating all_blocks data 44071 1727204704.14699: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204704.14700: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204704.14701: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204704.15138: done processing included file 44071 1727204704.15140: iterating over new_blocks loaded from include file 44071 1727204704.15141: in VariableManager get_vars() 44071 1727204704.15163: done with get_vars() 44071 1727204704.15164: filtering new block on tags 44071 1727204704.15189: done filtering new block on tags 44071 1727204704.15192: in VariableManager get_vars() 44071 1727204704.15207: done with get_vars() 44071 1727204704.15208: filtering new block on tags 44071 1727204704.15243: done filtering new block on tags 44071 1727204704.15245: in 
VariableManager get_vars() 44071 1727204704.15262: done with get_vars() 44071 1727204704.15263: filtering new block on tags 44071 1727204704.15291: done filtering new block on tags 44071 1727204704.15292: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 44071 1727204704.15297: extending task lists for all hosts with included blocks 44071 1727204704.16437: done extending task lists 44071 1727204704.16439: done processing included files 44071 1727204704.16440: results queue empty 44071 1727204704.16440: checking for any_errors_fatal 44071 1727204704.16443: done checking for any_errors_fatal 44071 1727204704.16443: checking for max_fail_percentage 44071 1727204704.16444: done checking for max_fail_percentage 44071 1727204704.16445: checking to see if all hosts have failed and the running result is not ok 44071 1727204704.16445: done checking to see if all hosts have failed 44071 1727204704.16446: getting the remaining hosts for this loop 44071 1727204704.16447: done getting the remaining hosts for this loop 44071 1727204704.16449: getting the next task for host managed-node2 44071 1727204704.16453: done getting next task for host managed-node2 44071 1727204704.16455: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204704.16459: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204704.16470: getting variables 44071 1727204704.16471: in VariableManager get_vars() 44071 1727204704.16487: Calling all_inventory to load vars for managed-node2 44071 1727204704.16488: Calling groups_inventory to load vars for managed-node2 44071 1727204704.16490: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204704.16494: Calling all_plugins_play to load vars for managed-node2 44071 1727204704.16496: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204704.16498: Calling groups_plugins_play to load vars for managed-node2 44071 1727204704.17453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204704.18659: done with get_vars() 44071 1727204704.18693: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:05:04 -0400 (0:00:00.080) 0:01:56.504 ***** 44071 1727204704.18762: entering _queue_task() for managed-node2/setup 44071 1727204704.19073: worker is 1 (out of 1 available) 44071 1727204704.19089: exiting _queue_task() for managed-node2/setup 44071 1727204704.19104: done queuing things up, now waiting for results queue to drain 44071 1727204704.19106: waiting for pending results... 44071 1727204704.19325: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204704.19449: in run() - task 127b8e07-fff9-c964-7471-000000001d82 44071 1727204704.19464: variable 'ansible_search_path' from source: unknown 44071 1727204704.19469: variable 'ansible_search_path' from source: unknown 44071 1727204704.19505: calling self._execute() 44071 1727204704.19588: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204704.19594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204704.19603: variable 'omit' from source: magic vars 44071 1727204704.19923: variable 'ansible_distribution_major_version' from source: facts 44071 1727204704.19933: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204704.20105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204704.21832: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204704.21886: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204704.21917: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204704.21947: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204704.21970: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204704.22043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204704.22067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 44071 1727204704.22091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204704.22123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204704.22139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204704.22186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204704.22203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204704.22223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204704.22254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204704.22265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204704.22396: variable '__network_required_facts' from source: role '' defaults 44071 1727204704.22405: variable 'ansible_facts' from source: unknown 44071 1727204704.23040: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 44071 1727204704.23045: when evaluation is False, skipping this task 44071 1727204704.23048: _execute() done 44071 1727204704.23051: dumping result to json 44071 1727204704.23053: done dumping result, returning 44071 1727204704.23059: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-c964-7471-000000001d82] 44071 1727204704.23063: sending task result for task 127b8e07-fff9-c964-7471-000000001d82 44071 1727204704.23171: done sending task result for task 127b8e07-fff9-c964-7471-000000001d82 44071 1727204704.23174: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204704.23227: no more pending results, returning what we have 44071 1727204704.23231: results queue empty 44071 1727204704.23232: checking for any_errors_fatal 44071 1727204704.23236: done checking for any_errors_fatal 44071 1727204704.23237: checking for max_fail_percentage 44071 1727204704.23238: done checking for max_fail_percentage 44071 1727204704.23239: checking to see if all hosts have failed and the running result is not ok 44071 1727204704.23240: done checking to see if all hosts have failed 44071 1727204704.23241: getting the remaining hosts for this loop 44071 1727204704.23242: done getting the remaining hosts for 
this loop 44071 1727204704.23248: getting the next task for host managed-node2 44071 1727204704.23261: done getting next task for host managed-node2 44071 1727204704.23267: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204704.23273: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204704.23307: getting variables 44071 1727204704.23309: in VariableManager get_vars() 44071 1727204704.23358: Calling all_inventory to load vars for managed-node2 44071 1727204704.23361: Calling groups_inventory to load vars for managed-node2 44071 1727204704.23363: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204704.23376: Calling all_plugins_play to load vars for managed-node2 44071 1727204704.23379: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204704.23389: Calling groups_plugins_play to load vars for managed-node2 44071 1727204704.24487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204704.25698: done with get_vars() 44071 1727204704.25728: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:05:04 -0400 (0:00:00.070) 0:01:56.574 ***** 44071 1727204704.25818: entering _queue_task() for managed-node2/stat 44071 1727204704.26118: worker is 1 (out of 1 available) 44071 1727204704.26134: exiting _queue_task() for managed-node2/stat 44071 1727204704.26149: done queuing things up, now waiting for results queue to drain 44071 1727204704.26151: waiting for pending results... 
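[Editor's note] The skip reported just above for "Ensure ansible_facts used by role are present" hinges on the logged expression __network_required_facts | difference(ansible_facts.keys() | list) | length > 0: facts are only re-gathered when something the role needs is missing. A plain-Python sketch of that decision follows; the fact names are invented for illustration and are not the role's real __network_required_facts list.

```python
# Plain-Python sketch of the skip decision logged above. The fact names are
# hypothetical placeholders, not the role's actual required-facts list.
required_facts = ["distribution", "distribution_major_version"]  # hypothetical
ansible_facts = {"distribution": "RedHat", "distribution_major_version": "9"}

missing = [f for f in required_facts if f not in ansible_facts]  # ~ difference()
if len(missing) > 0:
    print("would run setup to gather:", missing)
else:
    print("all required facts already present; task skipped")  # matches the log
```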
44071 1727204704.26368: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204704.26497: in run() - task 127b8e07-fff9-c964-7471-000000001d84 44071 1727204704.26512: variable 'ansible_search_path' from source: unknown 44071 1727204704.26515: variable 'ansible_search_path' from source: unknown 44071 1727204704.26553: calling self._execute() 44071 1727204704.26646: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204704.26650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204704.26660: variable 'omit' from source: magic vars 44071 1727204704.26981: variable 'ansible_distribution_major_version' from source: facts 44071 1727204704.26992: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204704.27125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204704.27378: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204704.27414: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204704.27443: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204704.27469: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204704.27544: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204704.27562: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204704.27587: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204704.27606: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204704.27684: variable '__network_is_ostree' from source: set_fact 44071 1727204704.27694: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204704.27699: when evaluation is False, skipping this task 44071 1727204704.27702: _execute() done 44071 1727204704.27705: dumping result to json 44071 1727204704.27707: done dumping result, returning 44071 1727204704.27715: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-c964-7471-000000001d84] 44071 1727204704.27717: sending task result for task 127b8e07-fff9-c964-7471-000000001d84 skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204704.27881: no more pending results, returning what we have 44071 1727204704.27885: results queue empty 44071 1727204704.27886: checking for any_errors_fatal 44071 1727204704.27896: done checking for any_errors_fatal 44071 1727204704.27896: checking for max_fail_percentage 44071 1727204704.27898: done checking for max_fail_percentage 44071 1727204704.27899: checking to see if all hosts have 
failed and the running result is not ok 44071 1727204704.27900: done checking to see if all hosts have failed 44071 1727204704.27900: getting the remaining hosts for this loop 44071 1727204704.27902: done getting the remaining hosts for this loop 44071 1727204704.27907: getting the next task for host managed-node2 44071 1727204704.27917: done getting next task for host managed-node2 44071 1727204704.27921: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204704.27929: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204704.27954: getting variables 44071 1727204704.27956: in VariableManager get_vars() 44071 1727204704.28005: Calling all_inventory to load vars for managed-node2 44071 1727204704.28007: Calling groups_inventory to load vars for managed-node2 44071 1727204704.28009: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204704.28021: Calling all_plugins_play to load vars for managed-node2 44071 1727204704.28024: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204704.28027: Calling groups_plugins_play to load vars for managed-node2 44071 1727204704.28771: done sending task result for task 127b8e07-fff9-c964-7471-000000001d84 44071 1727204704.28775: WORKER PROCESS EXITING 44071 1727204704.35600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204704.37812: done with get_vars() 44071 1727204704.37854: done getting variables 44071 1727204704.37912: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:05:04 -0400 (0:00:00.121) 0:01:56.695 ***** 44071 1727204704.37950: entering _queue_task() for managed-node2/set_fact 44071 1727204704.38356: worker is 1 (out of 1 available) 44071 1727204704.38372: exiting _queue_task() for managed-node2/set_fact 44071 1727204704.38389: done queuing things up, now waiting for results queue to drain 44071 1727204704.38391: waiting for pending results... 
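[Editor's note] The "Check if system is ostree" task above is skipped because "not __network_is_ostree is defined" evaluates to False, i.e. the flag was already set on an earlier role run, and the companion set_fact task that follows skips for the same reason. The sketch below is a hedged, self-contained model of that cache-the-answer pattern; the marker path is an assumption for illustration, not taken from this log.

```python
# Sketch of an idempotent "is this an ostree system?" check: stat a marker
# path once, remember the result as a per-host fact, and skip on later runs.
# The marker path below is an illustrative assumption.
import os

host_facts = {}  # stand-in for per-host set_fact storage

def ensure_ostree_flag(facts: dict) -> bool:
    if "__network_is_ostree" in facts:       # "not ... is defined" -> False
        return facts["__network_is_ostree"]  # both tasks would be skipped
    facts["__network_is_ostree"] = os.path.exists("/run/ostree-booted")
    return facts["__network_is_ostree"]

ensure_ostree_flag(host_facts)  # first call: checks the path, sets the fact
ensure_ostree_flag(host_facts)  # later calls: fact already defined, as here
```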
44071 1727204704.38750: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204704.38941: in run() - task 127b8e07-fff9-c964-7471-000000001d85 44071 1727204704.39174: variable 'ansible_search_path' from source: unknown 44071 1727204704.39180: variable 'ansible_search_path' from source: unknown 44071 1727204704.39184: calling self._execute() 44071 1727204704.39188: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204704.39194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204704.39197: variable 'omit' from source: magic vars 44071 1727204704.39716: variable 'ansible_distribution_major_version' from source: facts 44071 1727204704.39729: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204704.39941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204704.40272: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204704.40335: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204704.40447: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204704.40488: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204704.40595: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204704.40629: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204704.40657: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204704.40687: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204704.40838: variable '__network_is_ostree' from source: set_fact 44071 1727204704.40842: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204704.40845: when evaluation is False, skipping this task 44071 1727204704.40848: _execute() done 44071 1727204704.40850: dumping result to json 44071 1727204704.40853: done dumping result, returning 44071 1727204704.40856: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-c964-7471-000000001d85] 44071 1727204704.40858: sending task result for task 127b8e07-fff9-c964-7471-000000001d85 44071 1727204704.41017: done sending task result for task 127b8e07-fff9-c964-7471-000000001d85 44071 1727204704.41020: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204704.41102: no more pending results, returning what we have 44071 1727204704.41106: results queue empty 44071 1727204704.41107: checking for any_errors_fatal 44071 1727204704.41119: done checking for any_errors_fatal 44071 
1727204704.41120: checking for max_fail_percentage 44071 1727204704.41122: done checking for max_fail_percentage 44071 1727204704.41123: checking to see if all hosts have failed and the running result is not ok 44071 1727204704.41124: done checking to see if all hosts have failed 44071 1727204704.41125: getting the remaining hosts for this loop 44071 1727204704.41126: done getting the remaining hosts for this loop 44071 1727204704.41132: getting the next task for host managed-node2 44071 1727204704.41144: done getting next task for host managed-node2 44071 1727204704.41150: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204704.41158: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204704.41192: getting variables 44071 1727204704.41194: in VariableManager get_vars() 44071 1727204704.41247: Calling all_inventory to load vars for managed-node2 44071 1727204704.41250: Calling groups_inventory to load vars for managed-node2 44071 1727204704.41253: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204704.41527: Calling all_plugins_play to load vars for managed-node2 44071 1727204704.41533: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204704.41538: Calling groups_plugins_play to load vars for managed-node2 44071 1727204704.43509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204704.45926: done with get_vars() 44071 1727204704.45964: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:05:04 -0400 (0:00:00.081) 0:01:56.777 ***** 44071 1727204704.46087: entering _queue_task() for managed-node2/service_facts 44071 1727204704.46501: worker is 1 (out of 1 available) 44071 1727204704.46515: exiting _queue_task() for managed-node2/service_facts 44071 1727204704.46530: done queuing things up, now waiting for results queue to drain 44071 1727204704.46531: waiting for pending results... 
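The next task, queued above from roles/network/tasks/set_facts.yml:21, runs the service_facts module over the multiplexed SSH connection and returns the large ansible_facts.services mapping dumped further down in this trace. A minimal sketch of gathering and then consulting that mapping follows; the second task and the nm_running variable are illustrative assumptions, not part of the role:

    - name: Check which services are running
      service_facts:

    - name: Example only - record whether NetworkManager is running
      set_fact:
        # key and fields match the shape of the JSON result in the trace below,
        # e.g. ansible_facts.services['NetworkManager.service'].state == "running"
        nm_running: "{{ ansible_facts.services['NetworkManager.service'].state == 'running' }}"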
44071 1727204704.46925: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204704.47274: in run() - task 127b8e07-fff9-c964-7471-000000001d87 44071 1727204704.47278: variable 'ansible_search_path' from source: unknown 44071 1727204704.47281: variable 'ansible_search_path' from source: unknown 44071 1727204704.47283: calling self._execute() 44071 1727204704.47287: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204704.47290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204704.47292: variable 'omit' from source: magic vars 44071 1727204704.47703: variable 'ansible_distribution_major_version' from source: facts 44071 1727204704.47713: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204704.47720: variable 'omit' from source: magic vars 44071 1727204704.47942: variable 'omit' from source: magic vars 44071 1727204704.47985: variable 'omit' from source: magic vars 44071 1727204704.48085: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204704.48149: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204704.48272: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204704.48276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204704.48279: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204704.48373: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204704.48378: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204704.48381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204704.48615: Set connection var ansible_connection to ssh 44071 1727204704.48623: Set connection var ansible_timeout to 10 44071 1727204704.48629: Set connection var ansible_pipelining to False 44071 1727204704.48638: Set connection var ansible_shell_type to sh 44071 1727204704.48641: Set connection var ansible_shell_executable to /bin/sh 44071 1727204704.48653: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204704.48803: variable 'ansible_shell_executable' from source: unknown 44071 1727204704.48807: variable 'ansible_connection' from source: unknown 44071 1727204704.48811: variable 'ansible_module_compression' from source: unknown 44071 1727204704.48813: variable 'ansible_shell_type' from source: unknown 44071 1727204704.48817: variable 'ansible_shell_executable' from source: unknown 44071 1727204704.48819: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204704.48822: variable 'ansible_pipelining' from source: unknown 44071 1727204704.48825: variable 'ansible_timeout' from source: unknown 44071 1727204704.48830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204704.49342: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204704.49356: variable 'omit' from source: magic vars 44071 
1727204704.49362: starting attempt loop 44071 1727204704.49367: running the handler 44071 1727204704.49385: _low_level_execute_command(): starting 44071 1727204704.49398: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204704.51604: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204704.51612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204704.51686: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204704.51715: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204704.51832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204704.53611: stdout chunk (state=3): >>>/root <<< 44071 1727204704.53897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204704.53939: stderr chunk (state=3): >>><<< 44071 1727204704.54023: stdout chunk (state=3): >>><<< 44071 1727204704.54044: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204704.54070: _low_level_execute_command(): starting 44071 1727204704.54122: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204704.5405247-50770-213709871652320 `" && echo ansible-tmp-1727204704.5405247-50770-213709871652320="` echo 
/root/.ansible/tmp/ansible-tmp-1727204704.5405247-50770-213709871652320 `" ) && sleep 0' 44071 1727204704.54813: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204704.54832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204704.54907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204704.54911: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204704.54932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204704.55038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204704.57227: stdout chunk (state=3): >>>ansible-tmp-1727204704.5405247-50770-213709871652320=/root/.ansible/tmp/ansible-tmp-1727204704.5405247-50770-213709871652320 <<< 44071 1727204704.57604: stdout chunk (state=3): >>><<< 44071 1727204704.57608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204704.57610: stderr chunk (state=3): >>><<< 44071 1727204704.57613: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204704.5405247-50770-213709871652320=/root/.ansible/tmp/ansible-tmp-1727204704.5405247-50770-213709871652320 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204704.57617: variable 'ansible_module_compression' from source: unknown 44071 1727204704.57619: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 44071 
1727204704.57621: variable 'ansible_facts' from source: unknown 44071 1727204704.57698: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204704.5405247-50770-213709871652320/AnsiballZ_service_facts.py 44071 1727204704.57943: Sending initial data 44071 1727204704.57953: Sent initial data (162 bytes) 44071 1727204704.58504: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204704.58522: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204704.58539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204704.58582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204704.58595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204704.58608: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204704.58692: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204704.58716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204704.58816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204704.60430: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204704.60531: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204704.60605: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp_8e9baw4 /root/.ansible/tmp/ansible-tmp-1727204704.5405247-50770-213709871652320/AnsiballZ_service_facts.py <<< 44071 1727204704.60617: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204704.5405247-50770-213709871652320/AnsiballZ_service_facts.py" <<< 44071 1727204704.60675: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp_8e9baw4" to remote "/root/.ansible/tmp/ansible-tmp-1727204704.5405247-50770-213709871652320/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204704.5405247-50770-213709871652320/AnsiballZ_service_facts.py" <<< 44071 1727204704.61637: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204704.61707: stderr chunk (state=3): >>><<< 44071 1727204704.61722: stdout chunk (state=3): >>><<< 44071 1727204704.61755: done transferring module to remote 44071 1727204704.61775: _low_level_execute_command(): starting 44071 1727204704.61786: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204704.5405247-50770-213709871652320/ /root/.ansible/tmp/ansible-tmp-1727204704.5405247-50770-213709871652320/AnsiballZ_service_facts.py && sleep 0' 44071 1727204704.62484: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204704.62581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204704.62588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204704.62624: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204704.62652: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204704.62672: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204704.62783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204704.64738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204704.64758: stdout chunk (state=3): >>><<< 44071 1727204704.64774: stderr chunk (state=3): >>><<< 44071 1727204704.64882: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204704.64886: _low_level_execute_command(): starting 44071 1727204704.64889: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204704.5405247-50770-213709871652320/AnsiballZ_service_facts.py && sleep 0' 44071 1727204704.65461: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204704.65481: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204704.65495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204704.65519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204704.65622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204704.65647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204704.65760: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204706.92288: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": 
{"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "stati<<< 44071 1727204706.92315: stdout chunk (state=3): >>>c", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": 
{"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev<<< 44071 1727204706.92354: stdout chunk (state=3): >>>-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 44071 1727204706.93931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204706.93995: stderr chunk (state=3): >>><<< 44071 1727204706.93999: stdout chunk (state=3): >>><<< 44071 1727204706.94031: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": 
"systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": 
"systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 
closed. 44071 1727204706.94639: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204704.5405247-50770-213709871652320/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204706.94644: _low_level_execute_command(): starting 44071 1727204706.94650: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204704.5405247-50770-213709871652320/ > /dev/null 2>&1 && sleep 0' 44071 1727204706.95163: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204706.95170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204706.95174: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204706.95176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204706.95229: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204706.95232: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204706.95235: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204706.95312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204706.97211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204706.97281: stderr chunk (state=3): >>><<< 44071 1727204706.97285: stdout chunk (state=3): >>><<< 44071 1727204706.97294: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204706.97301: handler run complete 44071 1727204706.97453: variable 'ansible_facts' from source: unknown 44071 1727204706.97593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204706.97942: variable 'ansible_facts' from source: unknown 44071 1727204706.98062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204706.98218: attempt loop complete, returning result 44071 1727204706.98224: _execute() done 44071 1727204706.98227: dumping result to json 44071 1727204706.98273: done dumping result, returning 44071 1727204706.98284: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-c964-7471-000000001d87] 44071 1727204706.98287: sending task result for task 127b8e07-fff9-c964-7471-000000001d87 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204706.99111: done sending task result for task 127b8e07-fff9-c964-7471-000000001d87 44071 1727204706.99115: WORKER PROCESS EXITING 44071 1727204706.99129: no more pending results, returning what we have 44071 1727204706.99132: results queue empty 44071 1727204706.99135: checking for any_errors_fatal 44071 1727204706.99138: done checking for any_errors_fatal 44071 1727204706.99139: checking for max_fail_percentage 44071 1727204706.99140: done checking for max_fail_percentage 44071 1727204706.99141: checking to see if all hosts have failed and the running result is not ok 44071 1727204706.99141: done checking to see if all hosts have failed 44071 1727204706.99142: getting the remaining hosts for this loop 44071 1727204706.99142: done getting the remaining hosts for this loop 44071 1727204706.99145: getting the next task for host managed-node2 44071 1727204706.99151: done getting next task for host managed-node2 44071 1727204706.99153: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204706.99158: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204706.99169: getting variables 44071 1727204706.99171: in VariableManager get_vars() 44071 1727204706.99201: Calling all_inventory to load vars for managed-node2 44071 1727204706.99203: Calling groups_inventory to load vars for managed-node2 44071 1727204706.99204: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204706.99213: Calling all_plugins_play to load vars for managed-node2 44071 1727204706.99214: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204706.99222: Calling groups_plugins_play to load vars for managed-node2 44071 1727204707.00307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204707.01561: done with get_vars() 44071 1727204707.01598: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:05:07 -0400 (0:00:02.556) 0:01:59.333 ***** 44071 1727204707.01686: entering _queue_task() for managed-node2/package_facts 44071 1727204707.01997: worker is 1 (out of 1 available) 44071 1727204707.02013: exiting _queue_task() for managed-node2/package_facts 44071 1727204707.02028: done queuing things up, now waiting for results queue to drain 44071 1727204707.02030: waiting for pending results... 
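(Note on the facts gathered above, before the package_facts executor output begins: the service_facts result returned in the "_low_level_execute_command() done" entry is a single JSON document whose ansible_facts.services key maps each unit name to a dict with name, state, status, and source; the package_facts task queued here returns ansible_facts.packages in a similar per-name layout. The following is a minimal offline sketch, not part of this run: the file name facts.json and the filtering helper are hypothetical, and only the key layout is taken from the output shown above.)

import json

# Load a captured module result, e.g. the stdout echoed in the
# "_low_level_execute_command() done: rc=0, stdout=..." entry above
# (hypothetical file name; adjust to wherever the JSON was saved).
with open("facts.json") as fh:
    result = json.load(fh)

services = result["ansible_facts"]["services"]

# Keep only units the module reported as running; this mirrors the kind of
# check a role conditional such as
#   when: ansible_facts.services['sshd.service'].state == 'running'
# would perform against the same structure.
running = {name: svc for name, svc in services.items() if svc["state"] == "running"}

for name, svc in sorted(running.items()):
    print(f"{name}: status={svc['status']} source={svc['source']}")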
44071 1727204707.02245: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204707.02386: in run() - task 127b8e07-fff9-c964-7471-000000001d88 44071 1727204707.02400: variable 'ansible_search_path' from source: unknown 44071 1727204707.02404: variable 'ansible_search_path' from source: unknown 44071 1727204707.02472: calling self._execute() 44071 1727204707.02527: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204707.02532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204707.02543: variable 'omit' from source: magic vars 44071 1727204707.02882: variable 'ansible_distribution_major_version' from source: facts 44071 1727204707.02893: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204707.02899: variable 'omit' from source: magic vars 44071 1727204707.02964: variable 'omit' from source: magic vars 44071 1727204707.02991: variable 'omit' from source: magic vars 44071 1727204707.03030: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204707.03062: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204707.03081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204707.03096: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204707.03108: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204707.03133: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204707.03145: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204707.03148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204707.03225: Set connection var ansible_connection to ssh 44071 1727204707.03229: Set connection var ansible_timeout to 10 44071 1727204707.03239: Set connection var ansible_pipelining to False 44071 1727204707.03242: Set connection var ansible_shell_type to sh 44071 1727204707.03245: Set connection var ansible_shell_executable to /bin/sh 44071 1727204707.03255: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204707.03277: variable 'ansible_shell_executable' from source: unknown 44071 1727204707.03280: variable 'ansible_connection' from source: unknown 44071 1727204707.03283: variable 'ansible_module_compression' from source: unknown 44071 1727204707.03286: variable 'ansible_shell_type' from source: unknown 44071 1727204707.03288: variable 'ansible_shell_executable' from source: unknown 44071 1727204707.03291: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204707.03293: variable 'ansible_pipelining' from source: unknown 44071 1727204707.03297: variable 'ansible_timeout' from source: unknown 44071 1727204707.03302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204707.03466: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204707.03481: variable 'omit' from source: magic vars 44071 
1727204707.03484: starting attempt loop 44071 1727204707.03487: running the handler 44071 1727204707.03500: _low_level_execute_command(): starting 44071 1727204707.03507: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204707.04068: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204707.04072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204707.04076: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204707.04078: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204707.04081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204707.04136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204707.04139: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204707.04142: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204707.04225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204707.05939: stdout chunk (state=3): >>>/root <<< 44071 1727204707.06050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204707.06117: stderr chunk (state=3): >>><<< 44071 1727204707.06121: stdout chunk (state=3): >>><<< 44071 1727204707.06143: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204707.06156: _low_level_execute_command(): starting 44071 1727204707.06170: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204707.0614204-50849-171648775356073 `" && echo ansible-tmp-1727204707.0614204-50849-171648775356073="` echo /root/.ansible/tmp/ansible-tmp-1727204707.0614204-50849-171648775356073 `" ) && sleep 0' 44071 1727204707.06667: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204707.06671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204707.06674: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204707.06684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204707.06687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204707.06732: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204707.06736: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204707.06817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204707.08793: stdout chunk (state=3): >>>ansible-tmp-1727204707.0614204-50849-171648775356073=/root/.ansible/tmp/ansible-tmp-1727204707.0614204-50849-171648775356073 <<< 44071 1727204707.08962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204707.08969: stderr chunk (state=3): >>><<< 44071 1727204707.08972: stdout chunk (state=3): >>><<< 44071 1727204707.08989: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204707.0614204-50849-171648775356073=/root/.ansible/tmp/ansible-tmp-1727204707.0614204-50849-171648775356073 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 44071 1727204707.09033: variable 'ansible_module_compression' from source: unknown 44071 1727204707.09079: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 44071 1727204707.09136: variable 'ansible_facts' from source: unknown 44071 1727204707.09261: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204707.0614204-50849-171648775356073/AnsiballZ_package_facts.py 44071 1727204707.09390: Sending initial data 44071 1727204707.09394: Sent initial data (162 bytes) 44071 1727204707.09891: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204707.09895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204707.09898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44071 1727204707.09900: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204707.09903: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204707.09957: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204707.09961: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204707.09963: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204707.10037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204707.11653: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204707.11720: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204707.11790: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpdkzi8iua /root/.ansible/tmp/ansible-tmp-1727204707.0614204-50849-171648775356073/AnsiballZ_package_facts.py <<< 44071 1727204707.11797: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204707.0614204-50849-171648775356073/AnsiballZ_package_facts.py" <<< 44071 1727204707.11860: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpdkzi8iua" to remote "/root/.ansible/tmp/ansible-tmp-1727204707.0614204-50849-171648775356073/AnsiballZ_package_facts.py" <<< 44071 1727204707.11863: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204707.0614204-50849-171648775356073/AnsiballZ_package_facts.py" <<< 44071 1727204707.13085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204707.13164: stderr chunk (state=3): >>><<< 44071 1727204707.13170: stdout chunk (state=3): >>><<< 44071 1727204707.13189: done transferring module to remote 44071 1727204707.13200: _low_level_execute_command(): starting 44071 1727204707.13205: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204707.0614204-50849-171648775356073/ /root/.ansible/tmp/ansible-tmp-1727204707.0614204-50849-171648775356073/AnsiballZ_package_facts.py && sleep 0' 44071 1727204707.13698: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204707.13702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204707.13705: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204707.13711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204707.13757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204707.13760: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204707.13836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204707.15660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204707.15720: stderr chunk (state=3): >>><<< 44071 1727204707.15726: stdout chunk (state=3): >>><<< 44071 1727204707.15740: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204707.15744: _low_level_execute_command(): starting 44071 1727204707.15747: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204707.0614204-50849-171648775356073/AnsiballZ_package_facts.py && sleep 0' 44071 1727204707.16244: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204707.16248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204707.16251: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204707.16253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204707.16255: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204707.16311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204707.16314: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204707.16319: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204707.16396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204707.78881: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"na<<< 44071 1727204707.78999: stdout chunk (state=3): >>>me": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": 
"libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": 
"6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1"<<< 44071 1727204707.79102: stdout chunk (state=3): >>>, "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", 
"version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}]<<< 44071 1727204707.79125: stdout chunk (state=3): >>>, "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": 
"5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": 
[{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": 
[{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 44071 1727204707.81050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204707.81058: stderr chunk (state=3): >>><<< 44071 1727204707.81060: stdout chunk (state=3): >>><<< 44071 1727204707.81282: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": 
"libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", 
"release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": 
[{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": 
"3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", 
"version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", 
"version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": 
"1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": 
"wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": 
"python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204707.84360: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204707.0614204-50849-171648775356073/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204707.84397: _low_level_execute_command(): starting 44071 1727204707.84409: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204707.0614204-50849-171648775356073/ > /dev/null 2>&1 && sleep 0' 44071 1727204707.85119: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204707.85141: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204707.85159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204707.85229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 44071 1727204707.85285: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204707.85304: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204707.85338: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204707.85451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204707.87431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204707.87528: stderr chunk (state=3): >>><<< 44071 1727204707.87672: stdout chunk (state=3): >>><<< 44071 1727204707.87676: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204707.87679: handler run complete 44071 1727204707.88844: variable 'ansible_facts' from source: unknown 44071 1727204707.89543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204707.92618: variable 'ansible_facts' from source: unknown 44071 1727204707.93607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204707.95645: attempt loop complete, returning result 44071 1727204707.95683: _execute() done 44071 1727204707.95695: dumping result to json 44071 1727204707.96021: done dumping result, returning 44071 1727204707.96063: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-c964-7471-000000001d88] 44071 1727204707.96068: sending task result for task 127b8e07-fff9-c964-7471-000000001d88 44071 1727204707.99535: done sending task result for task 127b8e07-fff9-c964-7471-000000001d88 44071 1727204707.99539: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204707.99722: no more pending results, returning what we have 44071 1727204707.99725: results queue empty 44071 1727204707.99726: checking for any_errors_fatal 44071 1727204707.99732: done checking for any_errors_fatal 44071 1727204707.99735: checking for max_fail_percentage 44071 1727204707.99737: done checking for max_fail_percentage 44071 1727204707.99738: checking to see if all hosts have failed and the running result 
is not ok 44071 1727204707.99739: done checking to see if all hosts have failed 44071 1727204707.99740: getting the remaining hosts for this loop 44071 1727204707.99741: done getting the remaining hosts for this loop 44071 1727204707.99745: getting the next task for host managed-node2 44071 1727204707.99753: done getting next task for host managed-node2 44071 1727204707.99757: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204707.99762: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204707.99982: getting variables 44071 1727204707.99984: in VariableManager get_vars() 44071 1727204708.00022: Calling all_inventory to load vars for managed-node2 44071 1727204708.00025: Calling groups_inventory to load vars for managed-node2 44071 1727204708.00027: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204708.00039: Calling all_plugins_play to load vars for managed-node2 44071 1727204708.00042: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204708.00045: Calling groups_plugins_play to load vars for managed-node2 44071 1727204708.03627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204708.08387: done with get_vars() 44071 1727204708.08438: done getting variables 44071 1727204708.08513: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:05:08 -0400 (0:00:01.068) 0:02:00.402 ***** 44071 1727204708.08556: entering _queue_task() for managed-node2/debug 44071 1727204708.09509: worker is 1 (out of 1 available) 44071 1727204708.09523: exiting _queue_task() for managed-node2/debug 44071 1727204708.09539: done queuing things up, now waiting for results queue to drain 44071 1727204708.09541: waiting for pending results... 
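For reference, the package_facts invocation whose censored result appears above was run with module_args manager=['auto'] and strategy='first', under no_log. A minimal sketch of a role task that would produce that invocation is given below; it is reconstructed from the trace, not copied from the fedora.linux_system_roles.network role source, so the exact wording there may differ:

    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto      # the trace shows manager=['auto'], strategy='first'
        strategy: first
      no_log: true         # matches the "output has been hidden" censoring in the ok: result above

package_facts stores its result under ansible_facts.packages, which is why the trace resolves the 'ansible_facts' variable immediately after the handler completes.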
44071 1727204708.09788: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204708.09912: in run() - task 127b8e07-fff9-c964-7471-000000001d2c 44071 1727204708.09942: variable 'ansible_search_path' from source: unknown 44071 1727204708.09951: variable 'ansible_search_path' from source: unknown 44071 1727204708.10012: calling self._execute() 44071 1727204708.10149: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204708.10163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204708.10181: variable 'omit' from source: magic vars 44071 1727204708.10751: variable 'ansible_distribution_major_version' from source: facts 44071 1727204708.10758: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204708.10761: variable 'omit' from source: magic vars 44071 1727204708.10821: variable 'omit' from source: magic vars 44071 1727204708.10946: variable 'network_provider' from source: set_fact 44071 1727204708.11073: variable 'omit' from source: magic vars 44071 1727204708.11077: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204708.11082: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204708.11102: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204708.11128: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204708.11152: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204708.11199: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204708.11208: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204708.11217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204708.11350: Set connection var ansible_connection to ssh 44071 1727204708.11363: Set connection var ansible_timeout to 10 44071 1727204708.11375: Set connection var ansible_pipelining to False 44071 1727204708.11399: Set connection var ansible_shell_type to sh 44071 1727204708.11402: Set connection var ansible_shell_executable to /bin/sh 44071 1727204708.11414: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204708.11509: variable 'ansible_shell_executable' from source: unknown 44071 1727204708.11513: variable 'ansible_connection' from source: unknown 44071 1727204708.11516: variable 'ansible_module_compression' from source: unknown 44071 1727204708.11518: variable 'ansible_shell_type' from source: unknown 44071 1727204708.11520: variable 'ansible_shell_executable' from source: unknown 44071 1727204708.11522: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204708.11524: variable 'ansible_pipelining' from source: unknown 44071 1727204708.11526: variable 'ansible_timeout' from source: unknown 44071 1727204708.11528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204708.11727: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 44071 1727204708.11731: variable 'omit' from source: magic vars 44071 1727204708.11736: starting attempt loop 44071 1727204708.11739: running the handler 44071 1727204708.11759: handler run complete 44071 1727204708.11784: attempt loop complete, returning result 44071 1727204708.11791: _execute() done 44071 1727204708.11799: dumping result to json 44071 1727204708.11806: done dumping result, returning 44071 1727204708.11817: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-c964-7471-000000001d2c] 44071 1727204708.11825: sending task result for task 127b8e07-fff9-c964-7471-000000001d2c ok: [managed-node2] => {} MSG: Using network provider: nm 44071 1727204708.12152: no more pending results, returning what we have 44071 1727204708.12156: results queue empty 44071 1727204708.12157: checking for any_errors_fatal 44071 1727204708.12171: done checking for any_errors_fatal 44071 1727204708.12172: checking for max_fail_percentage 44071 1727204708.12174: done checking for max_fail_percentage 44071 1727204708.12175: checking to see if all hosts have failed and the running result is not ok 44071 1727204708.12176: done checking to see if all hosts have failed 44071 1727204708.12177: getting the remaining hosts for this loop 44071 1727204708.12179: done getting the remaining hosts for this loop 44071 1727204708.12185: getting the next task for host managed-node2 44071 1727204708.12196: done getting next task for host managed-node2 44071 1727204708.12201: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204708.12208: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204708.12225: getting variables 44071 1727204708.12226: in VariableManager get_vars() 44071 1727204708.12482: Calling all_inventory to load vars for managed-node2 44071 1727204708.12486: Calling groups_inventory to load vars for managed-node2 44071 1727204708.12493: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204708.12507: Calling all_plugins_play to load vars for managed-node2 44071 1727204708.12510: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204708.12513: Calling groups_plugins_play to load vars for managed-node2 44071 1727204708.13114: done sending task result for task 127b8e07-fff9-c964-7471-000000001d2c 44071 1727204708.13118: WORKER PROCESS EXITING 44071 1727204708.15754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204708.20507: done with get_vars() 44071 1727204708.20562: done getting variables 44071 1727204708.20632: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:05:08 -0400 (0:00:00.123) 0:02:00.525 ***** 44071 1727204708.20890: entering _queue_task() for managed-node2/fail 44071 1727204708.21615: worker is 1 (out of 1 available) 44071 1727204708.21636: exiting _queue_task() for managed-node2/fail 44071 1727204708.21653: done queuing things up, now waiting for results queue to drain 44071 1727204708.21655: waiting for pending results... 
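For reference, the 'Print network provider' task that completed above with "Using network provider: nm" (task path roles/network/tasks/main.yml:7) resolves network_provider from an earlier set_fact and prints it via the debug action once the conditional ansible_distribution_major_version != '6' evaluates to True. A hedged sketch of such a task, reconstructed from the trace rather than taken from the role source (the distribution check may be applied by a surrounding block or include rather than on the task itself):

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"   # rendered as "Using network provider: nm" in this run
      when: ansible_distribution_major_version != '6'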
44071 1727204708.22171: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204708.22669: in run() - task 127b8e07-fff9-c964-7471-000000001d2d 44071 1727204708.22697: variable 'ansible_search_path' from source: unknown 44071 1727204708.22706: variable 'ansible_search_path' from source: unknown 44071 1727204708.22755: calling self._execute() 44071 1727204708.23013: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204708.23027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204708.23084: variable 'omit' from source: magic vars 44071 1727204708.24056: variable 'ansible_distribution_major_version' from source: facts 44071 1727204708.24118: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204708.24476: variable 'network_state' from source: role '' defaults 44071 1727204708.24498: Evaluated conditional (network_state != {}): False 44071 1727204708.24538: when evaluation is False, skipping this task 44071 1727204708.24752: _execute() done 44071 1727204708.24756: dumping result to json 44071 1727204708.24759: done dumping result, returning 44071 1727204708.24763: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-c964-7471-000000001d2d] 44071 1727204708.24768: sending task result for task 127b8e07-fff9-c964-7471-000000001d2d 44071 1727204708.25080: done sending task result for task 127b8e07-fff9-c964-7471-000000001d2d 44071 1727204708.25085: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204708.25151: no more pending results, returning what we have 44071 1727204708.25155: results queue empty 44071 1727204708.25156: checking for any_errors_fatal 44071 1727204708.25165: done checking for any_errors_fatal 44071 1727204708.25170: checking for max_fail_percentage 44071 1727204708.25171: done checking for max_fail_percentage 44071 1727204708.25173: checking to see if all hosts have failed and the running result is not ok 44071 1727204708.25173: done checking to see if all hosts have failed 44071 1727204708.25174: getting the remaining hosts for this loop 44071 1727204708.25176: done getting the remaining hosts for this loop 44071 1727204708.25182: getting the next task for host managed-node2 44071 1727204708.25191: done getting next task for host managed-node2 44071 1727204708.25196: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204708.25203: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204708.25239: getting variables 44071 1727204708.25240: in VariableManager get_vars() 44071 1727204708.25396: Calling all_inventory to load vars for managed-node2 44071 1727204708.25399: Calling groups_inventory to load vars for managed-node2 44071 1727204708.25402: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204708.25415: Calling all_plugins_play to load vars for managed-node2 44071 1727204708.25419: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204708.25422: Calling groups_plugins_play to load vars for managed-node2 44071 1727204708.29196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204708.33942: done with get_vars() 44071 1727204708.33993: done getting variables 44071 1727204708.34064: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:05:08 -0400 (0:00:00.134) 0:02:00.659 ***** 44071 1727204708.34311: entering _queue_task() for managed-node2/fail 44071 1727204708.35151: worker is 1 (out of 1 available) 44071 1727204708.35169: exiting _queue_task() for managed-node2/fail 44071 1727204708.35185: done queuing things up, now waiting for results queue to drain 44071 1727204708.35187: waiting for pending results... 
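The skip just above comes from the guard on the abort task at roles/network/tasks/main.yml:11: the role-level condition `ansible_distribution_major_version != '6'` evaluated True, but `network_state != {}` evaluated False (no `network_state` was supplied to the play), so the fail action never ran. A hedged reconstruction of that guard follows; the message text and the explicit initscripts check are assumptions, and only the two evaluated conditions are taken from the log.

# Sketch of the guard behind the skip above; wording and the provider check are assumed.
- name: >-
    Abort applying the network state configuration if using the `network_state`
    variable with the initscripts provider
  ansible.builtin.fail:
    msg: The network_state variable is not supported with the initscripts provider.
  when:
    - network_state != {}                  # evaluated False in this run
    - network_provider == "initscripts"    # assumed second guard; this run uses nm
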
44071 1727204708.35700: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204708.36373: in run() - task 127b8e07-fff9-c964-7471-000000001d2e 44071 1727204708.36378: variable 'ansible_search_path' from source: unknown 44071 1727204708.36381: variable 'ansible_search_path' from source: unknown 44071 1727204708.36385: calling self._execute() 44071 1727204708.36433: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204708.36481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204708.36497: variable 'omit' from source: magic vars 44071 1727204708.37575: variable 'ansible_distribution_major_version' from source: facts 44071 1727204708.37580: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204708.37972: variable 'network_state' from source: role '' defaults 44071 1727204708.37976: Evaluated conditional (network_state != {}): False 44071 1727204708.37979: when evaluation is False, skipping this task 44071 1727204708.37983: _execute() done 44071 1727204708.37985: dumping result to json 44071 1727204708.37988: done dumping result, returning 44071 1727204708.37994: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-c964-7471-000000001d2e] 44071 1727204708.37998: sending task result for task 127b8e07-fff9-c964-7471-000000001d2e 44071 1727204708.38092: done sending task result for task 127b8e07-fff9-c964-7471-000000001d2e 44071 1727204708.38096: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204708.38152: no more pending results, returning what we have 44071 1727204708.38155: results queue empty 44071 1727204708.38156: checking for any_errors_fatal 44071 1727204708.38167: done checking for any_errors_fatal 44071 1727204708.38168: checking for max_fail_percentage 44071 1727204708.38170: done checking for max_fail_percentage 44071 1727204708.38171: checking to see if all hosts have failed and the running result is not ok 44071 1727204708.38171: done checking to see if all hosts have failed 44071 1727204708.38172: getting the remaining hosts for this loop 44071 1727204708.38174: done getting the remaining hosts for this loop 44071 1727204708.38179: getting the next task for host managed-node2 44071 1727204708.38189: done getting next task for host managed-node2 44071 1727204708.38193: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204708.38201: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204708.38251: getting variables 44071 1727204708.38253: in VariableManager get_vars() 44071 1727204708.38604: Calling all_inventory to load vars for managed-node2 44071 1727204708.38607: Calling groups_inventory to load vars for managed-node2 44071 1727204708.38609: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204708.38620: Calling all_plugins_play to load vars for managed-node2 44071 1727204708.38622: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204708.38625: Calling groups_plugins_play to load vars for managed-node2 44071 1727204708.43481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204708.48047: done with get_vars() 44071 1727204708.48298: done getting variables 44071 1727204708.48371: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:05:08 -0400 (0:00:00.141) 0:02:00.800 ***** 44071 1727204708.48414: entering _queue_task() for managed-node2/fail 44071 1727204708.49249: worker is 1 (out of 1 available) 44071 1727204708.49264: exiting _queue_task() for managed-node2/fail 44071 1727204708.49482: done queuing things up, now waiting for results queue to drain 44071 1727204708.49484: waiting for pending results... 
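The version-below-8 abort at main.yml:18 was skipped for the same reason: `network_state` is empty in this play. For contrast, a hypothetical `network_state` payload that would make the `network_state != {}` guard evaluate True is shown below; the Nmstate-style shape is an assumption and the interface name is made up.

# Hypothetical example only; not part of this play's variables.
network_state:
  interfaces:
    - name: eth1
      type: ethernet
      state: up
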
44071 1727204708.49988: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204708.50286: in run() - task 127b8e07-fff9-c964-7471-000000001d2f 44071 1727204708.50311: variable 'ansible_search_path' from source: unknown 44071 1727204708.50320: variable 'ansible_search_path' from source: unknown 44071 1727204708.50370: calling self._execute() 44071 1727204708.50771: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204708.50776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204708.50779: variable 'omit' from source: magic vars 44071 1727204708.51528: variable 'ansible_distribution_major_version' from source: facts 44071 1727204708.51550: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204708.51975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204708.56788: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204708.56879: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204708.56929: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204708.56983: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204708.57016: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204708.57119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204708.57187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204708.57222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204708.57283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204708.57308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204708.57431: variable 'ansible_distribution_major_version' from source: facts 44071 1727204708.57473: Evaluated conditional (ansible_distribution_major_version | int > 9): True 44071 1727204708.57608: variable 'ansible_distribution' from source: facts 44071 1727204708.57700: variable '__network_rh_distros' from source: role '' defaults 44071 1727204708.57702: Evaluated conditional (ansible_distribution in __network_rh_distros): False 44071 1727204708.57705: when evaluation is False, skipping this task 44071 1727204708.57707: _execute() done 44071 1727204708.57708: dumping result to json 44071 1727204708.57711: done dumping result, returning 44071 1727204708.57713: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-c964-7471-000000001d2f] 44071 1727204708.57715: sending task result for task 127b8e07-fff9-c964-7471-000000001d2f 44071 1727204708.57796: done sending task result for task 127b8e07-fff9-c964-7471-000000001d2f 44071 1727204708.57800: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 44071 1727204708.57859: no more pending results, returning what we have 44071 1727204708.57863: results queue empty 44071 1727204708.57868: checking for any_errors_fatal 44071 1727204708.57875: done checking for any_errors_fatal 44071 1727204708.57876: checking for max_fail_percentage 44071 1727204708.57878: done checking for max_fail_percentage 44071 1727204708.57879: checking to see if all hosts have failed and the running result is not ok 44071 1727204708.57880: done checking to see if all hosts have failed 44071 1727204708.57881: getting the remaining hosts for this loop 44071 1727204708.57884: done getting the remaining hosts for this loop 44071 1727204708.57889: getting the next task for host managed-node2 44071 1727204708.57900: done getting next task for host managed-node2 44071 1727204708.57905: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204708.57911: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204708.57943: getting variables 44071 1727204708.57946: in VariableManager get_vars() 44071 1727204708.58004: Calling all_inventory to load vars for managed-node2 44071 1727204708.58007: Calling groups_inventory to load vars for managed-node2 44071 1727204708.58010: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204708.58024: Calling all_plugins_play to load vars for managed-node2 44071 1727204708.58028: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204708.58031: Calling groups_plugins_play to load vars for managed-node2 44071 1727204708.60507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204708.62780: done with get_vars() 44071 1727204708.62823: done getting variables 44071 1727204708.62897: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:05:08 -0400 (0:00:00.145) 0:02:00.945 ***** 44071 1727204708.62936: entering _queue_task() for managed-node2/dnf 44071 1727204708.63473: worker is 1 (out of 1 available) 44071 1727204708.63489: exiting _queue_task() for managed-node2/dnf 44071 1727204708.63501: done queuing things up, now waiting for results queue to drain 44071 1727204708.63502: waiting for pending results... 
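The EL10-or-later teaming abort at main.yml:25 shows the short-circuit order of its `when` list: `ansible_distribution_major_version | int > 9` evaluated True, then `ansible_distribution in __network_rh_distros` evaluated False, so evaluation stopped there and the task was skipped (the managed node is not one of the RH-family distributions listed in `__network_rh_distros`). A hedged reconstruction is below; the message text and any further guard on team interfaces are assumptions.

# Sketch of the guard order observed above; only the two conditions are from the log.
- name: >-
    Abort applying teaming configuration if the system version of the managed
    host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later.
  when:
    - ansible_distribution_major_version | int > 9    # True in this run
    - ansible_distribution in __network_rh_distros    # False, so the task was skipped
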
44071 1727204708.63871: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204708.63944: in run() - task 127b8e07-fff9-c964-7471-000000001d30 44071 1727204708.63975: variable 'ansible_search_path' from source: unknown 44071 1727204708.63985: variable 'ansible_search_path' from source: unknown 44071 1727204708.64038: calling self._execute() 44071 1727204708.64174: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204708.64178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204708.64236: variable 'omit' from source: magic vars 44071 1727204708.64663: variable 'ansible_distribution_major_version' from source: facts 44071 1727204708.64692: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204708.64948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204708.67723: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204708.67876: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204708.67881: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204708.67900: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204708.67932: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204708.68036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204708.68100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204708.68132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204708.68181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204708.68208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204708.68356: variable 'ansible_distribution' from source: facts 44071 1727204708.68371: variable 'ansible_distribution_major_version' from source: facts 44071 1727204708.68384: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 44071 1727204708.68888: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204708.68956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204708.69116: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204708.69149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204708.69200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204708.69288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204708.69401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204708.69649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204708.69652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204708.69655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204708.69657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204708.69800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204708.69832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204708.70085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204708.70089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204708.70091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204708.70477: variable 'network_connections' from source: include params 44071 1727204708.70497: variable 'interface' from source: play vars 44071 1727204708.70700: variable 'interface' from source: play vars 44071 1727204708.70883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204708.71425: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204708.71524: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204708.71562: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204708.71773: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204708.71921: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204708.71950: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204708.72318: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204708.72322: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204708.72362: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204708.72962: variable 'network_connections' from source: include params 44071 1727204708.73112: variable 'interface' from source: play vars 44071 1727204708.73194: variable 'interface' from source: play vars 44071 1727204708.73302: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204708.73312: when evaluation is False, skipping this task 44071 1727204708.73321: _execute() done 44071 1727204708.73329: dumping result to json 44071 1727204708.73471: done dumping result, returning 44071 1727204708.73475: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000001d30] 44071 1727204708.73478: sending task result for task 127b8e07-fff9-c964-7471-000000001d30 44071 1727204708.73623: done sending task result for task 127b8e07-fff9-c964-7471-000000001d30 44071 1727204708.73627: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204708.73703: no more pending results, returning what we have 44071 1727204708.73707: results queue empty 44071 1727204708.73708: checking for any_errors_fatal 44071 1727204708.73717: done checking for any_errors_fatal 44071 1727204708.73719: checking for max_fail_percentage 44071 1727204708.73720: done checking for max_fail_percentage 44071 1727204708.73722: checking to see if all hosts have failed and the running result is not ok 44071 1727204708.73723: done checking to see if all hosts have failed 44071 1727204708.73723: getting the remaining hosts for this loop 44071 1727204708.73725: done getting the remaining hosts for this loop 44071 1727204708.73731: getting the next task for host managed-node2 44071 1727204708.73741: done getting next task for host managed-node2 44071 1727204708.73746: ^ task is: TASK: fedora.linux_system_roles.network : Check if 
updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204708.73752: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204708.73787: getting variables 44071 1727204708.73789: in VariableManager get_vars() 44071 1727204708.73842: Calling all_inventory to load vars for managed-node2 44071 1727204708.73845: Calling groups_inventory to load vars for managed-node2 44071 1727204708.73848: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204708.73861: Calling all_plugins_play to load vars for managed-node2 44071 1727204708.74171: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204708.74178: Calling groups_plugins_play to load vars for managed-node2 44071 1727204708.78247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204708.82437: done with get_vars() 44071 1727204708.82494: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204708.82588: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:05:08 -0400 (0:00:00.196) 0:02:01.142 ***** 44071 1727204708.82630: entering _queue_task() for managed-node2/yum 44071 1727204708.83300: worker is 1 (out of 1 available) 44071 1727204708.83316: exiting _queue_task() for managed-node2/yum 44071 1727204708.83329: done queuing things up, now waiting for results queue to drain 44071 1727204708.83331: waiting for pending results... 
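The DNF availability check at main.yml:36 was reached (the gate `ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7` evaluated True) but skipped because `__network_wireless_connections_defined or __network_team_connections_defined` evaluated False: the single connection in `network_connections` is neither wireless nor team. A sketch of what such a check-mode dnf task could look like is below; the package list, `state`, and `check_mode` usage are assumptions, and only the two conditions come from the log.

# Sketch only; arguments are assumed, conditions mirror the evaluations above.
- name: >-
    Check if updates for network packages are available through the DNF package
    manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"
    state: latest
  check_mode: true
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined
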
44071 1727204708.83585: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204708.83638: in run() - task 127b8e07-fff9-c964-7471-000000001d31 44071 1727204708.83667: variable 'ansible_search_path' from source: unknown 44071 1727204708.83712: variable 'ansible_search_path' from source: unknown 44071 1727204708.83735: calling self._execute() 44071 1727204708.83869: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204708.83883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204708.83897: variable 'omit' from source: magic vars 44071 1727204708.84367: variable 'ansible_distribution_major_version' from source: facts 44071 1727204708.84475: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204708.84605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204708.87693: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204708.88077: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204708.88083: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204708.88114: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204708.88156: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204708.88258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204708.88449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204708.88581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204708.88635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204708.88691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204708.88974: variable 'ansible_distribution_major_version' from source: facts 44071 1727204708.89017: Evaluated conditional (ansible_distribution_major_version | int < 8): False 44071 1727204708.89061: when evaluation is False, skipping this task 44071 1727204708.89071: _execute() done 44071 1727204708.89078: dumping result to json 44071 1727204708.89085: done dumping result, returning 44071 1727204708.89096: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000001d31] 44071 
1727204708.89109: sending task result for task 127b8e07-fff9-c964-7471-000000001d31 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 44071 1727204708.89458: no more pending results, returning what we have 44071 1727204708.89461: results queue empty 44071 1727204708.89463: checking for any_errors_fatal 44071 1727204708.89474: done checking for any_errors_fatal 44071 1727204708.89475: checking for max_fail_percentage 44071 1727204708.89477: done checking for max_fail_percentage 44071 1727204708.89478: checking to see if all hosts have failed and the running result is not ok 44071 1727204708.89479: done checking to see if all hosts have failed 44071 1727204708.89480: getting the remaining hosts for this loop 44071 1727204708.89482: done getting the remaining hosts for this loop 44071 1727204708.89487: getting the next task for host managed-node2 44071 1727204708.89497: done getting next task for host managed-node2 44071 1727204708.89502: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204708.89508: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204708.89536: getting variables 44071 1727204708.89538: in VariableManager get_vars() 44071 1727204708.89706: Calling all_inventory to load vars for managed-node2 44071 1727204708.89709: Calling groups_inventory to load vars for managed-node2 44071 1727204708.89712: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204708.89726: Calling all_plugins_play to load vars for managed-node2 44071 1727204708.89729: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204708.89733: Calling groups_plugins_play to load vars for managed-node2 44071 1727204708.90867: done sending task result for task 127b8e07-fff9-c964-7471-000000001d31 44071 1727204708.90871: WORKER PROCESS EXITING 44071 1727204708.94043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204708.96211: done with get_vars() 44071 1727204708.96256: done getting variables 44071 1727204708.96327: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:05:08 -0400 (0:00:00.137) 0:02:01.280 ***** 44071 1727204708.96368: entering _queue_task() for managed-node2/fail 44071 1727204708.96783: worker is 1 (out of 1 available) 44071 1727204708.96800: exiting _queue_task() for managed-node2/fail 44071 1727204708.96815: done queuing things up, now waiting for results queue to drain 44071 1727204708.96818: waiting for pending results... 
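The YUM-side counterpart at main.yml:48 is the mirror image: it only applies to older hosts, and `ansible_distribution_major_version | int < 8` evaluated False here, so the task was skipped before any other condition was checked. Note also the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line above: on this host the yum action plugin is resolved to the dnf one. The sketch below assumes the same wireless/team guard as the DNF task; that second condition was never evaluated in this run.

# Sketch only; the second condition is assumed, the first mirrors the logged skip reason.
- name: >-
    Check if updates for network packages are available through the YUM package
    manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ network_packages }}"
    state: latest
  check_mode: true
  when:
    - ansible_distribution_major_version | int < 8
    - __network_wireless_connections_defined or __network_team_connections_defined
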
44071 1727204708.97153: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204708.97343: in run() - task 127b8e07-fff9-c964-7471-000000001d32 44071 1727204708.97371: variable 'ansible_search_path' from source: unknown 44071 1727204708.97381: variable 'ansible_search_path' from source: unknown 44071 1727204708.97432: calling self._execute() 44071 1727204708.97553: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204708.97571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204708.97586: variable 'omit' from source: magic vars 44071 1727204708.98027: variable 'ansible_distribution_major_version' from source: facts 44071 1727204708.98048: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204708.98188: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204708.98417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204709.01019: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204709.01113: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204709.01162: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204709.01210: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204709.01244: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204709.01343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204709.01400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204709.01441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204709.01494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204709.01512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204709.01578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204709.01608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204709.01640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204709.01691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204709.01711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204709.01767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204709.01798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204709.01829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204709.01881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204709.01901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204709.02109: variable 'network_connections' from source: include params 44071 1727204709.02127: variable 'interface' from source: play vars 44071 1727204709.02212: variable 'interface' from source: play vars 44071 1727204709.02304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204709.02502: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204709.02554: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204709.02595: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204709.02671: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204709.02691: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204709.02717: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204709.02752: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204709.02789: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204709.02855: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204709.03155: variable 'network_connections' 
from source: include params 44071 1727204709.03270: variable 'interface' from source: play vars 44071 1727204709.03274: variable 'interface' from source: play vars 44071 1727204709.03288: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204709.03297: when evaluation is False, skipping this task 44071 1727204709.03303: _execute() done 44071 1727204709.03311: dumping result to json 44071 1727204709.03318: done dumping result, returning 44071 1727204709.03330: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000001d32] 44071 1727204709.03340: sending task result for task 127b8e07-fff9-c964-7471-000000001d32 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204709.03626: no more pending results, returning what we have 44071 1727204709.03630: results queue empty 44071 1727204709.03631: checking for any_errors_fatal 44071 1727204709.03641: done checking for any_errors_fatal 44071 1727204709.03641: checking for max_fail_percentage 44071 1727204709.03643: done checking for max_fail_percentage 44071 1727204709.03644: checking to see if all hosts have failed and the running result is not ok 44071 1727204709.03645: done checking to see if all hosts have failed 44071 1727204709.03646: getting the remaining hosts for this loop 44071 1727204709.03648: done getting the remaining hosts for this loop 44071 1727204709.03653: getting the next task for host managed-node2 44071 1727204709.03663: done getting next task for host managed-node2 44071 1727204709.03670: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 44071 1727204709.03676: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204709.03706: getting variables 44071 1727204709.03708: in VariableManager get_vars() 44071 1727204709.03760: Calling all_inventory to load vars for managed-node2 44071 1727204709.03763: Calling groups_inventory to load vars for managed-node2 44071 1727204709.03972: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204709.03986: Calling all_plugins_play to load vars for managed-node2 44071 1727204709.03989: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204709.03992: Calling groups_plugins_play to load vars for managed-node2 44071 1727204709.04705: done sending task result for task 127b8e07-fff9-c964-7471-000000001d32 44071 1727204709.04710: WORKER PROCESS EXITING 44071 1727204709.06746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204709.09154: done with get_vars() 44071 1727204709.09205: done getting variables 44071 1727204709.09286: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:05:09 -0400 (0:00:00.129) 0:02:01.409 ***** 44071 1727204709.09332: entering _queue_task() for managed-node2/package 44071 1727204709.09883: worker is 1 (out of 1 available) 44071 1727204709.09901: exiting _queue_task() for managed-node2/package 44071 1727204709.09915: done queuing things up, now waiting for results queue to drain 44071 1727204709.09917: waiting for pending results... 
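The consent prompt at main.yml:60 (a fail action behind the same wireless/team guard) was skipped for the same reason as the package checks. For illustration, a hypothetical `network_connections` entry of type `team` that would flip `__network_team_connections_defined` to True, and therefore activate the prompt and the DNF/YUM checks above, is shown below; the name and settings are made up and do not reflect this play, which configures a single connection through the `interface` variable.

# Hypothetical example only; not part of this play.
network_connections:
  - name: team0
    type: team
    state: up
    interface_name: team0
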
44071 1727204709.10155: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 44071 1727204709.10266: in run() - task 127b8e07-fff9-c964-7471-000000001d33 44071 1727204709.10284: variable 'ansible_search_path' from source: unknown 44071 1727204709.10288: variable 'ansible_search_path' from source: unknown 44071 1727204709.10322: calling self._execute() 44071 1727204709.10429: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204709.10435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204709.10445: variable 'omit' from source: magic vars 44071 1727204709.10935: variable 'ansible_distribution_major_version' from source: facts 44071 1727204709.10939: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204709.11135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204709.11443: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204709.11500: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204709.11532: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204709.11635: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204709.11773: variable 'network_packages' from source: role '' defaults 44071 1727204709.11892: variable '__network_provider_setup' from source: role '' defaults 44071 1727204709.11904: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204709.11957: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204709.11964: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204709.12012: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204709.12177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204709.14534: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204709.14579: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204709.14620: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204709.14658: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204709.14728: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204709.14860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204709.14871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204709.14876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204709.14879: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204709.14909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204709.14946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204709.14978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204709.14998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204709.15043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204709.15059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204709.15370: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204709.15450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204709.15476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204709.15501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204709.15547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204709.15564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204709.15669: variable 'ansible_python' from source: facts 44071 1727204709.15771: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204709.15795: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204709.15890: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204709.16032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204709.16072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 44071 1727204709.16098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204709.16149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204709.16175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204709.16251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204709.16263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204709.16294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204709.16371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204709.16375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204709.16505: variable 'network_connections' from source: include params 44071 1727204709.16514: variable 'interface' from source: play vars 44071 1727204709.16636: variable 'interface' from source: play vars 44071 1727204709.16743: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204709.16776: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204709.16832: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204709.16853: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204709.16907: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204709.17267: variable 'network_connections' from source: include params 44071 1727204709.17277: variable 'interface' from source: play vars 44071 1727204709.17396: variable 'interface' from source: play vars 44071 1727204709.17400: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204709.17490: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204709.17772: variable 'network_connections' from source: include params 44071 
1727204709.17779: variable 'interface' from source: play vars 44071 1727204709.17843: variable 'interface' from source: play vars 44071 1727204709.17868: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204709.17961: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204709.18432: variable 'network_connections' from source: include params 44071 1727204709.18436: variable 'interface' from source: play vars 44071 1727204709.18439: variable 'interface' from source: play vars 44071 1727204709.18441: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204709.18506: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204709.18509: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204709.18674: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204709.18806: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204709.19645: variable 'network_connections' from source: include params 44071 1727204709.19650: variable 'interface' from source: play vars 44071 1727204709.19723: variable 'interface' from source: play vars 44071 1727204709.19732: variable 'ansible_distribution' from source: facts 44071 1727204709.19735: variable '__network_rh_distros' from source: role '' defaults 44071 1727204709.19745: variable 'ansible_distribution_major_version' from source: facts 44071 1727204709.19760: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204709.19992: variable 'ansible_distribution' from source: facts 44071 1727204709.19995: variable '__network_rh_distros' from source: role '' defaults 44071 1727204709.19998: variable 'ansible_distribution_major_version' from source: facts 44071 1727204709.20000: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204709.20284: variable 'ansible_distribution' from source: facts 44071 1727204709.20288: variable '__network_rh_distros' from source: role '' defaults 44071 1727204709.20294: variable 'ansible_distribution_major_version' from source: facts 44071 1727204709.20346: variable 'network_provider' from source: set_fact 44071 1727204709.20367: variable 'ansible_facts' from source: unknown 44071 1727204709.21344: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 44071 1727204709.21348: when evaluation is False, skipping this task 44071 1727204709.21351: _execute() done 44071 1727204709.21358: dumping result to json 44071 1727204709.21360: done dumping result, returning 44071 1727204709.21363: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-c964-7471-000000001d33] 44071 1727204709.21377: sending task result for task 127b8e07-fff9-c964-7471-000000001d33 44071 1727204709.21488: done sending task result for task 127b8e07-fff9-c964-7471-000000001d33 44071 1727204709.21491: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 44071 1727204709.21577: no more pending results, returning what we have 44071 1727204709.21581: results queue empty 44071 1727204709.21582: checking for any_errors_fatal 44071 1727204709.21591: done checking for 
any_errors_fatal 44071 1727204709.21592: checking for max_fail_percentage 44071 1727204709.21594: done checking for max_fail_percentage 44071 1727204709.21595: checking to see if all hosts have failed and the running result is not ok 44071 1727204709.21596: done checking to see if all hosts have failed 44071 1727204709.21596: getting the remaining hosts for this loop 44071 1727204709.21598: done getting the remaining hosts for this loop 44071 1727204709.21603: getting the next task for host managed-node2 44071 1727204709.21612: done getting next task for host managed-node2 44071 1727204709.21617: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204709.21622: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204709.21653: getting variables 44071 1727204709.21655: in VariableManager get_vars() 44071 1727204709.21710: Calling all_inventory to load vars for managed-node2 44071 1727204709.21713: Calling groups_inventory to load vars for managed-node2 44071 1727204709.21716: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204709.21726: Calling all_plugins_play to load vars for managed-node2 44071 1727204709.21729: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204709.21732: Calling groups_plugins_play to load vars for managed-node2 44071 1727204709.23130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204709.25211: done with get_vars() 44071 1727204709.25250: done getting variables 44071 1727204709.25306: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:05:09 -0400 (0:00:00.160) 0:02:01.569 ***** 44071 1727204709.25338: entering _queue_task() for managed-node2/package 44071 1727204709.25658: worker is 1 (out of 1 available) 44071 1727204709.25675: exiting _queue_task() for managed-node2/package 44071 1727204709.25691: done queuing things up, now waiting for results queue to drain 44071 1727204709.25693: waiting for pending results... 
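At this point the "Install packages" task has just been skipped: the when-condition "not network_packages is subset(ansible_facts.packages.keys())" evaluated to False, meaning every package the role wants is already present in the gathered package facts. A minimal sketch of what such a task looks like, assuming the role uses the generic package module with network_packages as the name list (the conditionals are taken from the trace; the module arguments are an assumption, not a quote of the role's tasks/main.yml):

```yaml
# Sketch only: both conditions appear in the trace above; in the real role they
# may be split between a block-level and a task-level when. Arguments assumed.
- name: Install packages
  package:
    name: "{{ network_packages }}"
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - not network_packages is subset(ansible_facts.packages.keys())
```

Because the condition is False, no module is pushed to the remote host at all; the skip result above is produced entirely on the controller.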
44071 1727204709.25904: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204709.26037: in run() - task 127b8e07-fff9-c964-7471-000000001d34 44071 1727204709.26042: variable 'ansible_search_path' from source: unknown 44071 1727204709.26045: variable 'ansible_search_path' from source: unknown 44071 1727204709.26084: calling self._execute() 44071 1727204709.26177: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204709.26183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204709.26193: variable 'omit' from source: magic vars 44071 1727204709.26531: variable 'ansible_distribution_major_version' from source: facts 44071 1727204709.26543: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204709.26643: variable 'network_state' from source: role '' defaults 44071 1727204709.26654: Evaluated conditional (network_state != {}): False 44071 1727204709.26658: when evaluation is False, skipping this task 44071 1727204709.26661: _execute() done 44071 1727204709.26664: dumping result to json 44071 1727204709.26670: done dumping result, returning 44071 1727204709.26679: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-c964-7471-000000001d34] 44071 1727204709.26682: sending task result for task 127b8e07-fff9-c964-7471-000000001d34 44071 1727204709.26795: done sending task result for task 127b8e07-fff9-c964-7471-000000001d34 44071 1727204709.26799: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204709.26854: no more pending results, returning what we have 44071 1727204709.26858: results queue empty 44071 1727204709.26859: checking for any_errors_fatal 44071 1727204709.26869: done checking for any_errors_fatal 44071 1727204709.26870: checking for max_fail_percentage 44071 1727204709.26871: done checking for max_fail_percentage 44071 1727204709.26872: checking to see if all hosts have failed and the running result is not ok 44071 1727204709.26873: done checking to see if all hosts have failed 44071 1727204709.26874: getting the remaining hosts for this loop 44071 1727204709.26875: done getting the remaining hosts for this loop 44071 1727204709.26881: getting the next task for host managed-node2 44071 1727204709.26891: done getting next task for host managed-node2 44071 1727204709.26895: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204709.26900: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204709.26929: getting variables 44071 1727204709.26931: in VariableManager get_vars() 44071 1727204709.26987: Calling all_inventory to load vars for managed-node2 44071 1727204709.26990: Calling groups_inventory to load vars for managed-node2 44071 1727204709.26992: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204709.27003: Calling all_plugins_play to load vars for managed-node2 44071 1727204709.27006: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204709.27009: Calling groups_plugins_play to load vars for managed-node2 44071 1727204709.28231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204709.29451: done with get_vars() 44071 1727204709.29488: done getting variables 44071 1727204709.29540: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:05:09 -0400 (0:00:00.042) 0:02:01.612 ***** 44071 1727204709.29573: entering _queue_task() for managed-node2/package 44071 1727204709.29886: worker is 1 (out of 1 available) 44071 1727204709.29902: exiting _queue_task() for managed-node2/package 44071 1727204709.29917: done queuing things up, now waiting for results queue to drain 44071 1727204709.29919: waiting for pending results... 
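The trace above shows the next task, "Install NetworkManager and nmstate when using network_state variable" (tasks/main.yml:85), being skipped because network_state is empty. A plausible sketch of that task, assuming the package names simply follow the task title (only the task name, the package action plugin, and the when-condition are confirmed by the trace):

```yaml
# Sketch only: package list inferred from the task name, not read from the role.
- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}
```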
44071 1727204709.30138: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204709.30265: in run() - task 127b8e07-fff9-c964-7471-000000001d35 44071 1727204709.30278: variable 'ansible_search_path' from source: unknown 44071 1727204709.30282: variable 'ansible_search_path' from source: unknown 44071 1727204709.30315: calling self._execute() 44071 1727204709.30411: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204709.30418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204709.30427: variable 'omit' from source: magic vars 44071 1727204709.30768: variable 'ansible_distribution_major_version' from source: facts 44071 1727204709.30779: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204709.30881: variable 'network_state' from source: role '' defaults 44071 1727204709.30892: Evaluated conditional (network_state != {}): False 44071 1727204709.30895: when evaluation is False, skipping this task 44071 1727204709.30898: _execute() done 44071 1727204709.30903: dumping result to json 44071 1727204709.30906: done dumping result, returning 44071 1727204709.30912: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-c964-7471-000000001d35] 44071 1727204709.30920: sending task result for task 127b8e07-fff9-c964-7471-000000001d35 44071 1727204709.31037: done sending task result for task 127b8e07-fff9-c964-7471-000000001d35 44071 1727204709.31040: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204709.31093: no more pending results, returning what we have 44071 1727204709.31097: results queue empty 44071 1727204709.31098: checking for any_errors_fatal 44071 1727204709.31107: done checking for any_errors_fatal 44071 1727204709.31107: checking for max_fail_percentage 44071 1727204709.31109: done checking for max_fail_percentage 44071 1727204709.31111: checking to see if all hosts have failed and the running result is not ok 44071 1727204709.31111: done checking to see if all hosts have failed 44071 1727204709.31112: getting the remaining hosts for this loop 44071 1727204709.31114: done getting the remaining hosts for this loop 44071 1727204709.31119: getting the next task for host managed-node2 44071 1727204709.31136: done getting next task for host managed-node2 44071 1727204709.31141: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204709.31147: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204709.31174: getting variables 44071 1727204709.31176: in VariableManager get_vars() 44071 1727204709.31221: Calling all_inventory to load vars for managed-node2 44071 1727204709.31224: Calling groups_inventory to load vars for managed-node2 44071 1727204709.31226: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204709.31244: Calling all_plugins_play to load vars for managed-node2 44071 1727204709.31247: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204709.31250: Calling groups_plugins_play to load vars for managed-node2 44071 1727204709.32427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204709.39395: done with get_vars() 44071 1727204709.39423: done getting variables 44071 1727204709.39467: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:05:09 -0400 (0:00:00.099) 0:02:01.711 ***** 44071 1727204709.39494: entering _queue_task() for managed-node2/service 44071 1727204709.39804: worker is 1 (out of 1 available) 44071 1727204709.39821: exiting _queue_task() for managed-node2/service 44071 1727204709.39836: done queuing things up, now waiting for results queue to drain 44071 1727204709.39838: waiting for pending results... 
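The same pattern repeats for "Install python3-libnmstate when using network_state variable" (tasks/main.yml:96): the false_condition in the skip result is the task's when-clause, and because network_state is still the empty default, the task never reaches the remote host. A minimal sketch under that assumption (module arguments assumed, condition taken from the trace):

```yaml
# Sketch only: the false_condition string above is the task's when-clause.
- name: Install python3-libnmstate when using network_state variable
  package:
    name: python3-libnmstate
    state: present
  when: network_state != {}
```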
44071 1727204709.40051: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204709.40177: in run() - task 127b8e07-fff9-c964-7471-000000001d36 44071 1727204709.40191: variable 'ansible_search_path' from source: unknown 44071 1727204709.40194: variable 'ansible_search_path' from source: unknown 44071 1727204709.40227: calling self._execute() 44071 1727204709.40321: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204709.40326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204709.40335: variable 'omit' from source: magic vars 44071 1727204709.40679: variable 'ansible_distribution_major_version' from source: facts 44071 1727204709.40690: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204709.40797: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204709.40953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204709.42731: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204709.42798: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204709.42830: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204709.42861: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204709.42887: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204709.42957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204709.42981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204709.43002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204709.43033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204709.43046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204709.43086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204709.43103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204709.43124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 44071 1727204709.43156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204709.43168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204709.43199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204709.43220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204709.43241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204709.43269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204709.43280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204709.43421: variable 'network_connections' from source: include params 44071 1727204709.43434: variable 'interface' from source: play vars 44071 1727204709.43497: variable 'interface' from source: play vars 44071 1727204709.43561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204709.43703: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204709.43733: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204709.43758: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204709.43789: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204709.43822: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204709.43841: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204709.43860: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204709.43883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204709.43925: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204709.44110: variable 'network_connections' from source: include params 44071 1727204709.44115: variable 'interface' 
from source: play vars 44071 1727204709.44168: variable 'interface' from source: play vars 44071 1727204709.44187: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204709.44191: when evaluation is False, skipping this task 44071 1727204709.44195: _execute() done 44071 1727204709.44198: dumping result to json 44071 1727204709.44200: done dumping result, returning 44071 1727204709.44210: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000001d36] 44071 1727204709.44213: sending task result for task 127b8e07-fff9-c964-7471-000000001d36 44071 1727204709.44320: done sending task result for task 127b8e07-fff9-c964-7471-000000001d36 44071 1727204709.44331: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204709.44383: no more pending results, returning what we have 44071 1727204709.44386: results queue empty 44071 1727204709.44387: checking for any_errors_fatal 44071 1727204709.44398: done checking for any_errors_fatal 44071 1727204709.44399: checking for max_fail_percentage 44071 1727204709.44401: done checking for max_fail_percentage 44071 1727204709.44402: checking to see if all hosts have failed and the running result is not ok 44071 1727204709.44403: done checking to see if all hosts have failed 44071 1727204709.44403: getting the remaining hosts for this loop 44071 1727204709.44405: done getting the remaining hosts for this loop 44071 1727204709.44410: getting the next task for host managed-node2 44071 1727204709.44420: done getting next task for host managed-node2 44071 1727204709.44424: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204709.44429: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204709.44456: getting variables 44071 1727204709.44458: in VariableManager get_vars() 44071 1727204709.44504: Calling all_inventory to load vars for managed-node2 44071 1727204709.44506: Calling groups_inventory to load vars for managed-node2 44071 1727204709.44508: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204709.44519: Calling all_plugins_play to load vars for managed-node2 44071 1727204709.44521: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204709.44524: Calling groups_plugins_play to load vars for managed-node2 44071 1727204709.45593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204709.46837: done with get_vars() 44071 1727204709.46873: done getting variables 44071 1727204709.46923: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:05:09 -0400 (0:00:00.074) 0:02:01.785 ***** 44071 1727204709.46952: entering _queue_task() for managed-node2/service 44071 1727204709.47254: worker is 1 (out of 1 available) 44071 1727204709.47273: exiting _queue_task() for managed-node2/service 44071 1727204709.47290: done queuing things up, now waiting for results queue to drain 44071 1727204709.47291: waiting for pending results... 44071 1727204709.47512: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204709.47649: in run() - task 127b8e07-fff9-c964-7471-000000001d37 44071 1727204709.47663: variable 'ansible_search_path' from source: unknown 44071 1727204709.47667: variable 'ansible_search_path' from source: unknown 44071 1727204709.47700: calling self._execute() 44071 1727204709.47792: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204709.47800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204709.47808: variable 'omit' from source: magic vars 44071 1727204709.48141: variable 'ansible_distribution_major_version' from source: facts 44071 1727204709.48153: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204709.48282: variable 'network_provider' from source: set_fact 44071 1727204709.48288: variable 'network_state' from source: role '' defaults 44071 1727204709.48298: Evaluated conditional (network_provider == "nm" or network_state != {}): True 44071 1727204709.48304: variable 'omit' from source: magic vars 44071 1727204709.48355: variable 'omit' from source: magic vars 44071 1727204709.48379: variable 'network_service_name' from source: role '' defaults 44071 1727204709.48439: variable 'network_service_name' from source: role '' defaults 44071 1727204709.48516: variable '__network_provider_setup' from source: role '' defaults 44071 1727204709.48521: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204709.48572: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204709.48580: variable '__network_packages_default_nm' from source: role '' defaults 
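Unlike the previous tasks, "Enable and start NetworkManager" (tasks/main.yml:122) passes its conditional: network_state is empty, so network_provider == "nm" must be the true branch, and the service action plugin is loaded to actually run against managed-node2. The variable resolution below pulls in network_service_name, whose __network_service_name_default_nm default suggests the NetworkManager unit. A minimal sketch of such a task, assuming the state/enabled arguments from the task name (only the service module and the conditions are confirmed by the trace):

```yaml
# Sketch only: module arguments assumed; the service name variable is resolved
# from role defaults as shown in the trace above.
- name: Enable and start NetworkManager
  service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
```

Because this task is not skipped, the executor goes on to open the SSH connection, create a remote temp directory, and transfer the AnsiballZ_systemd.py payload, which is exactly what the _low_level_execute_command() entries further down record.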
44071 1727204709.48627: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204709.48797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204709.50893: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204709.50947: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204709.50989: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204709.51018: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204709.51040: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204709.51108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204709.51130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204709.51155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204709.51185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204709.51196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204709.51236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204709.51254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204709.51274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204709.51300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204709.51311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204709.51485: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204709.51579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204709.51598: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204709.51617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204709.51645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204709.51656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204709.51731: variable 'ansible_python' from source: facts 44071 1727204709.51745: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204709.51812: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204709.51870: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204709.51967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204709.51988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204709.52007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204709.52039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204709.52049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204709.52088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204709.52111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204709.52130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204709.52158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204709.52170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204709.52275: variable 'network_connections' from 
source: include params 44071 1727204709.52282: variable 'interface' from source: play vars 44071 1727204709.52342: variable 'interface' from source: play vars 44071 1727204709.52426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204709.52566: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204709.52616: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204709.52652: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204709.52688: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204709.52737: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204709.52758: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204709.52788: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204709.52812: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204709.52854: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204709.53059: variable 'network_connections' from source: include params 44071 1727204709.53066: variable 'interface' from source: play vars 44071 1727204709.53126: variable 'interface' from source: play vars 44071 1727204709.53153: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204709.53216: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204709.53421: variable 'network_connections' from source: include params 44071 1727204709.53426: variable 'interface' from source: play vars 44071 1727204709.53480: variable 'interface' from source: play vars 44071 1727204709.53498: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204709.53558: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204709.53769: variable 'network_connections' from source: include params 44071 1727204709.53773: variable 'interface' from source: play vars 44071 1727204709.53825: variable 'interface' from source: play vars 44071 1727204709.53869: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204709.53914: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204709.53920: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204709.53969: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204709.54121: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204709.54464: variable 'network_connections' from source: include params 44071 1727204709.54469: variable 'interface' from source: play vars 44071 1727204709.54519: variable 'interface' from 
source: play vars 44071 1727204709.54526: variable 'ansible_distribution' from source: facts 44071 1727204709.54529: variable '__network_rh_distros' from source: role '' defaults 44071 1727204709.54537: variable 'ansible_distribution_major_version' from source: facts 44071 1727204709.54548: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204709.54675: variable 'ansible_distribution' from source: facts 44071 1727204709.54679: variable '__network_rh_distros' from source: role '' defaults 44071 1727204709.54684: variable 'ansible_distribution_major_version' from source: facts 44071 1727204709.54690: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204709.54817: variable 'ansible_distribution' from source: facts 44071 1727204709.54822: variable '__network_rh_distros' from source: role '' defaults 44071 1727204709.54825: variable 'ansible_distribution_major_version' from source: facts 44071 1727204709.54853: variable 'network_provider' from source: set_fact 44071 1727204709.54875: variable 'omit' from source: magic vars 44071 1727204709.54901: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204709.54925: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204709.54944: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204709.54961: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204709.54972: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204709.54998: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204709.55002: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204709.55004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204709.55083: Set connection var ansible_connection to ssh 44071 1727204709.55089: Set connection var ansible_timeout to 10 44071 1727204709.55095: Set connection var ansible_pipelining to False 44071 1727204709.55100: Set connection var ansible_shell_type to sh 44071 1727204709.55106: Set connection var ansible_shell_executable to /bin/sh 44071 1727204709.55113: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204709.55136: variable 'ansible_shell_executable' from source: unknown 44071 1727204709.55139: variable 'ansible_connection' from source: unknown 44071 1727204709.55142: variable 'ansible_module_compression' from source: unknown 44071 1727204709.55144: variable 'ansible_shell_type' from source: unknown 44071 1727204709.55146: variable 'ansible_shell_executable' from source: unknown 44071 1727204709.55149: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204709.55152: variable 'ansible_pipelining' from source: unknown 44071 1727204709.55155: variable 'ansible_timeout' from source: unknown 44071 1727204709.55157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204709.55240: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204709.55249: variable 'omit' from source: magic vars 44071 1727204709.55252: starting attempt loop 44071 1727204709.55255: running the handler 44071 1727204709.55321: variable 'ansible_facts' from source: unknown 44071 1727204709.55867: _low_level_execute_command(): starting 44071 1727204709.55871: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204709.56427: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204709.56433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204709.56437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204709.56499: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204709.56505: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204709.56586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204709.58379: stdout chunk (state=3): >>>/root <<< 44071 1727204709.58478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204709.58547: stderr chunk (state=3): >>><<< 44071 1727204709.58551: stdout chunk (state=3): >>><<< 44071 1727204709.58568: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 
1727204709.58580: _low_level_execute_command(): starting 44071 1727204709.58586: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204709.585688-50914-214818321182992 `" && echo ansible-tmp-1727204709.585688-50914-214818321182992="` echo /root/.ansible/tmp/ansible-tmp-1727204709.585688-50914-214818321182992 `" ) && sleep 0' 44071 1727204709.59062: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204709.59069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204709.59098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204709.59102: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204709.59105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204709.59107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204709.59171: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204709.59175: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204709.59177: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204709.59254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204709.61280: stdout chunk (state=3): >>>ansible-tmp-1727204709.585688-50914-214818321182992=/root/.ansible/tmp/ansible-tmp-1727204709.585688-50914-214818321182992 <<< 44071 1727204709.61390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204709.61462: stderr chunk (state=3): >>><<< 44071 1727204709.61467: stdout chunk (state=3): >>><<< 44071 1727204709.61484: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204709.585688-50914-214818321182992=/root/.ansible/tmp/ansible-tmp-1727204709.585688-50914-214818321182992 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204709.61513: variable 'ansible_module_compression' from source: unknown 44071 1727204709.61564: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 44071 1727204709.61618: variable 'ansible_facts' from source: unknown 44071 1727204709.61761: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204709.585688-50914-214818321182992/AnsiballZ_systemd.py 44071 1727204709.61883: Sending initial data 44071 1727204709.61886: Sent initial data (155 bytes) 44071 1727204709.62398: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204709.62403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204709.62410: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204709.62412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204709.62471: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204709.62474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204709.62481: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204709.62593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204709.64253: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204709.64331: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204709.64412: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp9bvz6hem /root/.ansible/tmp/ansible-tmp-1727204709.585688-50914-214818321182992/AnsiballZ_systemd.py <<< 44071 1727204709.64429: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204709.585688-50914-214818321182992/AnsiballZ_systemd.py" <<< 44071 1727204709.64492: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp9bvz6hem" to remote "/root/.ansible/tmp/ansible-tmp-1727204709.585688-50914-214818321182992/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204709.585688-50914-214818321182992/AnsiballZ_systemd.py" <<< 44071 1727204709.66284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204709.66326: stderr chunk (state=3): >>><<< 44071 1727204709.66334: stdout chunk (state=3): >>><<< 44071 1727204709.66368: done transferring module to remote 44071 1727204709.66386: _low_level_execute_command(): starting 44071 1727204709.66412: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204709.585688-50914-214818321182992/ /root/.ansible/tmp/ansible-tmp-1727204709.585688-50914-214818321182992/AnsiballZ_systemd.py && sleep 0' 44071 1727204709.67201: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204709.67258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204709.67341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204709.69355: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204709.69359: stdout chunk (state=3): >>><<< 44071 1727204709.69362: stderr chunk (state=3): >>><<< 44071 1727204709.69472: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204709.69476: _low_level_execute_command(): starting 44071 1727204709.69480: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204709.585688-50914-214818321182992/AnsiballZ_systemd.py && sleep 0' 44071 1727204709.70087: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204709.70118: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204709.70130: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204709.70172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204709.70277: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204710.02691: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", 
"ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4595712", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3525287936", "CPUUsageNSec": "1652581000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitC<<< 44071 1727204710.02704: stdout chunk (state=3): >>>ORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", 
"RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", 
"ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 44071 1727204710.04727: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204710.04794: stderr chunk (state=3): >>><<< 44071 1727204710.04799: stdout chunk (state=3): >>><<< 44071 1727204710.04816: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4595712", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3525287936", "CPUUsageNSec": "1652581000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": 
"infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204710.04969: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204709.585688-50914-214818321182992/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204710.04986: _low_level_execute_command(): starting 44071 1727204710.04991: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204709.585688-50914-214818321182992/ > /dev/null 2>&1 && sleep 0' 44071 1727204710.05595: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204710.05621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204710.05720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204710.07681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204710.07748: stderr 
chunk (state=3): >>><<< 44071 1727204710.07752: stdout chunk (state=3): >>><<< 44071 1727204710.07764: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204710.07773: handler run complete 44071 1727204710.07820: attempt loop complete, returning result 44071 1727204710.07824: _execute() done 44071 1727204710.07826: dumping result to json 44071 1727204710.07844: done dumping result, returning 44071 1727204710.07853: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-c964-7471-000000001d37] 44071 1727204710.07858: sending task result for task 127b8e07-fff9-c964-7471-000000001d37 44071 1727204710.08148: done sending task result for task 127b8e07-fff9-c964-7471-000000001d37 44071 1727204710.08151: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204710.08216: no more pending results, returning what we have 44071 1727204710.08219: results queue empty 44071 1727204710.08220: checking for any_errors_fatal 44071 1727204710.08225: done checking for any_errors_fatal 44071 1727204710.08225: checking for max_fail_percentage 44071 1727204710.08227: done checking for max_fail_percentage 44071 1727204710.08228: checking to see if all hosts have failed and the running result is not ok 44071 1727204710.08229: done checking to see if all hosts have failed 44071 1727204710.08229: getting the remaining hosts for this loop 44071 1727204710.08231: done getting the remaining hosts for this loop 44071 1727204710.08238: getting the next task for host managed-node2 44071 1727204710.08245: done getting next task for host managed-node2 44071 1727204710.08249: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204710.08255: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204710.08275: getting variables 44071 1727204710.08277: in VariableManager get_vars() 44071 1727204710.08314: Calling all_inventory to load vars for managed-node2 44071 1727204710.08317: Calling groups_inventory to load vars for managed-node2 44071 1727204710.08319: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204710.08328: Calling all_plugins_play to load vars for managed-node2 44071 1727204710.08331: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204710.08336: Calling groups_plugins_play to load vars for managed-node2 44071 1727204710.10117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204710.11744: done with get_vars() 44071 1727204710.11779: done getting variables 44071 1727204710.11830: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:05:10 -0400 (0:00:00.649) 0:02:02.435 ***** 44071 1727204710.11867: entering _queue_task() for managed-node2/service 44071 1727204710.12174: worker is 1 (out of 1 available) 44071 1727204710.12191: exiting _queue_task() for managed-node2/service 44071 1727204710.12206: done queuing things up, now waiting for results queue to drain 44071 1727204710.12208: waiting for pending results... 
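Context for the trace above: the preceding "Enable and start NetworkManager" task loaded the 'service' action plugin, which dispatched to the systemd module (the AnsiballZ_systemd.py payload transferred and executed over the multiplexed SSH connection), and the module reported the unit already enabled and started (changed=false), with the full result censored because no_log was in effect. A minimal sketch of the kind of role task that would produce such an invocation, reconstructed from the logged module arguments rather than from the role's actual source in roles/network/tasks/main.yml, might look like:

    # Hypothetical reconstruction from the logged module arguments
    # (name=NetworkManager, state=started, enabled=true, no_log in effect);
    # the real task in roles/network/tasks/main.yml may differ.
    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: NetworkManager
        state: started
        enabled: true
      no_log: true
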
44071 1727204710.12416: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204710.12541: in run() - task 127b8e07-fff9-c964-7471-000000001d38 44071 1727204710.12556: variable 'ansible_search_path' from source: unknown 44071 1727204710.12560: variable 'ansible_search_path' from source: unknown 44071 1727204710.12591: calling self._execute() 44071 1727204710.12682: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204710.12686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204710.12696: variable 'omit' from source: magic vars 44071 1727204710.13068: variable 'ansible_distribution_major_version' from source: facts 44071 1727204710.13261: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204710.13268: variable 'network_provider' from source: set_fact 44071 1727204710.13271: Evaluated conditional (network_provider == "nm"): True 44071 1727204710.13358: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204710.13477: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204710.13686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204710.15594: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204710.15652: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204710.15685: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204710.15712: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204710.15732: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204710.15815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204710.15836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204710.15859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204710.15894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204710.15905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204710.15945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204710.15965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 44071 1727204710.15985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204710.16012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204710.16024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204710.16058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204710.16079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204710.16099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204710.16125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204710.16135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204710.16259: variable 'network_connections' from source: include params 44071 1727204710.16277: variable 'interface' from source: play vars 44071 1727204710.16335: variable 'interface' from source: play vars 44071 1727204710.16396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204710.16525: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204710.16556: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204710.16584: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204710.16606: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204710.16646: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204710.16663: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204710.16683: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204710.16701: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
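The conditional evaluations in this part of the trace show why the wpa_supplicant task ends up skipped: ansible_distribution_major_version != '6' and network_provider == "nm" both evaluate True, but __network_wpa_supplicant_required resolves to False because no 802.1x or wireless connections are defined for the interface. A guarded task of that shape is sketched below; the variable names are taken from the trace, but the exact YAML is an assumption, not the role's source:

    # Sketch only; the when-conditions mirror the evaluations logged around
    # this point, while the module and its arguments are assumed.
    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant
        state: started
        enabled: true
      when:
        - ansible_distribution_major_version != '6'
        - network_provider == "nm"
        - __network_wpa_supplicant_required | bool
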
44071 1727204710.16748: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204710.16932: variable 'network_connections' from source: include params 44071 1727204710.16941: variable 'interface' from source: play vars 44071 1727204710.16993: variable 'interface' from source: play vars 44071 1727204710.17017: Evaluated conditional (__network_wpa_supplicant_required): False 44071 1727204710.17020: when evaluation is False, skipping this task 44071 1727204710.17023: _execute() done 44071 1727204710.17026: dumping result to json 44071 1727204710.17028: done dumping result, returning 44071 1727204710.17039: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-c964-7471-000000001d38] 44071 1727204710.17053: sending task result for task 127b8e07-fff9-c964-7471-000000001d38 44071 1727204710.17149: done sending task result for task 127b8e07-fff9-c964-7471-000000001d38 44071 1727204710.17152: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 44071 1727204710.17212: no more pending results, returning what we have 44071 1727204710.17217: results queue empty 44071 1727204710.17218: checking for any_errors_fatal 44071 1727204710.17242: done checking for any_errors_fatal 44071 1727204710.17243: checking for max_fail_percentage 44071 1727204710.17245: done checking for max_fail_percentage 44071 1727204710.17246: checking to see if all hosts have failed and the running result is not ok 44071 1727204710.17247: done checking to see if all hosts have failed 44071 1727204710.17248: getting the remaining hosts for this loop 44071 1727204710.17249: done getting the remaining hosts for this loop 44071 1727204710.17254: getting the next task for host managed-node2 44071 1727204710.17262: done getting next task for host managed-node2 44071 1727204710.17269: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204710.17274: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204710.17299: getting variables 44071 1727204710.17300: in VariableManager get_vars() 44071 1727204710.17344: Calling all_inventory to load vars for managed-node2 44071 1727204710.17347: Calling groups_inventory to load vars for managed-node2 44071 1727204710.17349: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204710.17359: Calling all_plugins_play to load vars for managed-node2 44071 1727204710.17361: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204710.17364: Calling groups_plugins_play to load vars for managed-node2 44071 1727204710.18533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204710.19754: done with get_vars() 44071 1727204710.19787: done getting variables 44071 1727204710.19837: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:05:10 -0400 (0:00:00.080) 0:02:02.515 ***** 44071 1727204710.19870: entering _queue_task() for managed-node2/service 44071 1727204710.20179: worker is 1 (out of 1 available) 44071 1727204710.20195: exiting _queue_task() for managed-node2/service 44071 1727204710.20211: done queuing things up, now waiting for results queue to drain 44071 1727204710.20213: waiting for pending results... 44071 1727204710.20431: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204710.20562: in run() - task 127b8e07-fff9-c964-7471-000000001d39 44071 1727204710.20578: variable 'ansible_search_path' from source: unknown 44071 1727204710.20582: variable 'ansible_search_path' from source: unknown 44071 1727204710.20618: calling self._execute() 44071 1727204710.20708: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204710.20712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204710.20721: variable 'omit' from source: magic vars 44071 1727204710.21057: variable 'ansible_distribution_major_version' from source: facts 44071 1727204710.21070: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204710.21166: variable 'network_provider' from source: set_fact 44071 1727204710.21171: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204710.21174: when evaluation is False, skipping this task 44071 1727204710.21178: _execute() done 44071 1727204710.21180: dumping result to json 44071 1727204710.21183: done dumping result, returning 44071 1727204710.21192: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-c964-7471-000000001d39] 44071 1727204710.21197: sending task result for task 127b8e07-fff9-c964-7471-000000001d39 44071 1727204710.21307: done sending task result for task 127b8e07-fff9-c964-7471-000000001d39 44071 1727204710.21310: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 
1727204710.21363: no more pending results, returning what we have 44071 1727204710.21369: results queue empty 44071 1727204710.21370: checking for any_errors_fatal 44071 1727204710.21382: done checking for any_errors_fatal 44071 1727204710.21383: checking for max_fail_percentage 44071 1727204710.21385: done checking for max_fail_percentage 44071 1727204710.21386: checking to see if all hosts have failed and the running result is not ok 44071 1727204710.21387: done checking to see if all hosts have failed 44071 1727204710.21387: getting the remaining hosts for this loop 44071 1727204710.21389: done getting the remaining hosts for this loop 44071 1727204710.21394: getting the next task for host managed-node2 44071 1727204710.21404: done getting next task for host managed-node2 44071 1727204710.21409: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204710.21415: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204710.21444: getting variables 44071 1727204710.21446: in VariableManager get_vars() 44071 1727204710.21497: Calling all_inventory to load vars for managed-node2 44071 1727204710.21500: Calling groups_inventory to load vars for managed-node2 44071 1727204710.21502: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204710.21514: Calling all_plugins_play to load vars for managed-node2 44071 1727204710.21517: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204710.21520: Calling groups_plugins_play to load vars for managed-node2 44071 1727204710.22557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204710.23807: done with get_vars() 44071 1727204710.23838: done getting variables 44071 1727204710.23895: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:05:10 -0400 (0:00:00.040) 0:02:02.555 ***** 44071 1727204710.23925: entering _queue_task() for managed-node2/copy 44071 1727204710.24236: worker is 1 (out of 1 available) 44071 1727204710.24252: exiting _queue_task() for managed-node2/copy 44071 1727204710.24267: done queuing things up, now waiting for results queue to drain 44071 1727204710.24269: waiting for pending results... 44071 1727204710.24498: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204710.24618: in run() - task 127b8e07-fff9-c964-7471-000000001d3a 44071 1727204710.24630: variable 'ansible_search_path' from source: unknown 44071 1727204710.24634: variable 'ansible_search_path' from source: unknown 44071 1727204710.24675: calling self._execute() 44071 1727204710.24768: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204710.24772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204710.24782: variable 'omit' from source: magic vars 44071 1727204710.25125: variable 'ansible_distribution_major_version' from source: facts 44071 1727204710.25137: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204710.25234: variable 'network_provider' from source: set_fact 44071 1727204710.25242: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204710.25245: when evaluation is False, skipping this task 44071 1727204710.25248: _execute() done 44071 1727204710.25250: dumping result to json 44071 1727204710.25255: done dumping result, returning 44071 1727204710.25264: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-c964-7471-000000001d3a] 44071 1727204710.25274: sending task result for task 127b8e07-fff9-c964-7471-000000001d3a 44071 1727204710.25382: done sending task result for task 127b8e07-fff9-c964-7471-000000001d3a 44071 1727204710.25387: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 44071 1727204710.25445: no more pending results, returning what we have 44071 1727204710.25449: results queue empty 44071 1727204710.25450: checking for any_errors_fatal 44071 1727204710.25459: done checking for any_errors_fatal 44071 1727204710.25459: checking for max_fail_percentage 44071 1727204710.25461: done checking for max_fail_percentage 44071 1727204710.25462: checking to see if all hosts have failed and the running result is not ok 44071 1727204710.25463: done checking to see if all hosts have failed 44071 1727204710.25464: getting the remaining hosts for this loop 44071 1727204710.25468: done getting the remaining hosts for this loop 44071 1727204710.25474: getting the next task for host managed-node2 44071 1727204710.25483: done getting next task for host managed-node2 44071 1727204710.25488: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204710.25494: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204710.25523: getting variables 44071 1727204710.25525: in VariableManager get_vars() 44071 1727204710.25575: Calling all_inventory to load vars for managed-node2 44071 1727204710.25578: Calling groups_inventory to load vars for managed-node2 44071 1727204710.25581: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204710.25592: Calling all_plugins_play to load vars for managed-node2 44071 1727204710.25595: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204710.25598: Calling groups_plugins_play to load vars for managed-node2 44071 1727204710.26780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204710.28023: done with get_vars() 44071 1727204710.28056: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:05:10 -0400 (0:00:00.042) 0:02:02.597 ***** 44071 1727204710.28134: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204710.28444: worker is 1 (out of 1 available) 44071 1727204710.28460: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204710.28477: done queuing things up, now waiting for results queue to drain 44071 1727204710.28479: waiting for pending results... 44071 1727204710.28695: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204710.28835: in run() - task 127b8e07-fff9-c964-7471-000000001d3b 44071 1727204710.28848: variable 'ansible_search_path' from source: unknown 44071 1727204710.28851: variable 'ansible_search_path' from source: unknown 44071 1727204710.28889: calling self._execute() 44071 1727204710.28981: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204710.28986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204710.28996: variable 'omit' from source: magic vars 44071 1727204710.29324: variable 'ansible_distribution_major_version' from source: facts 44071 1727204710.29337: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204710.29341: variable 'omit' from source: magic vars 44071 1727204710.29400: variable 'omit' from source: magic vars 44071 1727204710.29536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204710.31280: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204710.31336: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204710.31367: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204710.31394: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204710.31414: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204710.31493: variable 'network_provider' from source: set_fact 44071 1727204710.31602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 44071 1727204710.31626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204710.31647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204710.31681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204710.31692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204710.31751: variable 'omit' from source: magic vars 44071 1727204710.31842: variable 'omit' from source: magic vars 44071 1727204710.31924: variable 'network_connections' from source: include params 44071 1727204710.31938: variable 'interface' from source: play vars 44071 1727204710.31985: variable 'interface' from source: play vars 44071 1727204710.32103: variable 'omit' from source: magic vars 44071 1727204710.32113: variable '__lsr_ansible_managed' from source: task vars 44071 1727204710.32158: variable '__lsr_ansible_managed' from source: task vars 44071 1727204710.32320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 44071 1727204710.32479: Loaded config def from plugin (lookup/template) 44071 1727204710.32484: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 44071 1727204710.32506: File lookup term: get_ansible_managed.j2 44071 1727204710.32509: variable 'ansible_search_path' from source: unknown 44071 1727204710.32516: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 44071 1727204710.32529: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 44071 1727204710.32546: variable 'ansible_search_path' from source: unknown 44071 1727204710.37299: variable 'ansible_managed' from source: unknown 44071 1727204710.37425: variable 'omit' from source: magic vars 44071 1727204710.37452: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204710.37476: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204710.37492: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204710.37529: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204710.37535: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204710.37543: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204710.37546: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204710.37551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204710.37624: Set connection var ansible_connection to ssh 44071 1727204710.37638: Set connection var ansible_timeout to 10 44071 1727204710.37642: Set connection var ansible_pipelining to False 44071 1727204710.37644: Set connection var ansible_shell_type to sh 44071 1727204710.37649: Set connection var ansible_shell_executable to /bin/sh 44071 1727204710.37656: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204710.37678: variable 'ansible_shell_executable' from source: unknown 44071 1727204710.37681: variable 'ansible_connection' from source: unknown 44071 1727204710.37684: variable 'ansible_module_compression' from source: unknown 44071 1727204710.37686: variable 'ansible_shell_type' from source: unknown 44071 1727204710.37689: variable 'ansible_shell_executable' from source: unknown 44071 1727204710.37691: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204710.37695: variable 'ansible_pipelining' from source: unknown 44071 1727204710.37697: variable 'ansible_timeout' from source: unknown 44071 1727204710.37702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204710.37816: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204710.37830: variable 'omit' from source: magic vars 44071 1727204710.37836: starting attempt loop 44071 1727204710.37839: running the handler 44071 1727204710.37853: _low_level_execute_command(): starting 44071 1727204710.37856: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204710.38403: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204710.38409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204710.38412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204710.38472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204710.38480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204710.38482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204710.38556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204710.40332: stdout chunk (state=3): >>>/root <<< 44071 1727204710.40440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204710.40508: stderr chunk (state=3): >>><<< 44071 1727204710.40512: stdout chunk (state=3): >>><<< 44071 1727204710.40531: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204710.40545: _low_level_execute_command(): starting 44071 1727204710.40551: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204710.4053261-50943-76146321987978 `" && echo ansible-tmp-1727204710.4053261-50943-76146321987978="` echo /root/.ansible/tmp/ansible-tmp-1727204710.4053261-50943-76146321987978 `" ) && sleep 0' 44071 1727204710.41070: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204710.41075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204710.41078: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204710.41080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 
10.31.47.73 debug2: match found <<< 44071 1727204710.41083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204710.41138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204710.41141: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204710.41218: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204710.43222: stdout chunk (state=3): >>>ansible-tmp-1727204710.4053261-50943-76146321987978=/root/.ansible/tmp/ansible-tmp-1727204710.4053261-50943-76146321987978 <<< 44071 1727204710.43333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204710.43400: stderr chunk (state=3): >>><<< 44071 1727204710.43404: stdout chunk (state=3): >>><<< 44071 1727204710.43419: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204710.4053261-50943-76146321987978=/root/.ansible/tmp/ansible-tmp-1727204710.4053261-50943-76146321987978 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204710.43467: variable 'ansible_module_compression' from source: unknown 44071 1727204710.43507: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 44071 1727204710.43539: variable 'ansible_facts' from source: unknown 44071 1727204710.43604: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204710.4053261-50943-76146321987978/AnsiballZ_network_connections.py 44071 1727204710.43718: Sending initial data 44071 1727204710.43731: Sent initial data (167 bytes) 44071 1727204710.44245: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204710.44250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address 
debug1: re-parsing configuration <<< 44071 1727204710.44257: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204710.44260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204710.44315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204710.44323: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204710.44326: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204710.44394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204710.46001: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204710.46070: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204710.46137: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpz3yo19jn /root/.ansible/tmp/ansible-tmp-1727204710.4053261-50943-76146321987978/AnsiballZ_network_connections.py <<< 44071 1727204710.46141: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204710.4053261-50943-76146321987978/AnsiballZ_network_connections.py" <<< 44071 1727204710.46201: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpz3yo19jn" to remote "/root/.ansible/tmp/ansible-tmp-1727204710.4053261-50943-76146321987978/AnsiballZ_network_connections.py" <<< 44071 1727204710.46205: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204710.4053261-50943-76146321987978/AnsiballZ_network_connections.py" <<< 44071 1727204710.47060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204710.47137: stderr chunk (state=3): >>><<< 44071 1727204710.47141: stdout chunk (state=3): >>><<< 44071 1727204710.47164: done transferring module to remote 44071 1727204710.47178: _low_level_execute_command(): starting 44071 1727204710.47184: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204710.4053261-50943-76146321987978/ /root/.ansible/tmp/ansible-tmp-1727204710.4053261-50943-76146321987978/AnsiballZ_network_connections.py && sleep 0' 44071 1727204710.47682: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204710.47686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204710.47689: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204710.47691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204710.47747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204710.47755: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204710.47824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204710.49666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204710.49723: stderr chunk (state=3): >>><<< 44071 1727204710.49727: stdout chunk (state=3): >>><<< 44071 1727204710.49746: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204710.49750: _low_level_execute_command(): starting 44071 1727204710.49755: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204710.4053261-50943-76146321987978/AnsiballZ_network_connections.py && sleep 0' 44071 1727204710.50260: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204710.50269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204710.50273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204710.50331: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204710.50341: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204710.50343: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204710.50417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204710.90629: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 44071 1727204710.93730: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204710.93740: stdout chunk (state=3): >>><<< 44071 1727204710.93743: stderr chunk (state=3): >>><<< 44071 1727204710.93768: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
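For orientation, the module_args echoed in the result above are the rendered form of the role's connection inputs. The sketch below is a hedged reconstruction of the play-side variables that would produce exactly this invocation; the variable names follow the sources reported earlier in the trace (interface from play vars, network_connections from include params), and the literal value of interface is inferred from the resulting connection name rather than quoted from the test play.

# Hedged reconstruction of the inputs behind the module_args echoed above;
# not copied verbatim from the test playbook.
interface: statebr            # play var (the trace reports it from "play vars")
network_connections:          # passed to the role as include params
  - name: "{{ interface }}"
    persistent_state: absent  # remove the persistent profile, if any
    state: down               # and deactivate the runtime connection
# provider "nm" is not part of these inputs; the role resolved network_provider
# via set_fact earlier in the run and added it to the module arguments itself.

The stderr line captured in the result ("no connection matches 'statebr' to delete") simply records that no existing profile named statebr was found when the teardown was attempted.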
44071 1727204710.93838: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204710.4053261-50943-76146321987978/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204710.93842: _low_level_execute_command(): starting 44071 1727204710.93851: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204710.4053261-50943-76146321987978/ > /dev/null 2>&1 && sleep 0' 44071 1727204710.94559: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204710.94581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204710.94599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204710.94621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204710.94738: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204710.94744: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204710.94746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204710.94758: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204710.94781: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204710.94888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204710.97055: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204710.97059: stdout chunk (state=3): >>><<< 44071 1727204710.97062: stderr chunk (state=3): >>><<< 44071 1727204710.97141: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204710.97145: handler run complete 44071 1727204710.97148: attempt loop complete, returning result 44071 1727204710.97150: _execute() done 44071 1727204710.97152: dumping result to json 44071 1727204710.97154: done dumping result, returning 44071 1727204710.97163: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-c964-7471-000000001d3b] 44071 1727204710.97173: sending task result for task 127b8e07-fff9-c964-7471-000000001d3b changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete 44071 1727204710.97416: no more pending results, returning what we have 44071 1727204710.97423: results queue empty 44071 1727204710.97424: checking for any_errors_fatal 44071 1727204710.97429: done checking for any_errors_fatal 44071 1727204710.97430: checking for max_fail_percentage 44071 1727204710.97432: done checking for max_fail_percentage 44071 1727204710.97434: checking to see if all hosts have failed and the running result is not ok 44071 1727204710.97435: done checking to see if all hosts have failed 44071 1727204710.97436: getting the remaining hosts for this loop 44071 1727204710.97437: done getting the remaining hosts for this loop 44071 1727204710.97441: getting the next task for host managed-node2 44071 1727204710.97450: done getting next task for host managed-node2 44071 1727204710.97454: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204710.97459: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204710.97470: done sending task result for task 127b8e07-fff9-c964-7471-000000001d3b 44071 1727204710.97473: WORKER PROCESS EXITING 44071 1727204710.97481: getting variables 44071 1727204710.97483: in VariableManager get_vars() 44071 1727204710.97527: Calling all_inventory to load vars for managed-node2 44071 1727204710.97530: Calling groups_inventory to load vars for managed-node2 44071 1727204710.97532: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204710.97545: Calling all_plugins_play to load vars for managed-node2 44071 1727204710.97548: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204710.97551: Calling groups_plugins_play to load vars for managed-node2 44071 1727204710.98762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204711.00631: done with get_vars() 44071 1727204711.00668: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:05:11 -0400 (0:00:00.726) 0:02:03.323 ***** 44071 1727204711.00745: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204711.01056: worker is 1 (out of 1 available) 44071 1727204711.01076: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204711.01091: done queuing things up, now waiting for results queue to drain 44071 1727204711.01093: waiting for pending results... 
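The task queued above is gated on the role's network_state variable; as the records that follow show, it is skipped because network_state is still the role default ({}). A minimal sketch of that gate is given below; the module's own parameters are not visible in this trace, and whether the version check sits on the task itself or on an enclosing block is an assumption.

# Sketch of the conditions evaluated in the records below; hedged, since only
# the evaluated conditionals appear in the log, not the task source.
- name: Configure networking state
  fedora.linux_system_roles.network_state:
    # arguments not shown in the log
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}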
44071 1727204711.01308: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204711.01437: in run() - task 127b8e07-fff9-c964-7471-000000001d3c 44071 1727204711.01445: variable 'ansible_search_path' from source: unknown 44071 1727204711.01448: variable 'ansible_search_path' from source: unknown 44071 1727204711.01484: calling self._execute() 44071 1727204711.01574: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204711.01580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204711.01589: variable 'omit' from source: magic vars 44071 1727204711.01925: variable 'ansible_distribution_major_version' from source: facts 44071 1727204711.01938: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204711.02038: variable 'network_state' from source: role '' defaults 44071 1727204711.02045: Evaluated conditional (network_state != {}): False 44071 1727204711.02048: when evaluation is False, skipping this task 44071 1727204711.02051: _execute() done 44071 1727204711.02054: dumping result to json 44071 1727204711.02058: done dumping result, returning 44071 1727204711.02068: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-c964-7471-000000001d3c] 44071 1727204711.02073: sending task result for task 127b8e07-fff9-c964-7471-000000001d3c 44071 1727204711.02178: done sending task result for task 127b8e07-fff9-c964-7471-000000001d3c 44071 1727204711.02181: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204711.02242: no more pending results, returning what we have 44071 1727204711.02246: results queue empty 44071 1727204711.02247: checking for any_errors_fatal 44071 1727204711.02260: done checking for any_errors_fatal 44071 1727204711.02260: checking for max_fail_percentage 44071 1727204711.02262: done checking for max_fail_percentage 44071 1727204711.02263: checking to see if all hosts have failed and the running result is not ok 44071 1727204711.02264: done checking to see if all hosts have failed 44071 1727204711.02264: getting the remaining hosts for this loop 44071 1727204711.02268: done getting the remaining hosts for this loop 44071 1727204711.02273: getting the next task for host managed-node2 44071 1727204711.02282: done getting next task for host managed-node2 44071 1727204711.02287: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204711.02293: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204711.02319: getting variables 44071 1727204711.02321: in VariableManager get_vars() 44071 1727204711.02375: Calling all_inventory to load vars for managed-node2 44071 1727204711.02378: Calling groups_inventory to load vars for managed-node2 44071 1727204711.02381: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204711.02392: Calling all_plugins_play to load vars for managed-node2 44071 1727204711.02394: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204711.02397: Calling groups_plugins_play to load vars for managed-node2 44071 1727204711.04170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204711.05618: done with get_vars() 44071 1727204711.05647: done getting variables 44071 1727204711.05700: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:05:11 -0400 (0:00:00.049) 0:02:03.373 ***** 44071 1727204711.05729: entering _queue_task() for managed-node2/debug 44071 1727204711.06050: worker is 1 (out of 1 available) 44071 1727204711.06072: exiting _queue_task() for managed-node2/debug 44071 1727204711.06087: done queuing things up, now waiting for results queue to drain 44071 1727204711.06089: waiting for pending results... 
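The task queued above is a plain debug over the result registered by the connection-profile task; the reconstruction below is consistent with the ok: output printed in the following records, though the trace shows the rendered result rather than the task source, so treat it as a sketch.

- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines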
44071 1727204711.06399: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204711.06674: in run() - task 127b8e07-fff9-c964-7471-000000001d3d 44071 1727204711.06679: variable 'ansible_search_path' from source: unknown 44071 1727204711.06682: variable 'ansible_search_path' from source: unknown 44071 1727204711.06686: calling self._execute() 44071 1727204711.06741: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204711.06763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204711.06783: variable 'omit' from source: magic vars 44071 1727204711.07221: variable 'ansible_distribution_major_version' from source: facts 44071 1727204711.07244: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204711.07256: variable 'omit' from source: magic vars 44071 1727204711.07332: variable 'omit' from source: magic vars 44071 1727204711.07379: variable 'omit' from source: magic vars 44071 1727204711.07429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204711.07478: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204711.07507: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204711.07531: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204711.07557: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204711.07599: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204711.07609: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204711.07620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204711.07750: Set connection var ansible_connection to ssh 44071 1727204711.07763: Set connection var ansible_timeout to 10 44071 1727204711.07777: Set connection var ansible_pipelining to False 44071 1727204711.07788: Set connection var ansible_shell_type to sh 44071 1727204711.07799: Set connection var ansible_shell_executable to /bin/sh 44071 1727204711.07870: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204711.07874: variable 'ansible_shell_executable' from source: unknown 44071 1727204711.07876: variable 'ansible_connection' from source: unknown 44071 1727204711.07879: variable 'ansible_module_compression' from source: unknown 44071 1727204711.07881: variable 'ansible_shell_type' from source: unknown 44071 1727204711.07883: variable 'ansible_shell_executable' from source: unknown 44071 1727204711.07885: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204711.07887: variable 'ansible_pipelining' from source: unknown 44071 1727204711.07890: variable 'ansible_timeout' from source: unknown 44071 1727204711.07893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204711.08051: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 
1727204711.08073: variable 'omit' from source: magic vars 44071 1727204711.08083: starting attempt loop 44071 1727204711.08091: running the handler 44071 1727204711.08252: variable '__network_connections_result' from source: set_fact 44071 1727204711.08321: handler run complete 44071 1727204711.08474: attempt loop complete, returning result 44071 1727204711.08477: _execute() done 44071 1727204711.08479: dumping result to json 44071 1727204711.08481: done dumping result, returning 44071 1727204711.08483: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-c964-7471-000000001d3d] 44071 1727204711.08485: sending task result for task 127b8e07-fff9-c964-7471-000000001d3d ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } 44071 1727204711.08674: no more pending results, returning what we have 44071 1727204711.08678: results queue empty 44071 1727204711.08679: checking for any_errors_fatal 44071 1727204711.08686: done checking for any_errors_fatal 44071 1727204711.08686: checking for max_fail_percentage 44071 1727204711.08688: done checking for max_fail_percentage 44071 1727204711.08689: checking to see if all hosts have failed and the running result is not ok 44071 1727204711.08689: done checking to see if all hosts have failed 44071 1727204711.08690: getting the remaining hosts for this loop 44071 1727204711.08692: done getting the remaining hosts for this loop 44071 1727204711.08696: getting the next task for host managed-node2 44071 1727204711.08703: done getting next task for host managed-node2 44071 1727204711.08707: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204711.08712: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204711.08725: getting variables 44071 1727204711.08726: in VariableManager get_vars() 44071 1727204711.08781: Calling all_inventory to load vars for managed-node2 44071 1727204711.08784: Calling groups_inventory to load vars for managed-node2 44071 1727204711.08786: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204711.08792: done sending task result for task 127b8e07-fff9-c964-7471-000000001d3d 44071 1727204711.08796: WORKER PROCESS EXITING 44071 1727204711.08805: Calling all_plugins_play to load vars for managed-node2 44071 1727204711.08808: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204711.08811: Calling groups_plugins_play to load vars for managed-node2 44071 1727204711.09868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204711.11736: done with get_vars() 44071 1727204711.11773: done getting variables 44071 1727204711.11826: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:05:11 -0400 (0:00:00.061) 0:02:03.435 ***** 44071 1727204711.11863: entering _queue_task() for managed-node2/debug 44071 1727204711.12163: worker is 1 (out of 1 available) 44071 1727204711.12180: exiting _queue_task() for managed-node2/debug 44071 1727204711.12195: done queuing things up, now waiting for results queue to drain 44071 1727204711.12197: waiting for pending results... 
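For reference, a minimal sketch of the 'Show stderr messages for the network_connections' task whose result was returned above, reconstructed from the trace: the task name and the printed variable appear verbatim in the log, while the exact YAML in the role's tasks file is an assumption. The conditional ansible_distribution_major_version != '6' evaluated just before the handler ran is most likely inherited from an enclosing block in the test play rather than set on this task.

- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines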
44071 1727204711.12409: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204711.12524: in run() - task 127b8e07-fff9-c964-7471-000000001d3e 44071 1727204711.12541: variable 'ansible_search_path' from source: unknown 44071 1727204711.12545: variable 'ansible_search_path' from source: unknown 44071 1727204711.12581: calling self._execute() 44071 1727204711.12671: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204711.12677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204711.12686: variable 'omit' from source: magic vars 44071 1727204711.13008: variable 'ansible_distribution_major_version' from source: facts 44071 1727204711.13020: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204711.13026: variable 'omit' from source: magic vars 44071 1727204711.13073: variable 'omit' from source: magic vars 44071 1727204711.13104: variable 'omit' from source: magic vars 44071 1727204711.13142: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204711.13174: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204711.13194: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204711.13211: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204711.13224: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204711.13250: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204711.13254: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204711.13256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204711.13409: Set connection var ansible_connection to ssh 44071 1727204711.13412: Set connection var ansible_timeout to 10 44071 1727204711.13415: Set connection var ansible_pipelining to False 44071 1727204711.13419: Set connection var ansible_shell_type to sh 44071 1727204711.13422: Set connection var ansible_shell_executable to /bin/sh 44071 1727204711.13424: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204711.13426: variable 'ansible_shell_executable' from source: unknown 44071 1727204711.13428: variable 'ansible_connection' from source: unknown 44071 1727204711.13431: variable 'ansible_module_compression' from source: unknown 44071 1727204711.13436: variable 'ansible_shell_type' from source: unknown 44071 1727204711.13439: variable 'ansible_shell_executable' from source: unknown 44071 1727204711.13441: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204711.13443: variable 'ansible_pipelining' from source: unknown 44071 1727204711.13446: variable 'ansible_timeout' from source: unknown 44071 1727204711.13448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204711.13517: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 
1727204711.13526: variable 'omit' from source: magic vars 44071 1727204711.13531: starting attempt loop 44071 1727204711.13537: running the handler 44071 1727204711.13583: variable '__network_connections_result' from source: set_fact 44071 1727204711.13652: variable '__network_connections_result' from source: set_fact 44071 1727204711.13745: handler run complete 44071 1727204711.13764: attempt loop complete, returning result 44071 1727204711.13769: _execute() done 44071 1727204711.13772: dumping result to json 44071 1727204711.13778: done dumping result, returning 44071 1727204711.13788: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-c964-7471-000000001d3e] 44071 1727204711.13790: sending task result for task 127b8e07-fff9-c964-7471-000000001d3e ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "stderr_lines": [ "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } } 44071 1727204711.14010: no more pending results, returning what we have 44071 1727204711.14013: results queue empty 44071 1727204711.14014: checking for any_errors_fatal 44071 1727204711.14022: done checking for any_errors_fatal 44071 1727204711.14023: checking for max_fail_percentage 44071 1727204711.14024: done checking for max_fail_percentage 44071 1727204711.14025: checking to see if all hosts have failed and the running result is not ok 44071 1727204711.14026: done checking to see if all hosts have failed 44071 1727204711.14026: getting the remaining hosts for this loop 44071 1727204711.14028: done getting the remaining hosts for this loop 44071 1727204711.14032: getting the next task for host managed-node2 44071 1727204711.14042: done getting next task for host managed-node2 44071 1727204711.14046: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204711.14050: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204711.14062: getting variables 44071 1727204711.14063: in VariableManager get_vars() 44071 1727204711.14110: Calling all_inventory to load vars for managed-node2 44071 1727204711.14113: Calling groups_inventory to load vars for managed-node2 44071 1727204711.14115: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204711.14121: done sending task result for task 127b8e07-fff9-c964-7471-000000001d3e 44071 1727204711.14130: WORKER PROCESS EXITING 44071 1727204711.14142: Calling all_plugins_play to load vars for managed-node2 44071 1727204711.14145: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204711.14148: Calling groups_plugins_play to load vars for managed-node2 44071 1727204711.15370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204711.16594: done with get_vars() 44071 1727204711.16623: done getting variables 44071 1727204711.16679: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:05:11 -0400 (0:00:00.048) 0:02:03.483 ***** 44071 1727204711.16708: entering _queue_task() for managed-node2/debug 44071 1727204711.17017: worker is 1 (out of 1 available) 44071 1727204711.17036: exiting _queue_task() for managed-node2/debug 44071 1727204711.17051: done queuing things up, now waiting for results queue to drain 44071 1727204711.17053: waiting for pending results... 
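A hedged reconstruction of the debug task at roles/network/tasks/main.yml:181 traced above, together with the role input implied by the module_args recorded in its result; the connection list is copied from the result, while the variable names network_connections and network_provider are assumptions about how the calling play supplied them.

# Debug task (name and printed variable taken from the trace)
- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result

# Input implied by module_args in the recorded result (assumed to be set by the calling play)
network_connections:
  - name: statebr
    persistent_state: absent
    state: down
network_provider: nm    # assumption: provider "nm" in module_args usually corresponds to this role variable

Note that the recorded result reports changed: true while stderr states that no connection matched 'statebr'; both values are shown verbatim in the result above.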
44071 1727204711.17269: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204711.17382: in run() - task 127b8e07-fff9-c964-7471-000000001d3f 44071 1727204711.17399: variable 'ansible_search_path' from source: unknown 44071 1727204711.17403: variable 'ansible_search_path' from source: unknown 44071 1727204711.17438: calling self._execute() 44071 1727204711.17526: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204711.17532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204711.17541: variable 'omit' from source: magic vars 44071 1727204711.17875: variable 'ansible_distribution_major_version' from source: facts 44071 1727204711.17886: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204711.17984: variable 'network_state' from source: role '' defaults 44071 1727204711.17995: Evaluated conditional (network_state != {}): False 44071 1727204711.17998: when evaluation is False, skipping this task 44071 1727204711.18001: _execute() done 44071 1727204711.18004: dumping result to json 44071 1727204711.18007: done dumping result, returning 44071 1727204711.18016: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-c964-7471-000000001d3f] 44071 1727204711.18021: sending task result for task 127b8e07-fff9-c964-7471-000000001d3f 44071 1727204711.18127: done sending task result for task 127b8e07-fff9-c964-7471-000000001d3f 44071 1727204711.18129: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 44071 1727204711.18203: no more pending results, returning what we have 44071 1727204711.18208: results queue empty 44071 1727204711.18209: checking for any_errors_fatal 44071 1727204711.18220: done checking for any_errors_fatal 44071 1727204711.18220: checking for max_fail_percentage 44071 1727204711.18222: done checking for max_fail_percentage 44071 1727204711.18223: checking to see if all hosts have failed and the running result is not ok 44071 1727204711.18224: done checking to see if all hosts have failed 44071 1727204711.18224: getting the remaining hosts for this loop 44071 1727204711.18226: done getting the remaining hosts for this loop 44071 1727204711.18230: getting the next task for host managed-node2 44071 1727204711.18244: done getting next task for host managed-node2 44071 1727204711.18249: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204711.18255: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204711.18280: getting variables 44071 1727204711.18281: in VariableManager get_vars() 44071 1727204711.18323: Calling all_inventory to load vars for managed-node2 44071 1727204711.18326: Calling groups_inventory to load vars for managed-node2 44071 1727204711.18328: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204711.18339: Calling all_plugins_play to load vars for managed-node2 44071 1727204711.18342: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204711.18345: Calling groups_plugins_play to load vars for managed-node2 44071 1727204711.19406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204711.20801: done with get_vars() 44071 1727204711.20825: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:05:11 -0400 (0:00:00.042) 0:02:03.525 ***** 44071 1727204711.20913: entering _queue_task() for managed-node2/ping 44071 1727204711.21227: worker is 1 (out of 1 available) 44071 1727204711.21246: exiting _queue_task() for managed-node2/ping 44071 1727204711.21261: done queuing things up, now waiting for results queue to drain 44071 1727204711.21263: waiting for pending results... 44071 1727204711.21476: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204711.21592: in run() - task 127b8e07-fff9-c964-7471-000000001d40 44071 1727204711.21611: variable 'ansible_search_path' from source: unknown 44071 1727204711.21616: variable 'ansible_search_path' from source: unknown 44071 1727204711.21652: calling self._execute() 44071 1727204711.21745: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204711.21748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204711.21757: variable 'omit' from source: magic vars 44071 1727204711.22094: variable 'ansible_distribution_major_version' from source: facts 44071 1727204711.22105: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204711.22112: variable 'omit' from source: magic vars 44071 1727204711.22163: variable 'omit' from source: magic vars 44071 1727204711.22192: variable 'omit' from source: magic vars 44071 1727204711.22229: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204711.22267: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204711.22285: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204711.22301: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204711.22312: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204711.22337: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204711.22342: variable 'ansible_host' from source: host vars for 
'managed-node2' 44071 1727204711.22344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204711.22427: Set connection var ansible_connection to ssh 44071 1727204711.22431: Set connection var ansible_timeout to 10 44071 1727204711.22441: Set connection var ansible_pipelining to False 44071 1727204711.22446: Set connection var ansible_shell_type to sh 44071 1727204711.22453: Set connection var ansible_shell_executable to /bin/sh 44071 1727204711.22459: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204711.22485: variable 'ansible_shell_executable' from source: unknown 44071 1727204711.22488: variable 'ansible_connection' from source: unknown 44071 1727204711.22492: variable 'ansible_module_compression' from source: unknown 44071 1727204711.22494: variable 'ansible_shell_type' from source: unknown 44071 1727204711.22496: variable 'ansible_shell_executable' from source: unknown 44071 1727204711.22499: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204711.22503: variable 'ansible_pipelining' from source: unknown 44071 1727204711.22505: variable 'ansible_timeout' from source: unknown 44071 1727204711.22510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204711.22695: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204711.22704: variable 'omit' from source: magic vars 44071 1727204711.22707: starting attempt loop 44071 1727204711.22710: running the handler 44071 1727204711.22724: _low_level_execute_command(): starting 44071 1727204711.22730: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204711.23311: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204711.23316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44071 1727204711.23321: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204711.23323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204711.23375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204711.23380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204711.23461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204711.25240: stdout chunk (state=3): >>>/root <<< 44071 1727204711.25345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 44071 1727204711.25415: stderr chunk (state=3): >>><<< 44071 1727204711.25418: stdout chunk (state=3): >>><<< 44071 1727204711.25442: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204711.25457: _low_level_execute_command(): starting 44071 1727204711.25465: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204711.2544167-50974-194844500556673 `" && echo ansible-tmp-1727204711.2544167-50974-194844500556673="` echo /root/.ansible/tmp/ansible-tmp-1727204711.2544167-50974-194844500556673 `" ) && sleep 0' 44071 1727204711.25964: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204711.25995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204711.25999: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204711.26009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204711.26012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204711.26064: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204711.26071: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204711.26155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204711.28201: stdout chunk (state=3): >>>ansible-tmp-1727204711.2544167-50974-194844500556673=/root/.ansible/tmp/ansible-tmp-1727204711.2544167-50974-194844500556673 <<< 
44071 1727204711.28303: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204711.28371: stderr chunk (state=3): >>><<< 44071 1727204711.28375: stdout chunk (state=3): >>><<< 44071 1727204711.28393: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204711.2544167-50974-194844500556673=/root/.ansible/tmp/ansible-tmp-1727204711.2544167-50974-194844500556673 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204711.28449: variable 'ansible_module_compression' from source: unknown 44071 1727204711.28487: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 44071 1727204711.28519: variable 'ansible_facts' from source: unknown 44071 1727204711.28583: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204711.2544167-50974-194844500556673/AnsiballZ_ping.py 44071 1727204711.28698: Sending initial data 44071 1727204711.28702: Sent initial data (153 bytes) 44071 1727204711.29228: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204711.29234: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204711.29290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204711.29295: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204711.29370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204711.31041: 
stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204711.31103: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204711.31186: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp416pwuh2 /root/.ansible/tmp/ansible-tmp-1727204711.2544167-50974-194844500556673/AnsiballZ_ping.py <<< 44071 1727204711.31189: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204711.2544167-50974-194844500556673/AnsiballZ_ping.py" <<< 44071 1727204711.31246: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp416pwuh2" to remote "/root/.ansible/tmp/ansible-tmp-1727204711.2544167-50974-194844500556673/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204711.2544167-50974-194844500556673/AnsiballZ_ping.py" <<< 44071 1727204711.32023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204711.32101: stderr chunk (state=3): >>><<< 44071 1727204711.32105: stdout chunk (state=3): >>><<< 44071 1727204711.32124: done transferring module to remote 44071 1727204711.32136: _low_level_execute_command(): starting 44071 1727204711.32146: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204711.2544167-50974-194844500556673/ /root/.ansible/tmp/ansible-tmp-1727204711.2544167-50974-194844500556673/AnsiballZ_ping.py && sleep 0' 44071 1727204711.32639: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204711.32643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204711.32646: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204711.32648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204711.32650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204711.32700: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204711.32714: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204711.32781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204711.34747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204711.34772: stderr chunk (state=3): >>><<< 44071 1727204711.34786: stdout chunk (state=3): >>><<< 44071 1727204711.34822: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204711.34825: _low_level_execute_command(): starting 44071 1727204711.34828: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204711.2544167-50974-194844500556673/AnsiballZ_ping.py && sleep 0' 44071 1727204711.35557: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204711.35579: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204711.35601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204711.35622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204711.35641: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204711.35683: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204711.35762: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204711.35787: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204711.35844: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 44071 1727204711.35929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204711.52610: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 44071 1727204711.54399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204711.54404: stdout chunk (state=3): >>><<< 44071 1727204711.54406: stderr chunk (state=3): >>><<< 44071 1727204711.54426: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
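The low-level command sequence above is Ansible's standard module execution flow over the multiplexed SSH connection: probe the remote home directory, create a per-task temporary directory, sftp the AnsiballZ_ping.py wrapper into it, mark it executable, run it with the remote /usr/bin/python3.12, and (below) remove the temporary directory again. The task driving it is the role's connectivity re-test at roles/network/tasks/main.yml:192; a minimal sketch, with the module and task name taken from the trace and the exact YAML assumed:

- name: Re-test connectivity
  ping:    # returned {"ping": "pong"} in the stdout chunk above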
44071 1727204711.54559: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204711.2544167-50974-194844500556673/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204711.54563: _low_level_execute_command(): starting 44071 1727204711.54569: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204711.2544167-50974-194844500556673/ > /dev/null 2>&1 && sleep 0' 44071 1727204711.55188: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204711.55195: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204711.55206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204711.55223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204711.55235: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204711.55246: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204711.55255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204711.55271: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204711.55279: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204711.55287: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44071 1727204711.55388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204711.55391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204711.55407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204711.55419: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204711.55442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204711.55550: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204711.57626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204711.57643: stdout chunk (state=3): >>><<< 44071 1727204711.57657: stderr chunk (state=3): >>><<< 44071 1727204711.57685: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204711.57698: handler run complete 44071 1727204711.57730: attempt loop complete, returning result 44071 1727204711.57742: _execute() done 44071 1727204711.57828: dumping result to json 44071 1727204711.57831: done dumping result, returning 44071 1727204711.57837: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-c964-7471-000000001d40] 44071 1727204711.57839: sending task result for task 127b8e07-fff9-c964-7471-000000001d40 44071 1727204711.57922: done sending task result for task 127b8e07-fff9-c964-7471-000000001d40 44071 1727204711.57925: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 44071 1727204711.58126: no more pending results, returning what we have 44071 1727204711.58131: results queue empty 44071 1727204711.58132: checking for any_errors_fatal 44071 1727204711.58144: done checking for any_errors_fatal 44071 1727204711.58145: checking for max_fail_percentage 44071 1727204711.58147: done checking for max_fail_percentage 44071 1727204711.58148: checking to see if all hosts have failed and the running result is not ok 44071 1727204711.58149: done checking to see if all hosts have failed 44071 1727204711.58150: getting the remaining hosts for this loop 44071 1727204711.58152: done getting the remaining hosts for this loop 44071 1727204711.58158: getting the next task for host managed-node2 44071 1727204711.58278: done getting next task for host managed-node2 44071 1727204711.58282: ^ task is: TASK: meta (role_complete) 44071 1727204711.58288: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 44071 1727204711.58304: getting variables 44071 1727204711.58306: in VariableManager get_vars() 44071 1727204711.58363: Calling all_inventory to load vars for managed-node2 44071 1727204711.58486: Calling groups_inventory to load vars for managed-node2 44071 1727204711.58490: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204711.58502: Calling all_plugins_play to load vars for managed-node2 44071 1727204711.58506: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204711.58509: Calling groups_plugins_play to load vars for managed-node2 44071 1727204711.60517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204711.63882: done with get_vars() 44071 1727204711.63930: done getting variables 44071 1727204711.64040: done queuing things up, now waiting for results queue to drain 44071 1727204711.64043: results queue empty 44071 1727204711.64043: checking for any_errors_fatal 44071 1727204711.64047: done checking for any_errors_fatal 44071 1727204711.64048: checking for max_fail_percentage 44071 1727204711.64049: done checking for max_fail_percentage 44071 1727204711.64050: checking to see if all hosts have failed and the running result is not ok 44071 1727204711.64051: done checking to see if all hosts have failed 44071 1727204711.64052: getting the remaining hosts for this loop 44071 1727204711.64053: done getting the remaining hosts for this loop 44071 1727204711.64055: getting the next task for host managed-node2 44071 1727204711.64062: done getting next task for host managed-node2 44071 1727204711.64066: ^ task is: TASK: Asserts 44071 1727204711.64069: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204711.64072: getting variables 44071 1727204711.64073: in VariableManager get_vars() 44071 1727204711.64088: Calling all_inventory to load vars for managed-node2 44071 1727204711.64091: Calling groups_inventory to load vars for managed-node2 44071 1727204711.64094: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204711.64100: Calling all_plugins_play to load vars for managed-node2 44071 1727204711.64102: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204711.64105: Calling groups_plugins_play to load vars for managed-node2 44071 1727204711.65941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204711.68285: done with get_vars() 44071 1727204711.68336: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Tuesday 24 September 2024 15:05:11 -0400 (0:00:00.475) 0:02:04.000 ***** 44071 1727204711.68429: entering _queue_task() for managed-node2/include_tasks 44071 1727204711.69090: worker is 1 (out of 1 available) 44071 1727204711.69105: exiting _queue_task() for managed-node2/include_tasks 44071 1727204711.69118: done queuing things up, now waiting for results queue to drain 44071 1727204711.69120: waiting for pending results... 44071 1727204711.69532: running TaskExecutor() for managed-node2/TASK: Asserts 44071 1727204711.69551: in run() - task 127b8e07-fff9-c964-7471-000000001749 44071 1727204711.69625: variable 'ansible_search_path' from source: unknown 44071 1727204711.69629: variable 'ansible_search_path' from source: unknown 44071 1727204711.69640: variable 'lsr_assert' from source: include params 44071 1727204711.69908: variable 'lsr_assert' from source: include params 44071 1727204711.70001: variable 'omit' from source: magic vars 44071 1727204711.70572: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204711.70576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204711.70579: variable 'omit' from source: magic vars 44071 1727204711.70582: variable 'ansible_distribution_major_version' from source: facts 44071 1727204711.70584: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204711.70586: variable 'item' from source: unknown 44071 1727204711.70631: variable 'item' from source: unknown 44071 1727204711.70670: variable 'item' from source: unknown 44071 1727204711.70740: variable 'item' from source: unknown 44071 1727204711.71164: dumping result to json 44071 1727204711.71170: done dumping result, returning 44071 1727204711.71172: done running TaskExecutor() for managed-node2/TASK: Asserts [127b8e07-fff9-c964-7471-000000001749] 44071 1727204711.71175: sending task result for task 127b8e07-fff9-c964-7471-000000001749 44071 1727204711.71227: done sending task result for task 127b8e07-fff9-c964-7471-000000001749 44071 1727204711.71231: WORKER PROCESS EXITING 44071 1727204711.71259: no more pending results, returning what we have 44071 1727204711.71264: in VariableManager get_vars() 44071 1727204711.71309: Calling all_inventory to load vars for managed-node2 44071 1727204711.71312: Calling groups_inventory to load vars for managed-node2 44071 1727204711.71316: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204711.71328: Calling all_plugins_play to load vars for 
managed-node2 44071 1727204711.71331: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204711.71335: Calling groups_plugins_play to load vars for managed-node2 44071 1727204711.73090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204711.75468: done with get_vars() 44071 1727204711.75495: variable 'ansible_search_path' from source: unknown 44071 1727204711.75497: variable 'ansible_search_path' from source: unknown 44071 1727204711.75541: we have included files to process 44071 1727204711.75543: generating all_blocks data 44071 1727204711.75545: done generating all_blocks data 44071 1727204711.75552: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 44071 1727204711.75553: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 44071 1727204711.75556: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 44071 1727204711.75682: in VariableManager get_vars() 44071 1727204711.75708: done with get_vars() 44071 1727204711.75830: done processing included file 44071 1727204711.75832: iterating over new_blocks loaded from include file 44071 1727204711.75833: in VariableManager get_vars() 44071 1727204711.75851: done with get_vars() 44071 1727204711.75853: filtering new block on tags 44071 1727204711.75895: done filtering new block on tags 44071 1727204711.75898: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node2 => (item=tasks/assert_profile_absent.yml) 44071 1727204711.75904: extending task lists for all hosts with included blocks 44071 1727204711.77273: done extending task lists 44071 1727204711.77275: done processing included files 44071 1727204711.77276: results queue empty 44071 1727204711.77277: checking for any_errors_fatal 44071 1727204711.77279: done checking for any_errors_fatal 44071 1727204711.77280: checking for max_fail_percentage 44071 1727204711.77281: done checking for max_fail_percentage 44071 1727204711.77282: checking to see if all hosts have failed and the running result is not ok 44071 1727204711.77283: done checking to see if all hosts have failed 44071 1727204711.77284: getting the remaining hosts for this loop 44071 1727204711.77285: done getting the remaining hosts for this loop 44071 1727204711.77288: getting the next task for host managed-node2 44071 1727204711.77293: done getting next task for host managed-node2 44071 1727204711.77295: ^ task is: TASK: Include the task 'get_profile_stat.yml' 44071 1727204711.77299: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204711.77302: getting variables 44071 1727204711.77303: in VariableManager get_vars() 44071 1727204711.77325: Calling all_inventory to load vars for managed-node2 44071 1727204711.77328: Calling groups_inventory to load vars for managed-node2 44071 1727204711.77331: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204711.77339: Calling all_plugins_play to load vars for managed-node2 44071 1727204711.77341: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204711.77345: Calling groups_plugins_play to load vars for managed-node2 44071 1727204711.80193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204711.83128: done with get_vars() 44071 1727204711.83180: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Tuesday 24 September 2024 15:05:11 -0400 (0:00:00.148) 0:02:04.149 ***** 44071 1727204711.83281: entering _queue_task() for managed-node2/include_tasks 44071 1727204711.83991: worker is 1 (out of 1 available) 44071 1727204711.84003: exiting _queue_task() for managed-node2/include_tasks 44071 1727204711.84017: done queuing things up, now waiting for results queue to drain 44071 1727204711.84019: waiting for pending results... 44071 1727204711.84196: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 44071 1727204711.84299: in run() - task 127b8e07-fff9-c964-7471-000000001e99 44071 1727204711.84575: variable 'ansible_search_path' from source: unknown 44071 1727204711.84580: variable 'ansible_search_path' from source: unknown 44071 1727204711.84585: calling self._execute() 44071 1727204711.84589: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204711.84592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204711.84596: variable 'omit' from source: magic vars 44071 1727204711.84987: variable 'ansible_distribution_major_version' from source: facts 44071 1727204711.85001: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204711.85025: _execute() done 44071 1727204711.85030: dumping result to json 44071 1727204711.85033: done dumping result, returning 44071 1727204711.85036: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [127b8e07-fff9-c964-7471-000000001e99] 44071 1727204711.85038: sending task result for task 127b8e07-fff9-c964-7471-000000001e99 44071 1727204711.85205: done sending task result for task 127b8e07-fff9-c964-7471-000000001e99 44071 1727204711.85207: WORKER PROCESS EXITING 44071 1727204711.85261: no more pending results, returning what we have 44071 1727204711.85269: in VariableManager get_vars() 44071 1727204711.85340: Calling all_inventory to load vars for managed-node2 44071 1727204711.85344: Calling groups_inventory to load vars for managed-node2 44071 1727204711.85348: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204711.85368: Calling all_plugins_play to load vars for managed-node2 44071 1727204711.85372: Calling 
groups_plugins_inventory to load vars for managed-node2 44071 1727204711.85375: Calling groups_plugins_play to load vars for managed-node2 44071 1727204711.87630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204711.90889: done with get_vars() 44071 1727204711.90935: variable 'ansible_search_path' from source: unknown 44071 1727204711.90937: variable 'ansible_search_path' from source: unknown 44071 1727204711.90950: variable 'item' from source: include params 44071 1727204711.91085: variable 'item' from source: include params 44071 1727204711.91123: we have included files to process 44071 1727204711.91124: generating all_blocks data 44071 1727204711.91126: done generating all_blocks data 44071 1727204711.91127: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 44071 1727204711.91129: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 44071 1727204711.91131: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 44071 1727204711.93007: done processing included file 44071 1727204711.93010: iterating over new_blocks loaded from include file 44071 1727204711.93011: in VariableManager get_vars() 44071 1727204711.93039: done with get_vars() 44071 1727204711.93041: filtering new block on tags 44071 1727204711.93122: done filtering new block on tags 44071 1727204711.93126: in VariableManager get_vars() 44071 1727204711.93150: done with get_vars() 44071 1727204711.93152: filtering new block on tags 44071 1727204711.93255: done filtering new block on tags 44071 1727204711.93258: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 44071 1727204711.93264: extending task lists for all hosts with included blocks 44071 1727204711.93561: done extending task lists 44071 1727204711.93563: done processing included files 44071 1727204711.93564: results queue empty 44071 1727204711.93564: checking for any_errors_fatal 44071 1727204711.93774: done checking for any_errors_fatal 44071 1727204711.93775: checking for max_fail_percentage 44071 1727204711.93777: done checking for max_fail_percentage 44071 1727204711.93778: checking to see if all hosts have failed and the running result is not ok 44071 1727204711.93778: done checking to see if all hosts have failed 44071 1727204711.93779: getting the remaining hosts for this loop 44071 1727204711.93781: done getting the remaining hosts for this loop 44071 1727204711.93784: getting the next task for host managed-node2 44071 1727204711.93790: done getting next task for host managed-node2 44071 1727204711.93793: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 44071 1727204711.93797: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204711.93800: getting variables 44071 1727204711.93801: in VariableManager get_vars() 44071 1727204711.93817: Calling all_inventory to load vars for managed-node2 44071 1727204711.93819: Calling groups_inventory to load vars for managed-node2 44071 1727204711.93822: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204711.93830: Calling all_plugins_play to load vars for managed-node2 44071 1727204711.93835: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204711.93838: Calling groups_plugins_play to load vars for managed-node2 44071 1727204711.96258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204712.06909: done with get_vars() 44071 1727204712.06940: done getting variables 44071 1727204712.06984: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 15:05:12 -0400 (0:00:00.237) 0:02:04.386 ***** 44071 1727204712.07009: entering _queue_task() for managed-node2/set_fact 44071 1727204712.07339: worker is 1 (out of 1 available) 44071 1727204712.07354: exiting _queue_task() for managed-node2/set_fact 44071 1727204712.07372: done queuing things up, now waiting for results queue to drain 44071 1727204712.07375: waiting for pending results... 
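For reference, the two tasks being processed at this point can be reconstructed from the log itself: the include at assert_profile_absent.yml:3 pulls in get_profile_stat.yml, and the first task of that file (get_profile_stat.yml:3) initializes three flags whose names and values appear verbatim in the ok: result further below. A minimal sketch of what those tasks likely look like; the exact file contents are not shown in this log, so treat this as an inference rather than the actual source:

    # assert_profile_absent.yml:3 -- inferred from the task name and the included path logged above
    - name: Include the task 'get_profile_stat.yml'
      include_tasks: get_profile_stat.yml

    # get_profile_stat.yml:3 -- fact names and values taken from the ok: result reported below
    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false
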
44071 1727204712.07576: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 44071 1727204712.07703: in run() - task 127b8e07-fff9-c964-7471-000000001f17 44071 1727204712.07721: variable 'ansible_search_path' from source: unknown 44071 1727204712.07727: variable 'ansible_search_path' from source: unknown 44071 1727204712.07761: calling self._execute() 44071 1727204712.07854: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204712.07862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204712.07873: variable 'omit' from source: magic vars 44071 1727204712.08312: variable 'ansible_distribution_major_version' from source: facts 44071 1727204712.08316: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204712.08319: variable 'omit' from source: magic vars 44071 1727204712.08576: variable 'omit' from source: magic vars 44071 1727204712.08580: variable 'omit' from source: magic vars 44071 1727204712.08583: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204712.08587: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204712.08590: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204712.08593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204712.08595: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204712.08600: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204712.08603: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204712.08606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204712.08707: Set connection var ansible_connection to ssh 44071 1727204712.08738: Set connection var ansible_timeout to 10 44071 1727204712.08741: Set connection var ansible_pipelining to False 44071 1727204712.08744: Set connection var ansible_shell_type to sh 44071 1727204712.08746: Set connection var ansible_shell_executable to /bin/sh 44071 1727204712.08749: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204712.08847: variable 'ansible_shell_executable' from source: unknown 44071 1727204712.08851: variable 'ansible_connection' from source: unknown 44071 1727204712.08854: variable 'ansible_module_compression' from source: unknown 44071 1727204712.08857: variable 'ansible_shell_type' from source: unknown 44071 1727204712.08859: variable 'ansible_shell_executable' from source: unknown 44071 1727204712.08862: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204712.08865: variable 'ansible_pipelining' from source: unknown 44071 1727204712.08869: variable 'ansible_timeout' from source: unknown 44071 1727204712.08871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204712.08952: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204712.08960: variable 
'omit' from source: magic vars 44071 1727204712.08964: starting attempt loop 44071 1727204712.08969: running the handler 44071 1727204712.09008: handler run complete 44071 1727204712.09011: attempt loop complete, returning result 44071 1727204712.09014: _execute() done 44071 1727204712.09016: dumping result to json 44071 1727204712.09018: done dumping result, returning 44071 1727204712.09020: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [127b8e07-fff9-c964-7471-000000001f17] 44071 1727204712.09023: sending task result for task 127b8e07-fff9-c964-7471-000000001f17 ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 44071 1727204712.09282: no more pending results, returning what we have 44071 1727204712.09286: results queue empty 44071 1727204712.09291: checking for any_errors_fatal 44071 1727204712.09294: done checking for any_errors_fatal 44071 1727204712.09294: checking for max_fail_percentage 44071 1727204712.09296: done checking for max_fail_percentage 44071 1727204712.09297: checking to see if all hosts have failed and the running result is not ok 44071 1727204712.09298: done checking to see if all hosts have failed 44071 1727204712.09299: getting the remaining hosts for this loop 44071 1727204712.09300: done getting the remaining hosts for this loop 44071 1727204712.09305: getting the next task for host managed-node2 44071 1727204712.09314: done getting next task for host managed-node2 44071 1727204712.09317: ^ task is: TASK: Stat profile file 44071 1727204712.09324: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204712.09329: getting variables 44071 1727204712.09331: in VariableManager get_vars() 44071 1727204712.09390: Calling all_inventory to load vars for managed-node2 44071 1727204712.09394: Calling groups_inventory to load vars for managed-node2 44071 1727204712.09403: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204712.09417: Calling all_plugins_play to load vars for managed-node2 44071 1727204712.09421: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204712.09425: Calling groups_plugins_play to load vars for managed-node2 44071 1727204712.09994: done sending task result for task 127b8e07-fff9-c964-7471-000000001f17 44071 1727204712.09998: WORKER PROCESS EXITING 44071 1727204712.10883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204712.12252: done with get_vars() 44071 1727204712.12298: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 15:05:12 -0400 (0:00:00.053) 0:02:04.440 ***** 44071 1727204712.12411: entering _queue_task() for managed-node2/stat 44071 1727204712.13076: worker is 1 (out of 1 available) 44071 1727204712.13089: exiting _queue_task() for managed-node2/stat 44071 1727204712.13107: done queuing things up, now waiting for results queue to drain 44071 1727204712.13109: waiting for pending results... 44071 1727204712.13287: running TaskExecutor() for managed-node2/TASK: Stat profile file 44071 1727204712.13445: in run() - task 127b8e07-fff9-c964-7471-000000001f18 44071 1727204712.13469: variable 'ansible_search_path' from source: unknown 44071 1727204712.13474: variable 'ansible_search_path' from source: unknown 44071 1727204712.13505: calling self._execute() 44071 1727204712.13650: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204712.13654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204712.13658: variable 'omit' from source: magic vars 44071 1727204712.14138: variable 'ansible_distribution_major_version' from source: facts 44071 1727204712.14142: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204712.14145: variable 'omit' from source: magic vars 44071 1727204712.14176: variable 'omit' from source: magic vars 44071 1727204712.14293: variable 'profile' from source: play vars 44071 1727204712.14297: variable 'interface' from source: play vars 44071 1727204712.14376: variable 'interface' from source: play vars 44071 1727204712.14405: variable 'omit' from source: magic vars 44071 1727204712.14464: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204712.14570: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204712.14573: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204712.14580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204712.14584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204712.14587: variable 'inventory_hostname' from source: host vars for 
'managed-node2' 44071 1727204712.14589: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204712.14592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204712.14699: Set connection var ansible_connection to ssh 44071 1727204712.14771: Set connection var ansible_timeout to 10 44071 1727204712.14775: Set connection var ansible_pipelining to False 44071 1727204712.14777: Set connection var ansible_shell_type to sh 44071 1727204712.14780: Set connection var ansible_shell_executable to /bin/sh 44071 1727204712.14783: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204712.14785: variable 'ansible_shell_executable' from source: unknown 44071 1727204712.14788: variable 'ansible_connection' from source: unknown 44071 1727204712.14791: variable 'ansible_module_compression' from source: unknown 44071 1727204712.14794: variable 'ansible_shell_type' from source: unknown 44071 1727204712.14797: variable 'ansible_shell_executable' from source: unknown 44071 1727204712.14799: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204712.14801: variable 'ansible_pipelining' from source: unknown 44071 1727204712.14804: variable 'ansible_timeout' from source: unknown 44071 1727204712.14806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204712.15039: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204712.15044: variable 'omit' from source: magic vars 44071 1727204712.15047: starting attempt loop 44071 1727204712.15050: running the handler 44071 1727204712.15105: _low_level_execute_command(): starting 44071 1727204712.15108: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204712.15716: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204712.15721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204712.15725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204712.15773: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204712.15777: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204712.15857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204712.17643: stdout chunk (state=3): >>>/root <<< 44071 1727204712.17788: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 44071 1727204712.17976: stderr chunk (state=3): >>><<< 44071 1727204712.17982: stdout chunk (state=3): >>><<< 44071 1727204712.17986: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204712.17990: _low_level_execute_command(): starting 44071 1727204712.17993: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204712.1787026-51011-213054879714473 `" && echo ansible-tmp-1727204712.1787026-51011-213054879714473="` echo /root/.ansible/tmp/ansible-tmp-1727204712.1787026-51011-213054879714473 `" ) && sleep 0' 44071 1727204712.18474: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204712.18490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204712.18509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204712.18553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204712.18576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204712.18648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204712.20630: stdout chunk (state=3): >>>ansible-tmp-1727204712.1787026-51011-213054879714473=/root/.ansible/tmp/ansible-tmp-1727204712.1787026-51011-213054879714473 <<< 44071 1727204712.20747: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 44071 1727204712.20814: stderr chunk (state=3): >>><<< 44071 1727204712.20818: stdout chunk (state=3): >>><<< 44071 1727204712.20836: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204712.1787026-51011-213054879714473=/root/.ansible/tmp/ansible-tmp-1727204712.1787026-51011-213054879714473 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204712.20886: variable 'ansible_module_compression' from source: unknown 44071 1727204712.20941: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 44071 1727204712.20979: variable 'ansible_facts' from source: unknown 44071 1727204712.21032: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204712.1787026-51011-213054879714473/AnsiballZ_stat.py 44071 1727204712.21153: Sending initial data 44071 1727204712.21156: Sent initial data (153 bytes) 44071 1727204712.21649: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204712.21654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204712.21687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204712.21691: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204712.21695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204712.21744: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204712.21748: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204712.21760: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 44071 1727204712.21835: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204712.23460: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204712.23523: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204712.23596: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpvku6ch2x /root/.ansible/tmp/ansible-tmp-1727204712.1787026-51011-213054879714473/AnsiballZ_stat.py <<< 44071 1727204712.23605: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204712.1787026-51011-213054879714473/AnsiballZ_stat.py" <<< 44071 1727204712.23665: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpvku6ch2x" to remote "/root/.ansible/tmp/ansible-tmp-1727204712.1787026-51011-213054879714473/AnsiballZ_stat.py" <<< 44071 1727204712.23670: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204712.1787026-51011-213054879714473/AnsiballZ_stat.py" <<< 44071 1727204712.24492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204712.24578: stderr chunk (state=3): >>><<< 44071 1727204712.24589: stdout chunk (state=3): >>><<< 44071 1727204712.24625: done transferring module to remote 44071 1727204712.24731: _low_level_execute_command(): starting 44071 1727204712.24738: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204712.1787026-51011-213054879714473/ /root/.ansible/tmp/ansible-tmp-1727204712.1787026-51011-213054879714473/AnsiballZ_stat.py && sleep 0' 44071 1727204712.25236: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204712.25252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204712.25262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204712.25314: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204712.25326: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204712.25340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204712.25426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204712.27425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204712.27430: stdout chunk (state=3): >>><<< 44071 1727204712.27432: stderr chunk (state=3): >>><<< 44071 1727204712.27694: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204712.27699: _low_level_execute_command(): starting 44071 1727204712.27702: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204712.1787026-51011-213054879714473/AnsiballZ_stat.py && sleep 0' 44071 1727204712.28512: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204712.28522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204712.28635: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204712.28676: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204712.28789: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 44071 1727204712.45292: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 44071 1727204712.46629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204712.46637: stdout chunk (state=3): >>><<< 44071 1727204712.46640: stderr chunk (state=3): >>><<< 44071 1727204712.46803: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
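The module invocation that just returned maps back to the task at get_profile_stat.yml:9. The module_args echoed in the stdout above (and repeated in the _execute_module entry that follows) give the exact stat parameters; the 'profile' and 'interface' play vars resolved earlier suggest the path is templated, and the profile_stat.stat.exists condition evaluated in the next task implies the register name. A sketch under those assumptions:

    # get_profile_stat.yml:9 -- stat parameters taken from the module_args in the result above;
    # the templated path and the register name are assumptions based on surrounding log entries
    - name: Stat profile file
      stat:
        path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat
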
44071 1727204712.46808: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204712.1787026-51011-213054879714473/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204712.46811: _low_level_execute_command(): starting 44071 1727204712.46814: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204712.1787026-51011-213054879714473/ > /dev/null 2>&1 && sleep 0' 44071 1727204712.47469: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204712.47606: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204712.47610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204712.47647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204712.47670: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204712.47695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204712.47812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204712.49790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204712.49795: stdout chunk (state=3): >>><<< 44071 1727204712.49802: stderr chunk (state=3): >>><<< 44071 1727204712.49820: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204712.49826: handler run complete 44071 1727204712.49844: attempt loop complete, returning result 44071 1727204712.49848: _execute() done 44071 1727204712.49851: dumping result to json 44071 1727204712.49854: done dumping result, returning 44071 1727204712.49862: done running TaskExecutor() for managed-node2/TASK: Stat profile file [127b8e07-fff9-c964-7471-000000001f18] 44071 1727204712.49868: sending task result for task 127b8e07-fff9-c964-7471-000000001f18 44071 1727204712.49992: done sending task result for task 127b8e07-fff9-c964-7471-000000001f18 44071 1727204712.49995: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 44071 1727204712.50071: no more pending results, returning what we have 44071 1727204712.50076: results queue empty 44071 1727204712.50077: checking for any_errors_fatal 44071 1727204712.50084: done checking for any_errors_fatal 44071 1727204712.50085: checking for max_fail_percentage 44071 1727204712.50087: done checking for max_fail_percentage 44071 1727204712.50088: checking to see if all hosts have failed and the running result is not ok 44071 1727204712.50089: done checking to see if all hosts have failed 44071 1727204712.50089: getting the remaining hosts for this loop 44071 1727204712.50091: done getting the remaining hosts for this loop 44071 1727204712.50096: getting the next task for host managed-node2 44071 1727204712.50104: done getting next task for host managed-node2 44071 1727204712.50114: ^ task is: TASK: Set NM profile exist flag based on the profile files 44071 1727204712.50120: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204712.50124: getting variables 44071 1727204712.50125: in VariableManager get_vars() 44071 1727204712.50173: Calling all_inventory to load vars for managed-node2 44071 1727204712.50176: Calling groups_inventory to load vars for managed-node2 44071 1727204712.50180: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204712.50192: Calling all_plugins_play to load vars for managed-node2 44071 1727204712.50195: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204712.50197: Calling groups_plugins_play to load vars for managed-node2 44071 1727204712.51860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204712.53182: done with get_vars() 44071 1727204712.53212: done getting variables 44071 1727204712.53272: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:05:12 -0400 (0:00:00.408) 0:02:04.849 ***** 44071 1727204712.53299: entering _queue_task() for managed-node2/set_fact 44071 1727204712.53614: worker is 1 (out of 1 available) 44071 1727204712.53632: exiting _queue_task() for managed-node2/set_fact 44071 1727204712.53651: done queuing things up, now waiting for results queue to drain 44071 1727204712.53652: waiting for pending results... 
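The task queued here conditions on the stat result; the skip reported just below ("false_condition": "profile_stat.stat.exists") confirms the when: clause. Because the task is skipped in this run, the fact it would set is not visible, so the body below is an assumption, not taken from the log:

    # get_profile_stat.yml:17 -- the when: clause is confirmed by the skip result below;
    # the fact name and value set when a profile file exists are assumptions (not visible in this run)
    - name: Set NM profile exist flag based on the profile files
      set_fact:
        lsr_net_profile_exists: true
      when: profile_stat.stat.exists
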
44071 1727204712.53873: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 44071 1727204712.53997: in run() - task 127b8e07-fff9-c964-7471-000000001f19 44071 1727204712.54015: variable 'ansible_search_path' from source: unknown 44071 1727204712.54019: variable 'ansible_search_path' from source: unknown 44071 1727204712.54052: calling self._execute() 44071 1727204712.54151: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204712.54157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204712.54168: variable 'omit' from source: magic vars 44071 1727204712.54600: variable 'ansible_distribution_major_version' from source: facts 44071 1727204712.54604: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204712.54887: variable 'profile_stat' from source: set_fact 44071 1727204712.54890: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204712.54892: when evaluation is False, skipping this task 44071 1727204712.54894: _execute() done 44071 1727204712.54897: dumping result to json 44071 1727204712.54899: done dumping result, returning 44071 1727204712.54902: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [127b8e07-fff9-c964-7471-000000001f19] 44071 1727204712.54904: sending task result for task 127b8e07-fff9-c964-7471-000000001f19 44071 1727204712.54989: done sending task result for task 127b8e07-fff9-c964-7471-000000001f19 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204712.55040: no more pending results, returning what we have 44071 1727204712.55043: results queue empty 44071 1727204712.55044: checking for any_errors_fatal 44071 1727204712.55053: done checking for any_errors_fatal 44071 1727204712.55054: checking for max_fail_percentage 44071 1727204712.55056: done checking for max_fail_percentage 44071 1727204712.55056: checking to see if all hosts have failed and the running result is not ok 44071 1727204712.55057: done checking to see if all hosts have failed 44071 1727204712.55058: getting the remaining hosts for this loop 44071 1727204712.55059: done getting the remaining hosts for this loop 44071 1727204712.55064: getting the next task for host managed-node2 44071 1727204712.55073: done getting next task for host managed-node2 44071 1727204712.55077: ^ task is: TASK: Get NM profile info 44071 1727204712.55082: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204712.55086: getting variables 44071 1727204712.55088: in VariableManager get_vars() 44071 1727204712.55127: Calling all_inventory to load vars for managed-node2 44071 1727204712.55130: Calling groups_inventory to load vars for managed-node2 44071 1727204712.55133: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204712.55144: Calling all_plugins_play to load vars for managed-node2 44071 1727204712.55147: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204712.55149: Calling groups_plugins_play to load vars for managed-node2 44071 1727204712.55686: WORKER PROCESS EXITING 44071 1727204712.56861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204712.58076: done with get_vars() 44071 1727204712.58106: done getting variables 44071 1727204712.58184: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:05:12 -0400 (0:00:00.049) 0:02:04.898 ***** 44071 1727204712.58215: entering _queue_task() for managed-node2/shell 44071 1727204712.58615: worker is 1 (out of 1 available) 44071 1727204712.58633: exiting _queue_task() for managed-node2/shell 44071 1727204712.58648: done queuing things up, now waiting for results queue to drain 44071 1727204712.58651: waiting for pending results... 
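The "Get NM profile info" task at get_profile_stat.yml:25 uses the shell action (the shell and command action plugins are loaded below) and templates the 'profile' play var, but the actual command line never appears in this excerpt. The sketch below is therefore purely hypothetical: the nmcli invocation, the register name, and the error handling are placeholders, not taken from the log:

    # get_profile_stat.yml:25 -- hypothetical reconstruction; the real command is not visible in this log
    - name: Get NM profile info
      shell: nmcli -f NAME,FILENAME connection show | grep "{{ profile }}"  # placeholder command
      register: nm_profile_exists   # placeholder register name
      ignore_errors: true           # assumption
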
44071 1727204712.59000: running TaskExecutor() for managed-node2/TASK: Get NM profile info 44071 1727204712.59097: in run() - task 127b8e07-fff9-c964-7471-000000001f1a 44071 1727204712.59123: variable 'ansible_search_path' from source: unknown 44071 1727204712.59128: variable 'ansible_search_path' from source: unknown 44071 1727204712.59204: calling self._execute() 44071 1727204712.59301: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204712.59311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204712.59317: variable 'omit' from source: magic vars 44071 1727204712.59868: variable 'ansible_distribution_major_version' from source: facts 44071 1727204712.59873: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204712.59876: variable 'omit' from source: magic vars 44071 1727204712.59880: variable 'omit' from source: magic vars 44071 1727204712.60073: variable 'profile' from source: play vars 44071 1727204712.60078: variable 'interface' from source: play vars 44071 1727204712.60081: variable 'interface' from source: play vars 44071 1727204712.60083: variable 'omit' from source: magic vars 44071 1727204712.60086: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204712.60125: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204712.60152: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204712.60176: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204712.60185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204712.60272: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204712.60276: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204712.60279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204712.60341: Set connection var ansible_connection to ssh 44071 1727204712.60348: Set connection var ansible_timeout to 10 44071 1727204712.60355: Set connection var ansible_pipelining to False 44071 1727204712.60361: Set connection var ansible_shell_type to sh 44071 1727204712.60368: Set connection var ansible_shell_executable to /bin/sh 44071 1727204712.60378: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204712.60406: variable 'ansible_shell_executable' from source: unknown 44071 1727204712.60410: variable 'ansible_connection' from source: unknown 44071 1727204712.60569: variable 'ansible_module_compression' from source: unknown 44071 1727204712.60574: variable 'ansible_shell_type' from source: unknown 44071 1727204712.60577: variable 'ansible_shell_executable' from source: unknown 44071 1727204712.60580: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204712.60582: variable 'ansible_pipelining' from source: unknown 44071 1727204712.60585: variable 'ansible_timeout' from source: unknown 44071 1727204712.60588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204712.60591: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204712.60605: variable 'omit' from source: magic vars 44071 1727204712.60610: starting attempt loop 44071 1727204712.60613: running the handler 44071 1727204712.60625: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204712.60653: _low_level_execute_command(): starting 44071 1727204712.60670: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204712.61284: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204712.61306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204712.61373: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204712.61377: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204712.61442: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204712.63157: stdout chunk (state=3): >>>/root <<< 44071 1727204712.63375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204712.63379: stdout chunk (state=3): >>><<< 44071 1727204712.63385: stderr chunk (state=3): >>><<< 44071 1727204712.63413: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204712.63430: _low_level_execute_command(): starting 44071 1727204712.63440: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204712.6341267-51030-20725268660371 `" && echo ansible-tmp-1727204712.6341267-51030-20725268660371="` echo /root/.ansible/tmp/ansible-tmp-1727204712.6341267-51030-20725268660371 `" ) && sleep 0' 44071 1727204712.64150: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204712.64167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204712.64183: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204712.64251: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204712.64255: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204712.64258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204712.64326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204712.66491: stdout chunk (state=3): >>>ansible-tmp-1727204712.6341267-51030-20725268660371=/root/.ansible/tmp/ansible-tmp-1727204712.6341267-51030-20725268660371 <<< 44071 1727204712.66605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204712.66671: stderr chunk (state=3): >>><<< 44071 1727204712.66677: stdout chunk (state=3): >>><<< 44071 1727204712.66695: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204712.6341267-51030-20725268660371=/root/.ansible/tmp/ansible-tmp-1727204712.6341267-51030-20725268660371 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 
10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204712.66727: variable 'ansible_module_compression' from source: unknown 44071 1727204712.66781: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44071 1727204712.66818: variable 'ansible_facts' from source: unknown 44071 1727204712.66875: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204712.6341267-51030-20725268660371/AnsiballZ_command.py 44071 1727204712.66993: Sending initial data 44071 1727204712.66997: Sent initial data (155 bytes) 44071 1727204712.67513: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204712.67517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204712.67519: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204712.67521: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204712.67524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204712.67572: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204712.67578: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204712.67593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204712.67670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204712.69321: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204712.69390: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204712.69461: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpze1tm8gs /root/.ansible/tmp/ansible-tmp-1727204712.6341267-51030-20725268660371/AnsiballZ_command.py <<< 44071 1727204712.69464: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204712.6341267-51030-20725268660371/AnsiballZ_command.py" <<< 44071 1727204712.69527: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpze1tm8gs" to remote "/root/.ansible/tmp/ansible-tmp-1727204712.6341267-51030-20725268660371/AnsiballZ_command.py" <<< 44071 1727204712.69533: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204712.6341267-51030-20725268660371/AnsiballZ_command.py" <<< 44071 1727204712.70186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204712.70264: stderr chunk (state=3): >>><<< 44071 1727204712.70270: stdout chunk (state=3): >>><<< 44071 1727204712.70290: done transferring module to remote 44071 1727204712.70301: _low_level_execute_command(): starting 44071 1727204712.70308: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204712.6341267-51030-20725268660371/ /root/.ansible/tmp/ansible-tmp-1727204712.6341267-51030-20725268660371/AnsiballZ_command.py && sleep 0' 44071 1727204712.70811: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204712.70817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204712.70824: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204712.70827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204712.70870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204712.70888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204712.70955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204712.72817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204712.72882: stderr chunk (state=3): >>><<< 44071 1727204712.72886: stdout chunk (state=3): >>><<< 44071 1727204712.72903: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204712.72906: _low_level_execute_command(): starting 44071 1727204712.72911: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204712.6341267-51030-20725268660371/AnsiballZ_command.py && sleep 0' 44071 1727204712.73422: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204712.73426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204712.73430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204712.73432: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204712.73488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204712.73492: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204712.73497: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204712.73575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204712.91881: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-24 15:05:12.899821", "end": "2024-09-24 15:05:12.917193", "delta": "0:00:00.017372", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44071 1727204712.93500: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204712.93504: stdout chunk (state=3): >>><<< 44071 1727204712.93507: stderr chunk (state=3): >>><<< 44071 1727204712.93673: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-24 15:05:12.899821", "end": "2024-09-24 15:05:12.917193", "delta": "0:00:00.017372", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.47.73 closed. 
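The exchange above is Ansible's standard remote module execution flow: create a remote tmp directory, sftp AnsiballZ_command.py into it, chmod it, run it with /usr/bin/python3.12, and (just below) remove the tmp directory again. The module arguments echoed back in the JSON result (_raw_params, _uses_shell) correspond to a shell task, so a minimal reconstruction of "Get NM profile info" is sketched here. This is an approximation, not the contents of the real task file: the register name is inferred from the "nm_profile_exists.rc == 0" conditional evaluated on the next task, ignore_errors is inferred from the "...ignoring" marker on the failed result, and statebr is simply the rendered value of the profile play variable.

- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc
  register: nm_profile_exists   # inferred from the later "nm_profile_exists.rc == 0" evaluation
  ignore_errors: true           # the rc=1 result is reported as FAILED but marked "...ignoring"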
44071 1727204712.93678: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204712.6341267-51030-20725268660371/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204712.93682: _low_level_execute_command(): starting 44071 1727204712.93684: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204712.6341267-51030-20725268660371/ > /dev/null 2>&1 && sleep 0' 44071 1727204712.94341: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204712.94371: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204712.94388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204712.94479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204712.94544: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204712.94601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204712.94670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204712.96757: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204712.96761: stdout chunk (state=3): >>><<< 44071 1727204712.96764: stderr chunk (state=3): >>><<< 44071 1727204712.96769: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204712.96772: handler run complete 44071 1727204712.96774: Evaluated conditional (False): False 44071 1727204712.96776: attempt loop complete, returning result 44071 1727204712.96778: _execute() done 44071 1727204712.96780: dumping result to json 44071 1727204712.96782: done dumping result, returning 44071 1727204712.96784: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [127b8e07-fff9-c964-7471-000000001f1a] 44071 1727204712.96786: sending task result for task 127b8e07-fff9-c964-7471-000000001f1a 44071 1727204712.97045: done sending task result for task 127b8e07-fff9-c964-7471-000000001f1a 44071 1727204712.97049: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.017372", "end": "2024-09-24 15:05:12.917193", "rc": 1, "start": "2024-09-24 15:05:12.899821" } MSG: non-zero return code ...ignoring 44071 1727204712.97126: no more pending results, returning what we have 44071 1727204712.97129: results queue empty 44071 1727204712.97129: checking for any_errors_fatal 44071 1727204712.97136: done checking for any_errors_fatal 44071 1727204712.97137: checking for max_fail_percentage 44071 1727204712.97139: done checking for max_fail_percentage 44071 1727204712.97139: checking to see if all hosts have failed and the running result is not ok 44071 1727204712.97140: done checking to see if all hosts have failed 44071 1727204712.97141: getting the remaining hosts for this loop 44071 1727204712.97142: done getting the remaining hosts for this loop 44071 1727204712.97146: getting the next task for host managed-node2 44071 1727204712.97154: done getting next task for host managed-node2 44071 1727204712.97156: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 44071 1727204712.97167: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 44071 1727204712.97173: getting variables 44071 1727204712.97174: in VariableManager get_vars() 44071 1727204712.97212: Calling all_inventory to load vars for managed-node2 44071 1727204712.97216: Calling groups_inventory to load vars for managed-node2 44071 1727204712.97220: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204712.97231: Calling all_plugins_play to load vars for managed-node2 44071 1727204712.97234: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204712.97237: Calling groups_plugins_play to load vars for managed-node2 44071 1727204713.00477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204713.05169: done with get_vars() 44071 1727204713.05220: done getting variables 44071 1727204713.05290: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 15:05:13 -0400 (0:00:00.471) 0:02:05.369 ***** 44071 1727204713.05326: entering _queue_task() for managed-node2/set_fact 44071 1727204713.06369: worker is 1 (out of 1 available) 44071 1727204713.06386: exiting _queue_task() for managed-node2/set_fact 44071 1727204713.06400: done queuing things up, now waiting for results queue to drain 44071 1727204713.06402: waiting for pending results... 
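The set_fact task queued here is gated on the registered nmcli result; the skip recorded just below ("Evaluated conditional (nm_profile_exists.rc == 0): False") implies a task of roughly the following shape. The flag names are inferred from the task title and from the later assert on lsr_net_profile_exists, so treat them as indicative rather than verbatim.

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true            # name inferred from the assert further down in this run
    lsr_net_profile_ansible_managed: true   # inferred from the task title only
  when: nm_profile_exists.rc == 0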
44071 1727204713.06906: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 44071 1727204713.06988: in run() - task 127b8e07-fff9-c964-7471-000000001f1b 44071 1727204713.07005: variable 'ansible_search_path' from source: unknown 44071 1727204713.07009: variable 'ansible_search_path' from source: unknown 44071 1727204713.07112: calling self._execute() 44071 1727204713.07170: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204713.07179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204713.07191: variable 'omit' from source: magic vars 44071 1727204713.07642: variable 'ansible_distribution_major_version' from source: facts 44071 1727204713.07656: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204713.07817: variable 'nm_profile_exists' from source: set_fact 44071 1727204713.07831: Evaluated conditional (nm_profile_exists.rc == 0): False 44071 1727204713.07835: when evaluation is False, skipping this task 44071 1727204713.07872: _execute() done 44071 1727204713.07876: dumping result to json 44071 1727204713.07879: done dumping result, returning 44071 1727204713.07882: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [127b8e07-fff9-c964-7471-000000001f1b] 44071 1727204713.07884: sending task result for task 127b8e07-fff9-c964-7471-000000001f1b 44071 1727204713.07974: done sending task result for task 127b8e07-fff9-c964-7471-000000001f1b 44071 1727204713.08070: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 44071 1727204713.08124: no more pending results, returning what we have 44071 1727204713.08128: results queue empty 44071 1727204713.08129: checking for any_errors_fatal 44071 1727204713.08138: done checking for any_errors_fatal 44071 1727204713.08139: checking for max_fail_percentage 44071 1727204713.08141: done checking for max_fail_percentage 44071 1727204713.08141: checking to see if all hosts have failed and the running result is not ok 44071 1727204713.08142: done checking to see if all hosts have failed 44071 1727204713.08143: getting the remaining hosts for this loop 44071 1727204713.08144: done getting the remaining hosts for this loop 44071 1727204713.08149: getting the next task for host managed-node2 44071 1727204713.08160: done getting next task for host managed-node2 44071 1727204713.08164: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 44071 1727204713.08171: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204713.08175: getting variables 44071 1727204713.08176: in VariableManager get_vars() 44071 1727204713.08213: Calling all_inventory to load vars for managed-node2 44071 1727204713.08216: Calling groups_inventory to load vars for managed-node2 44071 1727204713.08218: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204713.08230: Calling all_plugins_play to load vars for managed-node2 44071 1727204713.08233: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204713.08236: Calling groups_plugins_play to load vars for managed-node2 44071 1727204713.10194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204713.12471: done with get_vars() 44071 1727204713.12515: done getting variables 44071 1727204713.12590: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204713.12735: variable 'profile' from source: play vars 44071 1727204713.12740: variable 'interface' from source: play vars 44071 1727204713.12814: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 15:05:13 -0400 (0:00:00.075) 0:02:05.445 ***** 44071 1727204713.12859: entering _queue_task() for managed-node2/command 44071 1727204713.13312: worker is 1 (out of 1 available) 44071 1727204713.13329: exiting _queue_task() for managed-node2/command 44071 1727204713.13350: done queuing things up, now waiting for results queue to drain 44071 1727204713.13352: waiting for pending results... 
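The back-to-back lookups of 'profile' and 'interface' from play vars right before the rendered header show how the templated task name "Get the ansible_managed comment in ifcfg-{{ profile }}" becomes "ifcfg-statebr". The wiring below is an assumption consistent with those lookups, not a quote from the playbook:

vars:
  interface: statebr           # assumed; statebr is the value the templates render to
  profile: "{{ interface }}"   # assumed indirection, suggested by the profile -> interface lookups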
44071 1727204713.13708: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-statebr 44071 1727204713.13849: in run() - task 127b8e07-fff9-c964-7471-000000001f1d 44071 1727204713.13869: variable 'ansible_search_path' from source: unknown 44071 1727204713.13873: variable 'ansible_search_path' from source: unknown 44071 1727204713.13915: calling self._execute() 44071 1727204713.14036: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204713.14046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204713.14057: variable 'omit' from source: magic vars 44071 1727204713.14506: variable 'ansible_distribution_major_version' from source: facts 44071 1727204713.14519: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204713.14674: variable 'profile_stat' from source: set_fact 44071 1727204713.14687: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204713.14690: when evaluation is False, skipping this task 44071 1727204713.14697: _execute() done 44071 1727204713.14701: dumping result to json 44071 1727204713.14703: done dumping result, returning 44071 1727204713.14707: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-statebr [127b8e07-fff9-c964-7471-000000001f1d] 44071 1727204713.14712: sending task result for task 127b8e07-fff9-c964-7471-000000001f1d 44071 1727204713.14903: done sending task result for task 127b8e07-fff9-c964-7471-000000001f1d 44071 1727204713.14907: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204713.14972: no more pending results, returning what we have 44071 1727204713.14976: results queue empty 44071 1727204713.14977: checking for any_errors_fatal 44071 1727204713.14987: done checking for any_errors_fatal 44071 1727204713.14988: checking for max_fail_percentage 44071 1727204713.14990: done checking for max_fail_percentage 44071 1727204713.14991: checking to see if all hosts have failed and the running result is not ok 44071 1727204713.14992: done checking to see if all hosts have failed 44071 1727204713.14993: getting the remaining hosts for this loop 44071 1727204713.14995: done getting the remaining hosts for this loop 44071 1727204713.15000: getting the next task for host managed-node2 44071 1727204713.15011: done getting next task for host managed-node2 44071 1727204713.15015: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 44071 1727204713.15023: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204713.15029: getting variables 44071 1727204713.15031: in VariableManager get_vars() 44071 1727204713.15085: Calling all_inventory to load vars for managed-node2 44071 1727204713.15088: Calling groups_inventory to load vars for managed-node2 44071 1727204713.15092: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204713.15110: Calling all_plugins_play to load vars for managed-node2 44071 1727204713.15114: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204713.15117: Calling groups_plugins_play to load vars for managed-node2 44071 1727204713.17260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204713.19473: done with get_vars() 44071 1727204713.19518: done getting variables 44071 1727204713.19590: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204713.19716: variable 'profile' from source: play vars 44071 1727204713.19720: variable 'interface' from source: play vars 44071 1727204713.19788: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 15:05:13 -0400 (0:00:00.069) 0:02:05.514 ***** 44071 1727204713.19824: entering _queue_task() for managed-node2/set_fact 44071 1727204713.20250: worker is 1 (out of 1 available) 44071 1727204713.20470: exiting _queue_task() for managed-node2/set_fact 44071 1727204713.20484: done queuing things up, now waiting for results queue to drain 44071 1727204713.20486: waiting for pending results... 
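Each of the ifcfg comment tasks in this block is skipped with false_condition "profile_stat.stat.exists", which implies an earlier stat of the ifcfg file registered as profile_stat; that task sits outside this excerpt. The gating pattern, with a hypothetical file path and command used purely for illustration, looks like:

- name: Get file stat for the profile            # illustrative; the real stat task is not shown here
  stat:
    path: /etc/sysconfig/network-scripts/ifcfg-statebr   # hypothetical path
  register: profile_stat

- name: Get the ansible_managed comment in ifcfg-statebr
  command: grep ansible_managed /etc/sysconfig/network-scripts/ifcfg-statebr   # hypothetical command; it never ran here
  when: profile_stat.stat.exists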
44071 1727204713.20632: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-statebr 44071 1727204713.20873: in run() - task 127b8e07-fff9-c964-7471-000000001f1e 44071 1727204713.20878: variable 'ansible_search_path' from source: unknown 44071 1727204713.20882: variable 'ansible_search_path' from source: unknown 44071 1727204713.20888: calling self._execute() 44071 1727204713.21010: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204713.21023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204713.21045: variable 'omit' from source: magic vars 44071 1727204713.21507: variable 'ansible_distribution_major_version' from source: facts 44071 1727204713.21584: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204713.21682: variable 'profile_stat' from source: set_fact 44071 1727204713.21704: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204713.21712: when evaluation is False, skipping this task 44071 1727204713.21718: _execute() done 44071 1727204713.21726: dumping result to json 44071 1727204713.21735: done dumping result, returning 44071 1727204713.21746: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [127b8e07-fff9-c964-7471-000000001f1e] 44071 1727204713.21754: sending task result for task 127b8e07-fff9-c964-7471-000000001f1e skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204713.21928: no more pending results, returning what we have 44071 1727204713.21935: results queue empty 44071 1727204713.21936: checking for any_errors_fatal 44071 1727204713.21947: done checking for any_errors_fatal 44071 1727204713.21948: checking for max_fail_percentage 44071 1727204713.21950: done checking for max_fail_percentage 44071 1727204713.21951: checking to see if all hosts have failed and the running result is not ok 44071 1727204713.21952: done checking to see if all hosts have failed 44071 1727204713.21953: getting the remaining hosts for this loop 44071 1727204713.21955: done getting the remaining hosts for this loop 44071 1727204713.21960: getting the next task for host managed-node2 44071 1727204713.21974: done getting next task for host managed-node2 44071 1727204713.21978: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 44071 1727204713.21986: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204713.21991: getting variables 44071 1727204713.21993: in VariableManager get_vars() 44071 1727204713.22045: Calling all_inventory to load vars for managed-node2 44071 1727204713.22049: Calling groups_inventory to load vars for managed-node2 44071 1727204713.22053: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204713.22372: Calling all_plugins_play to load vars for managed-node2 44071 1727204713.22377: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204713.22382: Calling groups_plugins_play to load vars for managed-node2 44071 1727204713.23086: done sending task result for task 127b8e07-fff9-c964-7471-000000001f1e 44071 1727204713.23091: WORKER PROCESS EXITING 44071 1727204713.24214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204713.26448: done with get_vars() 44071 1727204713.26497: done getting variables 44071 1727204713.26571: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204713.26704: variable 'profile' from source: play vars 44071 1727204713.26708: variable 'interface' from source: play vars 44071 1727204713.26777: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 15:05:13 -0400 (0:00:00.069) 0:02:05.584 ***** 44071 1727204713.26814: entering _queue_task() for managed-node2/command 44071 1727204713.27243: worker is 1 (out of 1 available) 44071 1727204713.27257: exiting _queue_task() for managed-node2/command 44071 1727204713.27474: done queuing things up, now waiting for results queue to drain 44071 1727204713.27477: waiting for pending results... 
44071 1727204713.27619: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-statebr 44071 1727204713.27793: in run() - task 127b8e07-fff9-c964-7471-000000001f1f 44071 1727204713.27821: variable 'ansible_search_path' from source: unknown 44071 1727204713.27830: variable 'ansible_search_path' from source: unknown 44071 1727204713.27882: calling self._execute() 44071 1727204713.27999: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204713.28012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204713.28032: variable 'omit' from source: magic vars 44071 1727204713.28493: variable 'ansible_distribution_major_version' from source: facts 44071 1727204713.28512: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204713.28660: variable 'profile_stat' from source: set_fact 44071 1727204713.28683: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204713.28691: when evaluation is False, skipping this task 44071 1727204713.28697: _execute() done 44071 1727204713.28704: dumping result to json 44071 1727204713.28712: done dumping result, returning 44071 1727204713.28722: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-statebr [127b8e07-fff9-c964-7471-000000001f1f] 44071 1727204713.28732: sending task result for task 127b8e07-fff9-c964-7471-000000001f1f skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204713.28912: no more pending results, returning what we have 44071 1727204713.28917: results queue empty 44071 1727204713.28919: checking for any_errors_fatal 44071 1727204713.28926: done checking for any_errors_fatal 44071 1727204713.28927: checking for max_fail_percentage 44071 1727204713.28928: done checking for max_fail_percentage 44071 1727204713.28930: checking to see if all hosts have failed and the running result is not ok 44071 1727204713.28930: done checking to see if all hosts have failed 44071 1727204713.28931: getting the remaining hosts for this loop 44071 1727204713.28935: done getting the remaining hosts for this loop 44071 1727204713.28941: getting the next task for host managed-node2 44071 1727204713.28952: done getting next task for host managed-node2 44071 1727204713.28955: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 44071 1727204713.28963: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204713.28971: getting variables 44071 1727204713.28973: in VariableManager get_vars() 44071 1727204713.29023: Calling all_inventory to load vars for managed-node2 44071 1727204713.29027: Calling groups_inventory to load vars for managed-node2 44071 1727204713.29031: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204713.29051: Calling all_plugins_play to load vars for managed-node2 44071 1727204713.29054: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204713.29058: Calling groups_plugins_play to load vars for managed-node2 44071 1727204713.30108: done sending task result for task 127b8e07-fff9-c964-7471-000000001f1f 44071 1727204713.30113: WORKER PROCESS EXITING 44071 1727204713.31361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204713.33583: done with get_vars() 44071 1727204713.33628: done getting variables 44071 1727204713.33700: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204713.33829: variable 'profile' from source: play vars 44071 1727204713.33836: variable 'interface' from source: play vars 44071 1727204713.33907: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 15:05:13 -0400 (0:00:00.071) 0:02:05.655 ***** 44071 1727204713.33947: entering _queue_task() for managed-node2/set_fact 44071 1727204713.34370: worker is 1 (out of 1 available) 44071 1727204713.34388: exiting _queue_task() for managed-node2/set_fact 44071 1727204713.34403: done queuing things up, now waiting for results queue to drain 44071 1727204713.34404: waiting for pending results... 
44071 1727204713.34754: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-statebr 44071 1727204713.34927: in run() - task 127b8e07-fff9-c964-7471-000000001f20 44071 1727204713.34954: variable 'ansible_search_path' from source: unknown 44071 1727204713.34964: variable 'ansible_search_path' from source: unknown 44071 1727204713.35015: calling self._execute() 44071 1727204713.35135: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204713.35151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204713.35170: variable 'omit' from source: magic vars 44071 1727204713.35621: variable 'ansible_distribution_major_version' from source: facts 44071 1727204713.35647: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204713.35798: variable 'profile_stat' from source: set_fact 44071 1727204713.35817: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204713.35824: when evaluation is False, skipping this task 44071 1727204713.35831: _execute() done 44071 1727204713.35842: dumping result to json 44071 1727204713.35853: done dumping result, returning 44071 1727204713.35863: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-statebr [127b8e07-fff9-c964-7471-000000001f20] 44071 1727204713.35875: sending task result for task 127b8e07-fff9-c964-7471-000000001f20 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204713.36052: no more pending results, returning what we have 44071 1727204713.36057: results queue empty 44071 1727204713.36059: checking for any_errors_fatal 44071 1727204713.36069: done checking for any_errors_fatal 44071 1727204713.36070: checking for max_fail_percentage 44071 1727204713.36071: done checking for max_fail_percentage 44071 1727204713.36072: checking to see if all hosts have failed and the running result is not ok 44071 1727204713.36073: done checking to see if all hosts have failed 44071 1727204713.36074: getting the remaining hosts for this loop 44071 1727204713.36076: done getting the remaining hosts for this loop 44071 1727204713.36081: getting the next task for host managed-node2 44071 1727204713.36094: done getting next task for host managed-node2 44071 1727204713.36098: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 44071 1727204713.36103: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204713.36109: getting variables 44071 1727204713.36110: in VariableManager get_vars() 44071 1727204713.36162: Calling all_inventory to load vars for managed-node2 44071 1727204713.36270: Calling groups_inventory to load vars for managed-node2 44071 1727204713.36275: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204713.36293: Calling all_plugins_play to load vars for managed-node2 44071 1727204713.36297: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204713.36300: Calling groups_plugins_play to load vars for managed-node2 44071 1727204713.37007: done sending task result for task 127b8e07-fff9-c964-7471-000000001f20 44071 1727204713.37011: WORKER PROCESS EXITING 44071 1727204713.38481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204713.40732: done with get_vars() 44071 1727204713.40784: done getting variables 44071 1727204713.40856: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204713.40990: variable 'profile' from source: play vars 44071 1727204713.40995: variable 'interface' from source: play vars 44071 1727204713.41060: variable 'interface' from source: play vars TASK [Assert that the profile is absent - 'statebr'] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 15:05:13 -0400 (0:00:00.071) 0:02:05.727 ***** 44071 1727204713.41098: entering _queue_task() for managed-node2/assert 44071 1727204713.41530: worker is 1 (out of 1 available) 44071 1727204713.41546: exiting _queue_task() for managed-node2/assert 44071 1727204713.41564: done queuing things up, now waiting for results queue to drain 44071 1727204713.41768: waiting for pending results... 
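The assert queued here is well constrained by the log: the evaluation just below reads "Evaluated conditional (not lsr_net_profile_exists): True" and the result is "All assertions passed". A minimal equivalent task (any msg text is not visible in this excerpt) would be:

- name: Assert that the profile is absent - 'statebr'
  assert:
    that:
      - not lsr_net_profile_exists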
44071 1727204713.41907: running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'statebr' 44071 1727204713.42058: in run() - task 127b8e07-fff9-c964-7471-000000001e9a 44071 1727204713.42086: variable 'ansible_search_path' from source: unknown 44071 1727204713.42094: variable 'ansible_search_path' from source: unknown 44071 1727204713.42149: calling self._execute() 44071 1727204713.42278: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204713.42291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204713.42307: variable 'omit' from source: magic vars 44071 1727204713.42773: variable 'ansible_distribution_major_version' from source: facts 44071 1727204713.42793: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204713.42804: variable 'omit' from source: magic vars 44071 1727204713.42874: variable 'omit' from source: magic vars 44071 1727204713.42996: variable 'profile' from source: play vars 44071 1727204713.43008: variable 'interface' from source: play vars 44071 1727204713.43087: variable 'interface' from source: play vars 44071 1727204713.43114: variable 'omit' from source: magic vars 44071 1727204713.43191: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204713.43220: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204713.43252: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204713.43280: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204713.43371: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204713.43374: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204713.43377: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204713.43379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204713.43483: Set connection var ansible_connection to ssh 44071 1727204713.43497: Set connection var ansible_timeout to 10 44071 1727204713.43511: Set connection var ansible_pipelining to False 44071 1727204713.43525: Set connection var ansible_shell_type to sh 44071 1727204713.43537: Set connection var ansible_shell_executable to /bin/sh 44071 1727204713.43551: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204713.43585: variable 'ansible_shell_executable' from source: unknown 44071 1727204713.43593: variable 'ansible_connection' from source: unknown 44071 1727204713.43601: variable 'ansible_module_compression' from source: unknown 44071 1727204713.43608: variable 'ansible_shell_type' from source: unknown 44071 1727204713.43616: variable 'ansible_shell_executable' from source: unknown 44071 1727204713.43736: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204713.43740: variable 'ansible_pipelining' from source: unknown 44071 1727204713.43744: variable 'ansible_timeout' from source: unknown 44071 1727204713.43746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204713.43823: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204713.43849: variable 'omit' from source: magic vars 44071 1727204713.43860: starting attempt loop 44071 1727204713.43870: running the handler 44071 1727204713.44024: variable 'lsr_net_profile_exists' from source: set_fact 44071 1727204713.44038: Evaluated conditional (not lsr_net_profile_exists): True 44071 1727204713.44050: handler run complete 44071 1727204713.44078: attempt loop complete, returning result 44071 1727204713.44168: _execute() done 44071 1727204713.44174: dumping result to json 44071 1727204713.44177: done dumping result, returning 44071 1727204713.44180: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'statebr' [127b8e07-fff9-c964-7471-000000001e9a] 44071 1727204713.44182: sending task result for task 127b8e07-fff9-c964-7471-000000001e9a 44071 1727204713.44262: done sending task result for task 127b8e07-fff9-c964-7471-000000001e9a 44071 1727204713.44368: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 44071 1727204713.44431: no more pending results, returning what we have 44071 1727204713.44438: results queue empty 44071 1727204713.44439: checking for any_errors_fatal 44071 1727204713.44450: done checking for any_errors_fatal 44071 1727204713.44451: checking for max_fail_percentage 44071 1727204713.44453: done checking for max_fail_percentage 44071 1727204713.44454: checking to see if all hosts have failed and the running result is not ok 44071 1727204713.44455: done checking to see if all hosts have failed 44071 1727204713.44456: getting the remaining hosts for this loop 44071 1727204713.44458: done getting the remaining hosts for this loop 44071 1727204713.44463: getting the next task for host managed-node2 44071 1727204713.44475: done getting next task for host managed-node2 44071 1727204713.44480: ^ task is: TASK: Conditional asserts 44071 1727204713.44484: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204713.44490: getting variables 44071 1727204713.44492: in VariableManager get_vars() 44071 1727204713.44544: Calling all_inventory to load vars for managed-node2 44071 1727204713.44548: Calling groups_inventory to load vars for managed-node2 44071 1727204713.44554: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204713.44772: Calling all_plugins_play to load vars for managed-node2 44071 1727204713.44776: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204713.44781: Calling groups_plugins_play to load vars for managed-node2 44071 1727204713.46656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204713.48938: done with get_vars() 44071 1727204713.48982: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Tuesday 24 September 2024 15:05:13 -0400 (0:00:00.079) 0:02:05.807 ***** 44071 1727204713.49092: entering _queue_task() for managed-node2/include_tasks 44071 1727204713.49557: worker is 1 (out of 1 available) 44071 1727204713.49776: exiting _queue_task() for managed-node2/include_tasks 44071 1727204713.49790: done queuing things up, now waiting for results queue to drain 44071 1727204713.49792: waiting for pending results... 44071 1727204713.49942: running TaskExecutor() for managed-node2/TASK: Conditional asserts 44071 1727204713.50100: in run() - task 127b8e07-fff9-c964-7471-00000000174a 44071 1727204713.50241: variable 'ansible_search_path' from source: unknown 44071 1727204713.50245: variable 'ansible_search_path' from source: unknown 44071 1727204713.50504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204713.52846: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204713.52903: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204713.52932: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204713.52963: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204713.52993: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204713.53112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204713.53116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204713.53240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204713.53243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204713.53246: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204713.53389: variable 'lsr_assert_when' from source: include params 44071 1727204713.53443: variable 'network_provider' from source: set_fact 44071 1727204713.53557: variable 'omit' from source: magic vars 44071 1727204713.53695: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204713.53737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204713.53741: variable 'omit' from source: magic vars 44071 1727204713.53991: variable 'ansible_distribution_major_version' from source: facts 44071 1727204713.54007: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204713.54221: variable 'item' from source: unknown 44071 1727204713.54227: Evaluated conditional (item['condition']): True 44071 1727204713.54349: variable 'item' from source: unknown 44071 1727204713.54393: variable 'item' from source: unknown 44071 1727204713.54453: variable 'item' from source: unknown 44071 1727204713.54624: dumping result to json 44071 1727204713.54628: done dumping result, returning 44071 1727204713.54630: done running TaskExecutor() for managed-node2/TASK: Conditional asserts [127b8e07-fff9-c964-7471-00000000174a] 44071 1727204713.54632: sending task result for task 127b8e07-fff9-c964-7471-00000000174a 44071 1727204713.54712: no more pending results, returning what we have 44071 1727204713.54717: in VariableManager get_vars() 44071 1727204713.54777: Calling all_inventory to load vars for managed-node2 44071 1727204713.54780: Calling groups_inventory to load vars for managed-node2 44071 1727204713.54783: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204713.54789: done sending task result for task 127b8e07-fff9-c964-7471-00000000174a 44071 1727204713.54792: WORKER PROCESS EXITING 44071 1727204713.54803: Calling all_plugins_play to load vars for managed-node2 44071 1727204713.54805: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204713.54808: Calling groups_plugins_play to load vars for managed-node2 44071 1727204713.56001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204713.57791: done with get_vars() 44071 1727204713.57827: variable 'ansible_search_path' from source: unknown 44071 1727204713.57829: variable 'ansible_search_path' from source: unknown 44071 1727204713.57882: we have included files to process 44071 1727204713.57883: generating all_blocks data 44071 1727204713.57885: done generating all_blocks data 44071 1727204713.57891: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 44071 1727204713.57892: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 44071 1727204713.57895: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 44071 1727204713.58026: in VariableManager get_vars() 44071 1727204713.58053: done with get_vars() 44071 1727204713.58188: done processing included file 44071 1727204713.58190: iterating over new_blocks loaded from include file 44071 1727204713.58191: in VariableManager get_vars() 44071 1727204713.58210: done 
with get_vars() 44071 1727204713.58212: filtering new block on tags 44071 1727204713.58253: done filtering new block on tags 44071 1727204713.58256: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node2 => (item={'what': 'tasks/assert_device_absent.yml', 'condition': True}) 44071 1727204713.58262: extending task lists for all hosts with included blocks 44071 1727204713.59078: done extending task lists 44071 1727204713.59079: done processing included files 44071 1727204713.59080: results queue empty 44071 1727204713.59080: checking for any_errors_fatal 44071 1727204713.59084: done checking for any_errors_fatal 44071 1727204713.59085: checking for max_fail_percentage 44071 1727204713.59086: done checking for max_fail_percentage 44071 1727204713.59087: checking to see if all hosts have failed and the running result is not ok 44071 1727204713.59087: done checking to see if all hosts have failed 44071 1727204713.59088: getting the remaining hosts for this loop 44071 1727204713.59089: done getting the remaining hosts for this loop 44071 1727204713.59091: getting the next task for host managed-node2 44071 1727204713.59094: done getting next task for host managed-node2 44071 1727204713.59096: ^ task is: TASK: Include the task 'get_interface_stat.yml' 44071 1727204713.59099: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204713.59106: getting variables 44071 1727204713.59107: in VariableManager get_vars() 44071 1727204713.59120: Calling all_inventory to load vars for managed-node2 44071 1727204713.59122: Calling groups_inventory to load vars for managed-node2 44071 1727204713.59124: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204713.59129: Calling all_plugins_play to load vars for managed-node2 44071 1727204713.59131: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204713.59135: Calling groups_plugins_play to load vars for managed-node2 44071 1727204713.60215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204713.62291: done with get_vars() 44071 1727204713.62341: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 15:05:13 -0400 (0:00:00.133) 0:02:05.940 ***** 44071 1727204713.62408: entering _queue_task() for managed-node2/include_tasks 44071 1727204713.62723: worker is 1 (out of 1 available) 44071 1727204713.62743: exiting _queue_task() for managed-node2/include_tasks 44071 1727204713.62758: done queuing things up, now waiting for results queue to drain 44071 1727204713.62760: waiting for pending results... 44071 1727204713.62960: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 44071 1727204713.63072: in run() - task 127b8e07-fff9-c964-7471-000000001f59 44071 1727204713.63088: variable 'ansible_search_path' from source: unknown 44071 1727204713.63094: variable 'ansible_search_path' from source: unknown 44071 1727204713.63127: calling self._execute() 44071 1727204713.63217: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204713.63224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204713.63234: variable 'omit' from source: magic vars 44071 1727204713.63562: variable 'ansible_distribution_major_version' from source: facts 44071 1727204713.63574: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204713.63580: _execute() done 44071 1727204713.63584: dumping result to json 44071 1727204713.63588: done dumping result, returning 44071 1727204713.63594: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [127b8e07-fff9-c964-7471-000000001f59] 44071 1727204713.63600: sending task result for task 127b8e07-fff9-c964-7471-000000001f59 44071 1727204713.63704: done sending task result for task 127b8e07-fff9-c964-7471-000000001f59 44071 1727204713.63707: WORKER PROCESS EXITING 44071 1727204713.63741: no more pending results, returning what we have 44071 1727204713.63746: in VariableManager get_vars() 44071 1727204713.63801: Calling all_inventory to load vars for managed-node2 44071 1727204713.63804: Calling groups_inventory to load vars for managed-node2 44071 1727204713.63808: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204713.63825: Calling all_plugins_play to load vars for managed-node2 44071 1727204713.63828: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204713.63831: Calling groups_plugins_play to load vars for managed-node2 44071 1727204713.65511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 44071 1727204713.66859: done with get_vars() 44071 1727204713.66881: variable 'ansible_search_path' from source: unknown 44071 1727204713.66883: variable 'ansible_search_path' from source: unknown 44071 1727204713.66999: variable 'item' from source: include params 44071 1727204713.67035: we have included files to process 44071 1727204713.67036: generating all_blocks data 44071 1727204713.67037: done generating all_blocks data 44071 1727204713.67039: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44071 1727204713.67040: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44071 1727204713.67042: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44071 1727204713.67191: done processing included file 44071 1727204713.67193: iterating over new_blocks loaded from include file 44071 1727204713.67194: in VariableManager get_vars() 44071 1727204713.67209: done with get_vars() 44071 1727204713.67210: filtering new block on tags 44071 1727204713.67229: done filtering new block on tags 44071 1727204713.67230: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 44071 1727204713.67236: extending task lists for all hosts with included blocks 44071 1727204713.67347: done extending task lists 44071 1727204713.67348: done processing included files 44071 1727204713.67349: results queue empty 44071 1727204713.67350: checking for any_errors_fatal 44071 1727204713.67354: done checking for any_errors_fatal 44071 1727204713.67355: checking for max_fail_percentage 44071 1727204713.67355: done checking for max_fail_percentage 44071 1727204713.67356: checking to see if all hosts have failed and the running result is not ok 44071 1727204713.67357: done checking to see if all hosts have failed 44071 1727204713.67357: getting the remaining hosts for this loop 44071 1727204713.67358: done getting the remaining hosts for this loop 44071 1727204713.67360: getting the next task for host managed-node2 44071 1727204713.67364: done getting next task for host managed-node2 44071 1727204713.67367: ^ task is: TASK: Get stat for interface {{ interface }} 44071 1727204713.67370: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204713.67371: getting variables 44071 1727204713.67372: in VariableManager get_vars() 44071 1727204713.67381: Calling all_inventory to load vars for managed-node2 44071 1727204713.67383: Calling groups_inventory to load vars for managed-node2 44071 1727204713.67384: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204713.67389: Calling all_plugins_play to load vars for managed-node2 44071 1727204713.67390: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204713.67392: Calling groups_plugins_play to load vars for managed-node2 44071 1727204713.68268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204713.69476: done with get_vars() 44071 1727204713.69505: done getting variables 44071 1727204713.69616: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:05:13 -0400 (0:00:00.072) 0:02:06.012 ***** 44071 1727204713.69646: entering _queue_task() for managed-node2/stat 44071 1727204713.69960: worker is 1 (out of 1 available) 44071 1727204713.69979: exiting _queue_task() for managed-node2/stat 44071 1727204713.69995: done queuing things up, now waiting for results queue to drain 44071 1727204713.69997: waiting for pending results... 44071 1727204713.70213: running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr 44071 1727204713.70308: in run() - task 127b8e07-fff9-c964-7471-000000001fe8 44071 1727204713.70322: variable 'ansible_search_path' from source: unknown 44071 1727204713.70325: variable 'ansible_search_path' from source: unknown 44071 1727204713.70362: calling self._execute() 44071 1727204713.70442: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204713.70456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204713.70464: variable 'omit' from source: magic vars 44071 1727204713.70793: variable 'ansible_distribution_major_version' from source: facts 44071 1727204713.70804: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204713.70810: variable 'omit' from source: magic vars 44071 1727204713.70855: variable 'omit' from source: magic vars 44071 1727204713.70933: variable 'interface' from source: play vars 44071 1727204713.70950: variable 'omit' from source: magic vars 44071 1727204713.70994: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204713.71021: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204713.71043: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204713.71058: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204713.71070: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204713.71097: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204713.71100: variable 'ansible_host' from 
source: host vars for 'managed-node2' 44071 1727204713.71103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204713.71183: Set connection var ansible_connection to ssh 44071 1727204713.71189: Set connection var ansible_timeout to 10 44071 1727204713.71195: Set connection var ansible_pipelining to False 44071 1727204713.71200: Set connection var ansible_shell_type to sh 44071 1727204713.71206: Set connection var ansible_shell_executable to /bin/sh 44071 1727204713.71214: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204713.71236: variable 'ansible_shell_executable' from source: unknown 44071 1727204713.71240: variable 'ansible_connection' from source: unknown 44071 1727204713.71243: variable 'ansible_module_compression' from source: unknown 44071 1727204713.71245: variable 'ansible_shell_type' from source: unknown 44071 1727204713.71248: variable 'ansible_shell_executable' from source: unknown 44071 1727204713.71251: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204713.71256: variable 'ansible_pipelining' from source: unknown 44071 1727204713.71259: variable 'ansible_timeout' from source: unknown 44071 1727204713.71263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204713.71437: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204713.71450: variable 'omit' from source: magic vars 44071 1727204713.71455: starting attempt loop 44071 1727204713.71457: running the handler 44071 1727204713.71474: _low_level_execute_command(): starting 44071 1727204713.71481: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204713.72057: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204713.72061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44071 1727204713.72064: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204713.72070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204713.72124: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204713.72127: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204713.72214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204713.73978: stdout chunk (state=3): >>>/root <<< 44071 1727204713.74089: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 44071 1727204713.74155: stderr chunk (state=3): >>><<< 44071 1727204713.74159: stdout chunk (state=3): >>><<< 44071 1727204713.74185: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204713.74198: _low_level_execute_command(): starting 44071 1727204713.74204: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204713.741848-51067-88258226862808 `" && echo ansible-tmp-1727204713.741848-51067-88258226862808="` echo /root/.ansible/tmp/ansible-tmp-1727204713.741848-51067-88258226862808 `" ) && sleep 0' 44071 1727204713.74714: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204713.74718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204713.74721: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204713.74731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204713.74780: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204713.74784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204713.74863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204713.76845: stdout chunk (state=3): >>>ansible-tmp-1727204713.741848-51067-88258226862808=/root/.ansible/tmp/ansible-tmp-1727204713.741848-51067-88258226862808 <<< 44071 1727204713.76955: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204713.77020: stderr chunk (state=3): >>><<< 44071 1727204713.77023: stdout chunk (state=3): >>><<< 44071 1727204713.77041: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204713.741848-51067-88258226862808=/root/.ansible/tmp/ansible-tmp-1727204713.741848-51067-88258226862808 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204713.77093: variable 'ansible_module_compression' from source: unknown 44071 1727204713.77140: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 44071 1727204713.77179: variable 'ansible_facts' from source: unknown 44071 1727204713.77244: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204713.741848-51067-88258226862808/AnsiballZ_stat.py 44071 1727204713.77364: Sending initial data 44071 1727204713.77371: Sent initial data (151 bytes) 44071 1727204713.77861: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204713.77888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204713.77891: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204713.77894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204713.77952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204713.77955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204713.78030: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 44071 1727204713.79629: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204713.79698: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204713.79767: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpu3zfy1av /root/.ansible/tmp/ansible-tmp-1727204713.741848-51067-88258226862808/AnsiballZ_stat.py <<< 44071 1727204713.79771: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204713.741848-51067-88258226862808/AnsiballZ_stat.py" <<< 44071 1727204713.79834: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpu3zfy1av" to remote "/root/.ansible/tmp/ansible-tmp-1727204713.741848-51067-88258226862808/AnsiballZ_stat.py" <<< 44071 1727204713.79841: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204713.741848-51067-88258226862808/AnsiballZ_stat.py" <<< 44071 1727204713.80492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204713.80572: stderr chunk (state=3): >>><<< 44071 1727204713.80576: stdout chunk (state=3): >>><<< 44071 1727204713.80600: done transferring module to remote 44071 1727204713.80611: _low_level_execute_command(): starting 44071 1727204713.80616: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204713.741848-51067-88258226862808/ /root/.ansible/tmp/ansible-tmp-1727204713.741848-51067-88258226862808/AnsiballZ_stat.py && sleep 0' 44071 1727204713.81118: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204713.81122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204713.81125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204713.81127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204713.81129: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204713.81179: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204713.81182: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204713.81260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204713.83077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204713.83141: stderr chunk (state=3): >>><<< 44071 1727204713.83146: stdout chunk (state=3): >>><<< 44071 1727204713.83159: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204713.83163: _low_level_execute_command(): starting 44071 1727204713.83168: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204713.741848-51067-88258226862808/AnsiballZ_stat.py && sleep 0' 44071 1727204713.83654: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204713.83658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204713.83691: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204713.83694: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204713.83697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204713.83699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204713.83760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204713.83763: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204713.83772: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204713.83845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204714.00550: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 44071 1727204714.01838: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204714.01842: stdout chunk (state=3): >>><<< 44071 1727204714.01845: stderr chunk (state=3): >>><<< 44071 1727204714.01995: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
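[Editor's note] The module invocation echoed in the stdout chunk above records the exact arguments the stat task ran with (path /sys/class/net/statebr, attributes, checksum and mime detection disabled). A hedged reconstruction of the corresponding task in get_interface_stat.yml, assuming it registers its result as interface_stat (the name the later assert reads), would be:

  # Sketch (assumed, not verbatim): tasks/get_interface_stat.yml
  - name: "Get stat for interface {{ interface }}"
    stat:
      path: "/sys/class/net/{{ interface }}"   # /sys/class/net/statebr in this run
      get_attributes: false
      get_checksum: false
      get_mime: false
    register: interface_stat    # register name assumed from the later 'interface_stat.stat.exists' check

Because /sys/class/net/statebr does not exist on managed-node2, the module returns "exists": false, which is exactly the state the absence assertion expects.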
44071 1727204714.02000: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204713.741848-51067-88258226862808/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204714.02002: _low_level_execute_command(): starting 44071 1727204714.02005: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204713.741848-51067-88258226862808/ > /dev/null 2>&1 && sleep 0' 44071 1727204714.02643: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204714.02681: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204714.02787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204714.02822: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204714.02940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204714.05179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204714.05184: stdout chunk (state=3): >>><<< 44071 1727204714.05186: stderr chunk (state=3): >>><<< 44071 1727204714.05189: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204714.05192: handler run complete 44071 1727204714.05195: attempt loop complete, returning result 44071 1727204714.05197: _execute() done 44071 1727204714.05199: dumping result to json 44071 1727204714.05201: done dumping result, returning 44071 1727204714.05203: done running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr [127b8e07-fff9-c964-7471-000000001fe8] 44071 1727204714.05205: sending task result for task 127b8e07-fff9-c964-7471-000000001fe8 44071 1727204714.05368: done sending task result for task 127b8e07-fff9-c964-7471-000000001fe8 44071 1727204714.05372: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 44071 1727204714.05462: no more pending results, returning what we have 44071 1727204714.05467: results queue empty 44071 1727204714.05468: checking for any_errors_fatal 44071 1727204714.05470: done checking for any_errors_fatal 44071 1727204714.05471: checking for max_fail_percentage 44071 1727204714.05473: done checking for max_fail_percentage 44071 1727204714.05474: checking to see if all hosts have failed and the running result is not ok 44071 1727204714.05475: done checking to see if all hosts have failed 44071 1727204714.05476: getting the remaining hosts for this loop 44071 1727204714.05478: done getting the remaining hosts for this loop 44071 1727204714.05483: getting the next task for host managed-node2 44071 1727204714.05671: done getting next task for host managed-node2 44071 1727204714.05676: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 44071 1727204714.05681: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204714.05688: getting variables 44071 1727204714.05689: in VariableManager get_vars() 44071 1727204714.05740: Calling all_inventory to load vars for managed-node2 44071 1727204714.05743: Calling groups_inventory to load vars for managed-node2 44071 1727204714.05747: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204714.05760: Calling all_plugins_play to load vars for managed-node2 44071 1727204714.05763: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204714.05784: Calling groups_plugins_play to load vars for managed-node2 44071 1727204714.07783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204714.10352: done with get_vars() 44071 1727204714.10407: done getting variables 44071 1727204714.10489: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204714.10641: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 15:05:14 -0400 (0:00:00.410) 0:02:06.423 ***** 44071 1727204714.10682: entering _queue_task() for managed-node2/assert 44071 1727204714.11129: worker is 1 (out of 1 available) 44071 1727204714.11260: exiting _queue_task() for managed-node2/assert 44071 1727204714.11274: done queuing things up, now waiting for results queue to drain 44071 1727204714.11277: waiting for pending results... 
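[Editor's note] The include traced earlier (assert_device_absent.yml:3) and the assert queued just above (assert_device_absent.yml:5) fit together as a two-step check: gather the interface stat, then assert on it. The following sketch is inferred from the task names, file paths, and the 'not interface_stat.stat.exists' conditional visible in the log, not copied from the collection; the message text is assumed.

  # Sketch (assumed, not verbatim): tasks/assert_device_absent.yml
  - name: Include the task 'get_interface_stat.yml'
    include_tasks: get_interface_stat.yml

  - name: "Assert that the interface is absent - '{{ interface }}'"
    assert:
      that:
        - not interface_stat.stat.exists
      msg: "Interface '{{ interface }}' still exists"    # wording assumed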
44071 1727204714.11717: running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'statebr' 44071 1727204714.11723: in run() - task 127b8e07-fff9-c964-7471-000000001f5a 44071 1727204714.11726: variable 'ansible_search_path' from source: unknown 44071 1727204714.11729: variable 'ansible_search_path' from source: unknown 44071 1727204714.11785: calling self._execute() 44071 1727204714.11922: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204714.11938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204714.12030: variable 'omit' from source: magic vars 44071 1727204714.12610: variable 'ansible_distribution_major_version' from source: facts 44071 1727204714.12691: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204714.12703: variable 'omit' from source: magic vars 44071 1727204714.12772: variable 'omit' from source: magic vars 44071 1727204714.12941: variable 'interface' from source: play vars 44071 1727204714.12973: variable 'omit' from source: magic vars 44071 1727204714.13039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204714.13092: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204714.13127: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204714.13159: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204714.13179: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204714.13261: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204714.13266: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204714.13271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204714.13387: Set connection var ansible_connection to ssh 44071 1727204714.13390: Set connection var ansible_timeout to 10 44071 1727204714.13393: Set connection var ansible_pipelining to False 44071 1727204714.13446: Set connection var ansible_shell_type to sh 44071 1727204714.13450: Set connection var ansible_shell_executable to /bin/sh 44071 1727204714.13453: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204714.13469: variable 'ansible_shell_executable' from source: unknown 44071 1727204714.13481: variable 'ansible_connection' from source: unknown 44071 1727204714.13496: variable 'ansible_module_compression' from source: unknown 44071 1727204714.13505: variable 'ansible_shell_type' from source: unknown 44071 1727204714.13514: variable 'ansible_shell_executable' from source: unknown 44071 1727204714.13523: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204714.13554: variable 'ansible_pipelining' from source: unknown 44071 1727204714.13557: variable 'ansible_timeout' from source: unknown 44071 1727204714.13562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204714.13718: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 44071 1727204714.13772: variable 'omit' from source: magic vars 44071 1727204714.13778: starting attempt loop 44071 1727204714.13781: running the handler 44071 1727204714.13977: variable 'interface_stat' from source: set_fact 44071 1727204714.14000: Evaluated conditional (not interface_stat.stat.exists): True 44071 1727204714.14012: handler run complete 44071 1727204714.14101: attempt loop complete, returning result 44071 1727204714.14105: _execute() done 44071 1727204714.14107: dumping result to json 44071 1727204714.14110: done dumping result, returning 44071 1727204714.14112: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'statebr' [127b8e07-fff9-c964-7471-000000001f5a] 44071 1727204714.14115: sending task result for task 127b8e07-fff9-c964-7471-000000001f5a ok: [managed-node2] => { "changed": false } MSG: All assertions passed 44071 1727204714.14283: no more pending results, returning what we have 44071 1727204714.14288: results queue empty 44071 1727204714.14289: checking for any_errors_fatal 44071 1727204714.14300: done checking for any_errors_fatal 44071 1727204714.14301: checking for max_fail_percentage 44071 1727204714.14302: done checking for max_fail_percentage 44071 1727204714.14304: checking to see if all hosts have failed and the running result is not ok 44071 1727204714.14304: done checking to see if all hosts have failed 44071 1727204714.14305: getting the remaining hosts for this loop 44071 1727204714.14307: done getting the remaining hosts for this loop 44071 1727204714.14313: getting the next task for host managed-node2 44071 1727204714.14327: done getting next task for host managed-node2 44071 1727204714.14330: ^ task is: TASK: Success in test '{{ lsr_description }}' 44071 1727204714.14337: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204714.14344: getting variables 44071 1727204714.14346: in VariableManager get_vars() 44071 1727204714.14406: Calling all_inventory to load vars for managed-node2 44071 1727204714.14409: Calling groups_inventory to load vars for managed-node2 44071 1727204714.14414: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204714.14430: Calling all_plugins_play to load vars for managed-node2 44071 1727204714.14436: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204714.14440: Calling groups_plugins_play to load vars for managed-node2 44071 1727204714.15236: done sending task result for task 127b8e07-fff9-c964-7471-000000001f5a 44071 1727204714.15241: WORKER PROCESS EXITING 44071 1727204714.17099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204714.25437: done with get_vars() 44071 1727204714.25467: done getting variables 44071 1727204714.25507: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204714.25589: variable 'lsr_description' from source: include params TASK [Success in test 'I can take a profile down that is absent'] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Tuesday 24 September 2024 15:05:14 -0400 (0:00:00.149) 0:02:06.572 ***** 44071 1727204714.25610: entering _queue_task() for managed-node2/debug 44071 1727204714.25922: worker is 1 (out of 1 available) 44071 1727204714.25939: exiting _queue_task() for managed-node2/debug 44071 1727204714.25954: done queuing things up, now waiting for results queue to drain 44071 1727204714.25956: waiting for pending results... 
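[Annotation] The entries above show the task "Assert that the interface is absent - 'statebr'" passing: the conditional not interface_stat.stat.exists evaluated to True against the stat result registered earlier, and the assert action returned its default "All assertions passed" message. The task file itself is not reproduced in this log; a minimal sketch of what such an assertion typically looks like (task name and variable names taken from the log, everything else assumed) is:

  - name: "Assert that the interface is absent - '{{ interface }}'"
    assert:
      that:
        - not interface_stat.stat.exists

With interface set to 'statebr' from the play vars, this matches the task name and the conditional recorded above.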
44071 1727204714.26172: running TaskExecutor() for managed-node2/TASK: Success in test 'I can take a profile down that is absent' 44071 1727204714.26265: in run() - task 127b8e07-fff9-c964-7471-00000000174b 44071 1727204714.26279: variable 'ansible_search_path' from source: unknown 44071 1727204714.26286: variable 'ansible_search_path' from source: unknown 44071 1727204714.26323: calling self._execute() 44071 1727204714.26416: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204714.26422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204714.26429: variable 'omit' from source: magic vars 44071 1727204714.26767: variable 'ansible_distribution_major_version' from source: facts 44071 1727204714.26779: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204714.26786: variable 'omit' from source: magic vars 44071 1727204714.26817: variable 'omit' from source: magic vars 44071 1727204714.26930: variable 'lsr_description' from source: include params 44071 1727204714.26953: variable 'omit' from source: magic vars 44071 1727204714.27002: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204714.27055: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204714.27064: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204714.27099: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204714.27106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204714.27174: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204714.27183: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204714.27187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204714.27270: Set connection var ansible_connection to ssh 44071 1727204714.27470: Set connection var ansible_timeout to 10 44071 1727204714.27474: Set connection var ansible_pipelining to False 44071 1727204714.27476: Set connection var ansible_shell_type to sh 44071 1727204714.27479: Set connection var ansible_shell_executable to /bin/sh 44071 1727204714.27485: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204714.27488: variable 'ansible_shell_executable' from source: unknown 44071 1727204714.27490: variable 'ansible_connection' from source: unknown 44071 1727204714.27492: variable 'ansible_module_compression' from source: unknown 44071 1727204714.27494: variable 'ansible_shell_type' from source: unknown 44071 1727204714.27496: variable 'ansible_shell_executable' from source: unknown 44071 1727204714.27498: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204714.27500: variable 'ansible_pipelining' from source: unknown 44071 1727204714.27503: variable 'ansible_timeout' from source: unknown 44071 1727204714.27505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204714.27573: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 44071 1727204714.27598: variable 'omit' from source: magic vars 44071 1727204714.27607: starting attempt loop 44071 1727204714.27613: running the handler 44071 1727204714.27677: handler run complete 44071 1727204714.27700: attempt loop complete, returning result 44071 1727204714.27713: _execute() done 44071 1727204714.27722: dumping result to json 44071 1727204714.27729: done dumping result, returning 44071 1727204714.27745: done running TaskExecutor() for managed-node2/TASK: Success in test 'I can take a profile down that is absent' [127b8e07-fff9-c964-7471-00000000174b] 44071 1727204714.27763: sending task result for task 127b8e07-fff9-c964-7471-00000000174b ok: [managed-node2] => {} MSG: +++++ Success in test 'I can take a profile down that is absent' +++++ 44071 1727204714.27946: no more pending results, returning what we have 44071 1727204714.27950: results queue empty 44071 1727204714.27951: checking for any_errors_fatal 44071 1727204714.27962: done checking for any_errors_fatal 44071 1727204714.27962: checking for max_fail_percentage 44071 1727204714.27964: done checking for max_fail_percentage 44071 1727204714.27989: checking to see if all hosts have failed and the running result is not ok 44071 1727204714.27991: done checking to see if all hosts have failed 44071 1727204714.27991: getting the remaining hosts for this loop 44071 1727204714.27994: done getting the remaining hosts for this loop 44071 1727204714.27999: getting the next task for host managed-node2 44071 1727204714.28007: done getting next task for host managed-node2 44071 1727204714.28011: ^ task is: TASK: Cleanup 44071 1727204714.28013: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204714.28018: getting variables 44071 1727204714.28020: in VariableManager get_vars() 44071 1727204714.28093: Calling all_inventory to load vars for managed-node2 44071 1727204714.28097: Calling groups_inventory to load vars for managed-node2 44071 1727204714.28101: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204714.28107: done sending task result for task 127b8e07-fff9-c964-7471-00000000174b 44071 1727204714.28110: WORKER PROCESS EXITING 44071 1727204714.28122: Calling all_plugins_play to load vars for managed-node2 44071 1727204714.28125: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204714.28128: Calling groups_plugins_play to load vars for managed-node2 44071 1727204714.29729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204714.30954: done with get_vars() 44071 1727204714.30986: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Tuesday 24 September 2024 15:05:14 -0400 (0:00:00.054) 0:02:06.627 ***** 44071 1727204714.31070: entering _queue_task() for managed-node2/include_tasks 44071 1727204714.31382: worker is 1 (out of 1 available) 44071 1727204714.31399: exiting _queue_task() for managed-node2/include_tasks 44071 1727204714.31413: done queuing things up, now waiting for results queue to drain 44071 1727204714.31415: waiting for pending results... 44071 1727204714.31751: running TaskExecutor() for managed-node2/TASK: Cleanup 44071 1727204714.31973: in run() - task 127b8e07-fff9-c964-7471-00000000174f 44071 1727204714.31979: variable 'ansible_search_path' from source: unknown 44071 1727204714.31982: variable 'ansible_search_path' from source: unknown 44071 1727204714.31985: variable 'lsr_cleanup' from source: include params 44071 1727204714.32132: variable 'lsr_cleanup' from source: include params 44071 1727204714.32216: variable 'omit' from source: magic vars 44071 1727204714.32377: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204714.32435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204714.32438: variable 'omit' from source: magic vars 44071 1727204714.32699: variable 'ansible_distribution_major_version' from source: facts 44071 1727204714.32712: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204714.32719: variable 'item' from source: unknown 44071 1727204714.32801: variable 'item' from source: unknown 44071 1727204714.32836: variable 'item' from source: unknown 44071 1727204714.32898: variable 'item' from source: unknown 44071 1727204714.33043: dumping result to json 44071 1727204714.33046: done dumping result, returning 44071 1727204714.33048: done running TaskExecutor() for managed-node2/TASK: Cleanup [127b8e07-fff9-c964-7471-00000000174f] 44071 1727204714.33050: sending task result for task 127b8e07-fff9-c964-7471-00000000174f 44071 1727204714.33122: no more pending results, returning what we have 44071 1727204714.33128: in VariableManager get_vars() 44071 1727204714.33181: Calling all_inventory to load vars for managed-node2 44071 1727204714.33184: Calling groups_inventory to load vars for managed-node2 44071 1727204714.33195: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204714.33201: done sending task result for task 
127b8e07-fff9-c964-7471-00000000174f 44071 1727204714.33203: WORKER PROCESS EXITING 44071 1727204714.33229: Calling all_plugins_play to load vars for managed-node2 44071 1727204714.33232: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204714.33236: Calling groups_plugins_play to load vars for managed-node2 44071 1727204714.34462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204714.35675: done with get_vars() 44071 1727204714.35703: variable 'ansible_search_path' from source: unknown 44071 1727204714.35705: variable 'ansible_search_path' from source: unknown 44071 1727204714.35740: we have included files to process 44071 1727204714.35741: generating all_blocks data 44071 1727204714.35742: done generating all_blocks data 44071 1727204714.35746: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 44071 1727204714.35747: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 44071 1727204714.35749: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 44071 1727204714.35907: done processing included file 44071 1727204714.35909: iterating over new_blocks loaded from include file 44071 1727204714.35911: in VariableManager get_vars() 44071 1727204714.35927: done with get_vars() 44071 1727204714.35928: filtering new block on tags 44071 1727204714.35947: done filtering new block on tags 44071 1727204714.35949: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed-node2 => (item=tasks/cleanup_profile+device.yml) 44071 1727204714.35953: extending task lists for all hosts with included blocks 44071 1727204714.36853: done extending task lists 44071 1727204714.36854: done processing included files 44071 1727204714.36855: results queue empty 44071 1727204714.36855: checking for any_errors_fatal 44071 1727204714.36858: done checking for any_errors_fatal 44071 1727204714.36859: checking for max_fail_percentage 44071 1727204714.36860: done checking for max_fail_percentage 44071 1727204714.36860: checking to see if all hosts have failed and the running result is not ok 44071 1727204714.36861: done checking to see if all hosts have failed 44071 1727204714.36861: getting the remaining hosts for this loop 44071 1727204714.36862: done getting the remaining hosts for this loop 44071 1727204714.36864: getting the next task for host managed-node2 44071 1727204714.36870: done getting next task for host managed-node2 44071 1727204714.36872: ^ task is: TASK: Cleanup profile and device 44071 1727204714.36874: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204714.36876: getting variables 44071 1727204714.36877: in VariableManager get_vars() 44071 1727204714.36891: Calling all_inventory to load vars for managed-node2 44071 1727204714.36893: Calling groups_inventory to load vars for managed-node2 44071 1727204714.36894: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204714.36900: Calling all_plugins_play to load vars for managed-node2 44071 1727204714.36902: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204714.36904: Calling groups_plugins_play to load vars for managed-node2 44071 1727204714.37926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204714.39120: done with get_vars() 44071 1727204714.39152: done getting variables 44071 1727204714.39194: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Tuesday 24 September 2024 15:05:14 -0400 (0:00:00.081) 0:02:06.708 ***** 44071 1727204714.39222: entering _queue_task() for managed-node2/shell 44071 1727204714.39539: worker is 1 (out of 1 available) 44071 1727204714.39555: exiting _queue_task() for managed-node2/shell 44071 1727204714.39571: done queuing things up, now waiting for results queue to drain 44071 1727204714.39573: waiting for pending results... 
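[Annotation] The "Cleanup" task at run_test.yml:66 is an include_tasks that iterates over lsr_cleanup (taken from include params); in this run the list resolves to tasks/cleanup_profile+device.yml, which is loaded and its block appended to the task list for managed-node2. The exact wording of run_test.yml is not in this log; given the repeated 'item' variable lookups above, a sketch of an include of this shape would be roughly:

  - name: Cleanup
    include_tasks: "{{ item }}"
    loop: "{{ lsr_cleanup }}"

The distribution-version conditional evaluated just before the include (ansible_distribution_major_version != '6') recurs on every task in this log and may be inherited from an enclosing block rather than set on this task.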
44071 1727204714.39790: running TaskExecutor() for managed-node2/TASK: Cleanup profile and device 44071 1727204714.39889: in run() - task 127b8e07-fff9-c964-7471-00000000200b 44071 1727204714.39902: variable 'ansible_search_path' from source: unknown 44071 1727204714.39908: variable 'ansible_search_path' from source: unknown 44071 1727204714.39946: calling self._execute() 44071 1727204714.40037: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204714.40045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204714.40055: variable 'omit' from source: magic vars 44071 1727204714.40402: variable 'ansible_distribution_major_version' from source: facts 44071 1727204714.40413: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204714.40420: variable 'omit' from source: magic vars 44071 1727204714.40464: variable 'omit' from source: magic vars 44071 1727204714.40586: variable 'interface' from source: play vars 44071 1727204714.40604: variable 'omit' from source: magic vars 44071 1727204714.40646: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204714.40680: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204714.40702: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204714.40717: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204714.40730: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204714.40755: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204714.40759: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204714.40762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204714.40847: Set connection var ansible_connection to ssh 44071 1727204714.40851: Set connection var ansible_timeout to 10 44071 1727204714.40857: Set connection var ansible_pipelining to False 44071 1727204714.40863: Set connection var ansible_shell_type to sh 44071 1727204714.40870: Set connection var ansible_shell_executable to /bin/sh 44071 1727204714.40877: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204714.40898: variable 'ansible_shell_executable' from source: unknown 44071 1727204714.40902: variable 'ansible_connection' from source: unknown 44071 1727204714.40904: variable 'ansible_module_compression' from source: unknown 44071 1727204714.40909: variable 'ansible_shell_type' from source: unknown 44071 1727204714.40912: variable 'ansible_shell_executable' from source: unknown 44071 1727204714.40914: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204714.40917: variable 'ansible_pipelining' from source: unknown 44071 1727204714.40919: variable 'ansible_timeout' from source: unknown 44071 1727204714.40921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204714.41046: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 
1727204714.41054: variable 'omit' from source: magic vars 44071 1727204714.41059: starting attempt loop 44071 1727204714.41062: running the handler 44071 1727204714.41074: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204714.41090: _low_level_execute_command(): starting 44071 1727204714.41097: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204714.41689: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204714.41694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204714.41697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204714.41700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204714.41756: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204714.41764: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204714.41769: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204714.41844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204714.43617: stdout chunk (state=3): >>>/root <<< 44071 1727204714.43721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204714.43791: stderr chunk (state=3): >>><<< 44071 1727204714.43795: stdout chunk (state=3): >>><<< 44071 1727204714.43819: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204714.43836: _low_level_execute_command(): starting 44071 1727204714.43840: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204714.4381764-51095-31598409181834 `" && echo ansible-tmp-1727204714.4381764-51095-31598409181834="` echo /root/.ansible/tmp/ansible-tmp-1727204714.4381764-51095-31598409181834 `" ) && sleep 0' 44071 1727204714.44355: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204714.44369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204714.44373: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204714.44376: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204714.44379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204714.44434: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204714.44440: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204714.44442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204714.44508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204714.46505: stdout chunk (state=3): >>>ansible-tmp-1727204714.4381764-51095-31598409181834=/root/.ansible/tmp/ansible-tmp-1727204714.4381764-51095-31598409181834 <<< 44071 1727204714.46608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204714.46672: stderr chunk (state=3): >>><<< 44071 1727204714.46675: stdout chunk (state=3): >>><<< 44071 1727204714.46696: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204714.4381764-51095-31598409181834=/root/.ansible/tmp/ansible-tmp-1727204714.4381764-51095-31598409181834 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204714.46728: variable 'ansible_module_compression' from source: unknown 44071 1727204714.46784: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44071 1727204714.46821: variable 'ansible_facts' from source: unknown 44071 1727204714.46885: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204714.4381764-51095-31598409181834/AnsiballZ_command.py 44071 1727204714.47005: Sending initial data 44071 1727204714.47008: Sent initial data (155 bytes) 44071 1727204714.47521: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204714.47525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204714.47528: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204714.47530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204714.47582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204714.47605: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204714.47669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204714.49275: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204714.49338: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204714.49410: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpf4yit0wu /root/.ansible/tmp/ansible-tmp-1727204714.4381764-51095-31598409181834/AnsiballZ_command.py <<< 44071 1727204714.49414: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204714.4381764-51095-31598409181834/AnsiballZ_command.py" <<< 44071 1727204714.49480: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpf4yit0wu" to remote "/root/.ansible/tmp/ansible-tmp-1727204714.4381764-51095-31598409181834/AnsiballZ_command.py" <<< 44071 1727204714.49483: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204714.4381764-51095-31598409181834/AnsiballZ_command.py" <<< 44071 1727204714.50160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204714.50241: stderr chunk (state=3): >>><<< 44071 1727204714.50245: stdout chunk (state=3): >>><<< 44071 1727204714.50267: done transferring module to remote 44071 1727204714.50279: _low_level_execute_command(): starting 44071 1727204714.50285: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204714.4381764-51095-31598409181834/ /root/.ansible/tmp/ansible-tmp-1727204714.4381764-51095-31598409181834/AnsiballZ_command.py && sleep 0' 44071 1727204714.50761: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204714.50788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204714.50797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204714.50855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204714.50862: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204714.50928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204714.52743: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204714.52808: stderr chunk (state=3): >>><<< 44071 1727204714.52811: stdout chunk (state=3): >>><<< 44071 1727204714.52827: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204714.52832: _low_level_execute_command(): starting 44071 1727204714.52838: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204714.4381764-51095-31598409181834/AnsiballZ_command.py && sleep 0' 44071 1727204714.53349: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204714.53354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204714.53357: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204714.53359: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204714.53410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204714.53413: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204714.53416: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204714.53497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204714.73316: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCould not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-24 15:05:14.696606", "end": "2024-09-24 15:05:14.731394", "delta": "0:00:00.034788", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44071 1727204714.74998: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.47.73 closed. <<< 44071 1727204714.75002: stdout chunk (state=3): >>><<< 44071 1727204714.75004: stderr chunk (state=3): >>><<< 44071 1727204714.75027: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCould not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-24 15:05:14.696606", "end": "2024-09-24 15:05:14.731394", "delta": "0:00:00.034788", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.47.73 closed. 
44071 1727204714.75081: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204714.4381764-51095-31598409181834/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204714.75171: _low_level_execute_command(): starting 44071 1727204714.75175: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204714.4381764-51095-31598409181834/ > /dev/null 2>&1 && sleep 0' 44071 1727204714.75806: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204714.75828: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204714.75843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204714.75945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204714.75969: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204714.75987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204714.76012: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204714.76112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204714.78053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204714.78292: stderr chunk (state=3): >>><<< 44071 1727204714.78295: stdout chunk (state=3): >>><<< 44071 1727204714.78298: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204714.78301: handler run complete 44071 1727204714.78303: Evaluated conditional (False): False 44071 1727204714.78305: attempt loop complete, returning result 44071 1727204714.78308: _execute() done 44071 1727204714.78310: dumping result to json 44071 1727204714.78312: done dumping result, returning 44071 1727204714.78314: done running TaskExecutor() for managed-node2/TASK: Cleanup profile and device [127b8e07-fff9-c964-7471-00000000200b] 44071 1727204714.78316: sending task result for task 127b8e07-fff9-c964-7471-00000000200b 44071 1727204714.78398: done sending task result for task 127b8e07-fff9-c964-7471-00000000200b 44071 1727204714.78401: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.034788", "end": "2024-09-24 15:05:14.731394", "rc": 1, "start": "2024-09-24 15:05:14.696606" } STDERR: Error: unknown connection 'statebr'. Error: cannot delete unknown connection(s): 'statebr'. Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' Cannot find device "statebr" MSG: non-zero return code ...ignoring 44071 1727204714.78475: no more pending results, returning what we have 44071 1727204714.78479: results queue empty 44071 1727204714.78480: checking for any_errors_fatal 44071 1727204714.78481: done checking for any_errors_fatal 44071 1727204714.78482: checking for max_fail_percentage 44071 1727204714.78484: done checking for max_fail_percentage 44071 1727204714.78485: checking to see if all hosts have failed and the running result is not ok 44071 1727204714.78486: done checking to see if all hosts have failed 44071 1727204714.78486: getting the remaining hosts for this loop 44071 1727204714.78488: done getting the remaining hosts for this loop 44071 1727204714.78493: getting the next task for host managed-node2 44071 1727204714.78506: done getting next task for host managed-node2 44071 1727204714.78509: ^ task is: TASK: Include the task 'run_test.yml' 44071 1727204714.78511: ^ state is: HOST STATE: block=8, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204714.78514: getting variables 44071 1727204714.78516: in VariableManager get_vars() 44071 1727204714.78560: Calling all_inventory to load vars for managed-node2 44071 1727204714.78562: Calling groups_inventory to load vars for managed-node2 44071 1727204714.78769: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204714.78783: Calling all_plugins_play to load vars for managed-node2 44071 1727204714.78786: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204714.78788: Calling groups_plugins_play to load vars for managed-node2 44071 1727204714.80489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204714.82724: done with get_vars() 44071 1727204714.82768: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:124 Tuesday 24 September 2024 15:05:14 -0400 (0:00:00.436) 0:02:07.145 ***** 44071 1727204714.82880: entering _queue_task() for managed-node2/include_tasks 44071 1727204714.83288: worker is 1 (out of 1 available) 44071 1727204714.83304: exiting _queue_task() for managed-node2/include_tasks 44071 1727204714.83320: done queuing things up, now waiting for results queue to drain 44071 1727204714.83322: waiting for pending results... 44071 1727204714.83662: running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' 44071 1727204714.83783: in run() - task 127b8e07-fff9-c964-7471-000000000017 44071 1727204714.83811: variable 'ansible_search_path' from source: unknown 44071 1727204714.83903: calling self._execute() 44071 1727204714.83980: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204714.83994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204714.84013: variable 'omit' from source: magic vars 44071 1727204714.84474: variable 'ansible_distribution_major_version' from source: facts 44071 1727204714.84497: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204714.84509: _execute() done 44071 1727204714.84672: dumping result to json 44071 1727204714.84676: done dumping result, returning 44071 1727204714.84679: done running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' [127b8e07-fff9-c964-7471-000000000017] 44071 1727204714.84681: sending task result for task 127b8e07-fff9-c964-7471-000000000017 44071 1727204714.84783: done sending task result for task 127b8e07-fff9-c964-7471-000000000017 44071 1727204714.84787: WORKER PROCESS EXITING 44071 1727204714.84821: no more pending results, returning what we have 44071 1727204714.84827: in VariableManager get_vars() 44071 1727204714.84886: Calling all_inventory to load vars for managed-node2 44071 1727204714.84890: Calling groups_inventory to load vars for managed-node2 44071 1727204714.84894: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204714.84910: Calling all_plugins_play to load vars for managed-node2 44071 1727204714.84915: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204714.84918: Calling groups_plugins_play to load vars for managed-node2 44071 1727204714.87110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204714.89329: done with get_vars() 44071 1727204714.89370: variable 
'ansible_search_path' from source: unknown 44071 1727204714.89388: we have included files to process 44071 1727204714.89390: generating all_blocks data 44071 1727204714.89391: done generating all_blocks data 44071 1727204714.89396: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 44071 1727204714.89398: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 44071 1727204714.89400: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 44071 1727204714.89863: in VariableManager get_vars() 44071 1727204714.89890: done with get_vars() 44071 1727204714.89933: in VariableManager get_vars() 44071 1727204714.89953: done with get_vars() 44071 1727204714.90000: in VariableManager get_vars() 44071 1727204714.90020: done with get_vars() 44071 1727204714.90062: in VariableManager get_vars() 44071 1727204714.90083: done with get_vars() 44071 1727204714.90125: in VariableManager get_vars() 44071 1727204714.90143: done with get_vars() 44071 1727204714.90617: in VariableManager get_vars() 44071 1727204714.90640: done with get_vars() 44071 1727204714.90654: done processing included file 44071 1727204714.90656: iterating over new_blocks loaded from include file 44071 1727204714.90657: in VariableManager get_vars() 44071 1727204714.90673: done with get_vars() 44071 1727204714.90674: filtering new block on tags 44071 1727204714.90789: done filtering new block on tags 44071 1727204714.90793: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed-node2 44071 1727204714.90799: extending task lists for all hosts with included blocks 44071 1727204714.90840: done extending task lists 44071 1727204714.90841: done processing included files 44071 1727204714.90842: results queue empty 44071 1727204714.90842: checking for any_errors_fatal 44071 1727204714.90848: done checking for any_errors_fatal 44071 1727204714.90849: checking for max_fail_percentage 44071 1727204714.90850: done checking for max_fail_percentage 44071 1727204714.90851: checking to see if all hosts have failed and the running result is not ok 44071 1727204714.90852: done checking to see if all hosts have failed 44071 1727204714.90853: getting the remaining hosts for this loop 44071 1727204714.90854: done getting the remaining hosts for this loop 44071 1727204714.90857: getting the next task for host managed-node2 44071 1727204714.90861: done getting next task for host managed-node2 44071 1727204714.90863: ^ task is: TASK: TEST: {{ lsr_description }} 44071 1727204714.90867: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204714.90870: getting variables 44071 1727204714.90870: in VariableManager get_vars() 44071 1727204714.90882: Calling all_inventory to load vars for managed-node2 44071 1727204714.90885: Calling groups_inventory to load vars for managed-node2 44071 1727204714.90887: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204714.90894: Calling all_plugins_play to load vars for managed-node2 44071 1727204714.90897: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204714.90900: Calling groups_plugins_play to load vars for managed-node2 44071 1727204714.92516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204714.94939: done with get_vars() 44071 1727204714.94986: done getting variables 44071 1727204714.95054: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204714.95199: variable 'lsr_description' from source: include params TASK [TEST: I will not get an error when I try to remove an absent profile] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Tuesday 24 September 2024 15:05:14 -0400 (0:00:00.123) 0:02:07.268 ***** 44071 1727204714.95246: entering _queue_task() for managed-node2/debug 44071 1727204714.95718: worker is 1 (out of 1 available) 44071 1727204714.95735: exiting _queue_task() for managed-node2/debug 44071 1727204714.95750: done queuing things up, now waiting for results queue to drain 44071 1727204714.95752: waiting for pending results... 
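[Annotation] The "Cleanup profile and device" shell task above exited with rc=1 because neither the 'statebr' connection, its ifcfg file, nor the device existed, and the failure was tolerated ("...ignoring"). The executed commands appear verbatim in the module invocation; reconstructed as an Ansible task (the templating of the interface name and the ignore_errors flag are inferred from the log, not quoted from the task file), it would look roughly like:

  - name: Cleanup profile and device
    shell: |
      nmcli con delete {{ interface }}
      nmcli con load /etc/sysconfig/network-scripts/ifcfg-{{ interface }}
      rm -f /etc/sysconfig/network-scripts/ifcfg-{{ interface }}
      ip link del {{ interface }}
    ignore_errors: true

With interface rendered as 'statebr', this produces exactly the command string recorded in the failed result. Because each sub-command can legitimately fail when the profile and device are already absent, the errors are ignored and the play proceeds to re-include run_test.yml for the next test description, 'I will not get an error when I try to remove an absent profile', whose TEST banner is queued above.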
44071 1727204714.96187: running TaskExecutor() for managed-node2/TASK: TEST: I will not get an error when I try to remove an absent profile 44071 1727204714.96215: in run() - task 127b8e07-fff9-c964-7471-0000000020ad 44071 1727204714.96258: variable 'ansible_search_path' from source: unknown 44071 1727204714.96270: variable 'ansible_search_path' from source: unknown 44071 1727204714.96320: calling self._execute() 44071 1727204714.96458: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204714.96475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204714.96487: variable 'omit' from source: magic vars 44071 1727204714.96820: variable 'ansible_distribution_major_version' from source: facts 44071 1727204714.96832: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204714.96840: variable 'omit' from source: magic vars 44071 1727204714.96871: variable 'omit' from source: magic vars 44071 1727204714.96953: variable 'lsr_description' from source: include params 44071 1727204714.96971: variable 'omit' from source: magic vars 44071 1727204714.97010: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204714.97044: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204714.97062: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204714.97081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204714.97093: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204714.97119: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204714.97123: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204714.97125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204714.97210: Set connection var ansible_connection to ssh 44071 1727204714.97218: Set connection var ansible_timeout to 10 44071 1727204714.97224: Set connection var ansible_pipelining to False 44071 1727204714.97229: Set connection var ansible_shell_type to sh 44071 1727204714.97237: Set connection var ansible_shell_executable to /bin/sh 44071 1727204714.97245: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204714.97263: variable 'ansible_shell_executable' from source: unknown 44071 1727204714.97268: variable 'ansible_connection' from source: unknown 44071 1727204714.97271: variable 'ansible_module_compression' from source: unknown 44071 1727204714.97274: variable 'ansible_shell_type' from source: unknown 44071 1727204714.97277: variable 'ansible_shell_executable' from source: unknown 44071 1727204714.97280: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204714.97282: variable 'ansible_pipelining' from source: unknown 44071 1727204714.97285: variable 'ansible_timeout' from source: unknown 44071 1727204714.97290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204714.97411: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204714.97422: variable 'omit' from source: magic vars 44071 1727204714.97425: starting attempt loop 44071 1727204714.97428: running the handler 44071 1727204714.97476: handler run complete 44071 1727204714.97488: attempt loop complete, returning result 44071 1727204714.97491: _execute() done 44071 1727204714.97495: dumping result to json 44071 1727204714.97497: done dumping result, returning 44071 1727204714.97504: done running TaskExecutor() for managed-node2/TASK: TEST: I will not get an error when I try to remove an absent profile [127b8e07-fff9-c964-7471-0000000020ad] 44071 1727204714.97509: sending task result for task 127b8e07-fff9-c964-7471-0000000020ad 44071 1727204714.97607: done sending task result for task 127b8e07-fff9-c964-7471-0000000020ad 44071 1727204714.97610: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: ########## I will not get an error when I try to remove an absent profile ########## 44071 1727204714.97684: no more pending results, returning what we have 44071 1727204714.97688: results queue empty 44071 1727204714.97688: checking for any_errors_fatal 44071 1727204714.97690: done checking for any_errors_fatal 44071 1727204714.97690: checking for max_fail_percentage 44071 1727204714.97692: done checking for max_fail_percentage 44071 1727204714.97693: checking to see if all hosts have failed and the running result is not ok 44071 1727204714.97694: done checking to see if all hosts have failed 44071 1727204714.97694: getting the remaining hosts for this loop 44071 1727204714.97701: done getting the remaining hosts for this loop 44071 1727204714.97707: getting the next task for host managed-node2 44071 1727204714.97714: done getting next task for host managed-node2 44071 1727204714.97717: ^ task is: TASK: Show item 44071 1727204714.97722: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204714.97726: getting variables 44071 1727204714.97728: in VariableManager get_vars() 44071 1727204714.97774: Calling all_inventory to load vars for managed-node2 44071 1727204714.97776: Calling groups_inventory to load vars for managed-node2 44071 1727204714.97780: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204714.97792: Calling all_plugins_play to load vars for managed-node2 44071 1727204714.97795: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204714.97797: Calling groups_plugins_play to load vars for managed-node2 44071 1727204714.99245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204715.00788: done with get_vars() 44071 1727204715.00813: done getting variables 44071 1727204715.00867: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Tuesday 24 September 2024 15:05:15 -0400 (0:00:00.056) 0:02:07.325 ***** 44071 1727204715.00894: entering _queue_task() for managed-node2/debug 44071 1727204715.01205: worker is 1 (out of 1 available) 44071 1727204715.01223: exiting _queue_task() for managed-node2/debug 44071 1727204715.01237: done queuing things up, now waiting for results queue to drain 44071 1727204715.01239: waiting for pending results... 44071 1727204715.01455: running TaskExecutor() for managed-node2/TASK: Show item 44071 1727204715.01542: in run() - task 127b8e07-fff9-c964-7471-0000000020ae 44071 1727204715.01556: variable 'ansible_search_path' from source: unknown 44071 1727204715.01560: variable 'ansible_search_path' from source: unknown 44071 1727204715.01611: variable 'omit' from source: magic vars 44071 1727204715.01754: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.01762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.01774: variable 'omit' from source: magic vars 44071 1727204715.02281: variable 'ansible_distribution_major_version' from source: facts 44071 1727204715.02286: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204715.02289: variable 'omit' from source: magic vars 44071 1727204715.02291: variable 'omit' from source: magic vars 44071 1727204715.02308: variable 'item' from source: unknown 44071 1727204715.02393: variable 'item' from source: unknown 44071 1727204715.02419: variable 'omit' from source: magic vars 44071 1727204715.02475: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204715.02519: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204715.02553: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204715.02580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204715.02600: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 44071 1727204715.02639: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204715.02650: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.02659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.02784: Set connection var ansible_connection to ssh 44071 1727204715.02798: Set connection var ansible_timeout to 10 44071 1727204715.02811: Set connection var ansible_pipelining to False 44071 1727204715.02825: Set connection var ansible_shell_type to sh 44071 1727204715.02840: Set connection var ansible_shell_executable to /bin/sh 44071 1727204715.02854: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204715.02972: variable 'ansible_shell_executable' from source: unknown 44071 1727204715.02976: variable 'ansible_connection' from source: unknown 44071 1727204715.02979: variable 'ansible_module_compression' from source: unknown 44071 1727204715.02981: variable 'ansible_shell_type' from source: unknown 44071 1727204715.02983: variable 'ansible_shell_executable' from source: unknown 44071 1727204715.02985: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.02987: variable 'ansible_pipelining' from source: unknown 44071 1727204715.02989: variable 'ansible_timeout' from source: unknown 44071 1727204715.02991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.03101: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204715.03118: variable 'omit' from source: magic vars 44071 1727204715.03129: starting attempt loop 44071 1727204715.03140: running the handler 44071 1727204715.03201: variable 'lsr_description' from source: include params 44071 1727204715.03297: variable 'lsr_description' from source: include params 44071 1727204715.03317: handler run complete 44071 1727204715.03347: attempt loop complete, returning result 44071 1727204715.03376: variable 'item' from source: unknown 44071 1727204715.03471: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I will not get an error when I try to remove an absent profile" } 44071 1727204715.03688: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.03692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.03702: variable 'omit' from source: magic vars 44071 1727204715.03857: variable 'ansible_distribution_major_version' from source: facts 44071 1727204715.03862: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204715.03867: variable 'omit' from source: magic vars 44071 1727204715.03879: variable 'omit' from source: magic vars 44071 1727204715.03912: variable 'item' from source: unknown 44071 1727204715.03974: variable 'item' from source: unknown 44071 1727204715.03986: variable 'omit' from source: magic vars 44071 1727204715.04004: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204715.04014: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204715.04021: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204715.04036: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204715.04039: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.04042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.04100: Set connection var ansible_connection to ssh 44071 1727204715.04104: Set connection var ansible_timeout to 10 44071 1727204715.04110: Set connection var ansible_pipelining to False 44071 1727204715.04117: Set connection var ansible_shell_type to sh 44071 1727204715.04120: Set connection var ansible_shell_executable to /bin/sh 44071 1727204715.04129: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204715.04147: variable 'ansible_shell_executable' from source: unknown 44071 1727204715.04150: variable 'ansible_connection' from source: unknown 44071 1727204715.04153: variable 'ansible_module_compression' from source: unknown 44071 1727204715.04156: variable 'ansible_shell_type' from source: unknown 44071 1727204715.04158: variable 'ansible_shell_executable' from source: unknown 44071 1727204715.04160: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.04163: variable 'ansible_pipelining' from source: unknown 44071 1727204715.04167: variable 'ansible_timeout' from source: unknown 44071 1727204715.04173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.04247: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204715.04257: variable 'omit' from source: magic vars 44071 1727204715.04261: starting attempt loop 44071 1727204715.04263: running the handler 44071 1727204715.04289: variable 'lsr_setup' from source: include params 44071 1727204715.04352: variable 'lsr_setup' from source: include params 44071 1727204715.04398: handler run complete 44071 1727204715.04412: attempt loop complete, returning result 44071 1727204715.04427: variable 'item' from source: unknown 44071 1727204715.04485: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ "tasks/create_bridge_profile.yml", "tasks/activate_profile.yml", "tasks/remove+down_profile.yml" ] } 44071 1727204715.04595: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.04599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.04603: variable 'omit' from source: magic vars 44071 1727204715.04723: variable 'ansible_distribution_major_version' from source: facts 44071 1727204715.04727: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204715.04729: variable 'omit' from source: magic vars 44071 1727204715.04739: variable 'omit' from source: magic vars 44071 1727204715.04769: variable 'item' from source: unknown 44071 1727204715.04821: variable 'item' from source: unknown 44071 1727204715.04837: variable 'omit' from source: magic vars 
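The per-item results in this part of the trace (lsr_description and lsr_setup above; lsr_test, lsr_assert, lsr_assert_when, lsr_fail_debug and lsr_cleanup below) all share the same shape: ansible_loop_var is "item", "item" holds the variable name, and the variable is echoed under its own name. That is the shape produced by a debug task that prints var: "{{ item }}" over a list of variable names. A minimal sketch of the "Show item" task at run_test.yml:9, assuming exactly that pattern (the real file may differ in detail):

  # Hedged sketch of the "Show item" loop at run_test.yml:9; the loop items are
  # taken from the per-item results visible in this trace.
  - name: Show item
    debug:
      var: "{{ item }}"
    loop:
      - lsr_description
      - lsr_setup
      - lsr_test
      - lsr_assert
      - lsr_assert_when
      - lsr_fail_debug
      - lsr_cleanup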
44071 1727204715.04853: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204715.04861: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204715.04868: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204715.04880: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204715.04883: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.04885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.04937: Set connection var ansible_connection to ssh 44071 1727204715.04949: Set connection var ansible_timeout to 10 44071 1727204715.04954: Set connection var ansible_pipelining to False 44071 1727204715.04960: Set connection var ansible_shell_type to sh 44071 1727204715.04966: Set connection var ansible_shell_executable to /bin/sh 44071 1727204715.04974: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204715.04990: variable 'ansible_shell_executable' from source: unknown 44071 1727204715.04992: variable 'ansible_connection' from source: unknown 44071 1727204715.04995: variable 'ansible_module_compression' from source: unknown 44071 1727204715.04997: variable 'ansible_shell_type' from source: unknown 44071 1727204715.05000: variable 'ansible_shell_executable' from source: unknown 44071 1727204715.05003: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.05007: variable 'ansible_pipelining' from source: unknown 44071 1727204715.05010: variable 'ansible_timeout' from source: unknown 44071 1727204715.05014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.05093: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204715.05100: variable 'omit' from source: magic vars 44071 1727204715.05103: starting attempt loop 44071 1727204715.05106: running the handler 44071 1727204715.05123: variable 'lsr_test' from source: include params 44071 1727204715.05180: variable 'lsr_test' from source: include params 44071 1727204715.05196: handler run complete 44071 1727204715.05207: attempt loop complete, returning result 44071 1727204715.05221: variable 'item' from source: unknown 44071 1727204715.05274: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/remove+down_profile.yml" ] } 44071 1727204715.05370: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.05373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.05376: variable 'omit' from source: magic vars 44071 1727204715.05522: variable 'ansible_distribution_major_version' from source: facts 44071 1727204715.05529: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204715.05534: variable 'omit' from source: magic vars 44071 1727204715.05554: variable 'omit' from source: magic vars 44071 1727204715.05595: variable 
'item' from source: unknown 44071 1727204715.05672: variable 'item' from source: unknown 44071 1727204715.05679: variable 'omit' from source: magic vars 44071 1727204715.05700: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204715.05741: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204715.05744: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204715.05747: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204715.05749: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.05751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.05870: Set connection var ansible_connection to ssh 44071 1727204715.05873: Set connection var ansible_timeout to 10 44071 1727204715.05875: Set connection var ansible_pipelining to False 44071 1727204715.05877: Set connection var ansible_shell_type to sh 44071 1727204715.05879: Set connection var ansible_shell_executable to /bin/sh 44071 1727204715.05881: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204715.05888: variable 'ansible_shell_executable' from source: unknown 44071 1727204715.05894: variable 'ansible_connection' from source: unknown 44071 1727204715.05900: variable 'ansible_module_compression' from source: unknown 44071 1727204715.05906: variable 'ansible_shell_type' from source: unknown 44071 1727204715.05912: variable 'ansible_shell_executable' from source: unknown 44071 1727204715.05918: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.05925: variable 'ansible_pipelining' from source: unknown 44071 1727204715.05932: variable 'ansible_timeout' from source: unknown 44071 1727204715.05942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.06048: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204715.06170: variable 'omit' from source: magic vars 44071 1727204715.06174: starting attempt loop 44071 1727204715.06176: running the handler 44071 1727204715.06178: variable 'lsr_assert' from source: include params 44071 1727204715.06180: variable 'lsr_assert' from source: include params 44071 1727204715.06198: handler run complete 44071 1727204715.06221: attempt loop complete, returning result 44071 1727204715.06243: variable 'item' from source: unknown 44071 1727204715.06320: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_profile_absent.yml", "tasks/get_NetworkManager_NVR.yml" ] } 44071 1727204715.06458: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.06462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.06464: variable 'omit' from source: magic vars 44071 1727204715.06770: variable 'ansible_distribution_major_version' from source: facts 44071 1727204715.06779: Evaluated conditional 
(ansible_distribution_major_version != '6'): True 44071 1727204715.06782: variable 'omit' from source: magic vars 44071 1727204715.06784: variable 'omit' from source: magic vars 44071 1727204715.06786: variable 'item' from source: unknown 44071 1727204715.06788: variable 'item' from source: unknown 44071 1727204715.06807: variable 'omit' from source: magic vars 44071 1727204715.06833: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204715.06847: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204715.06858: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204715.06881: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204715.06889: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.06896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.06980: Set connection var ansible_connection to ssh 44071 1727204715.06992: Set connection var ansible_timeout to 10 44071 1727204715.07003: Set connection var ansible_pipelining to False 44071 1727204715.07013: Set connection var ansible_shell_type to sh 44071 1727204715.07024: Set connection var ansible_shell_executable to /bin/sh 44071 1727204715.07037: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204715.07064: variable 'ansible_shell_executable' from source: unknown 44071 1727204715.07076: variable 'ansible_connection' from source: unknown 44071 1727204715.07083: variable 'ansible_module_compression' from source: unknown 44071 1727204715.07092: variable 'ansible_shell_type' from source: unknown 44071 1727204715.07098: variable 'ansible_shell_executable' from source: unknown 44071 1727204715.07105: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.07113: variable 'ansible_pipelining' from source: unknown 44071 1727204715.07120: variable 'ansible_timeout' from source: unknown 44071 1727204715.07128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.07231: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204715.07246: variable 'omit' from source: magic vars 44071 1727204715.07254: starting attempt loop 44071 1727204715.07260: running the handler 44071 1727204715.07470: variable 'lsr_assert_when' from source: include params 44071 1727204715.07474: variable 'lsr_assert_when' from source: include params 44071 1727204715.07476: variable 'network_provider' from source: set_fact 44071 1727204715.07515: handler run complete 44071 1727204715.07538: attempt loop complete, returning result 44071 1727204715.07560: variable 'item' from source: unknown 44071 1727204715.07634: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": [ { "condition": true, "what": "tasks/assert_device_absent.yml" } ] } 44071 1727204715.07836: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 
1727204715.07840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.07856: variable 'omit' from source: magic vars 44071 1727204715.07997: variable 'ansible_distribution_major_version' from source: facts 44071 1727204715.08003: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204715.08007: variable 'omit' from source: magic vars 44071 1727204715.08023: variable 'omit' from source: magic vars 44071 1727204715.08051: variable 'item' from source: unknown 44071 1727204715.08111: variable 'item' from source: unknown 44071 1727204715.08125: variable 'omit' from source: magic vars 44071 1727204715.08144: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204715.08151: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204715.08157: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204715.08169: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204715.08172: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.08178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.08231: Set connection var ansible_connection to ssh 44071 1727204715.08237: Set connection var ansible_timeout to 10 44071 1727204715.08241: Set connection var ansible_pipelining to False 44071 1727204715.08247: Set connection var ansible_shell_type to sh 44071 1727204715.08252: Set connection var ansible_shell_executable to /bin/sh 44071 1727204715.08259: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204715.08276: variable 'ansible_shell_executable' from source: unknown 44071 1727204715.08279: variable 'ansible_connection' from source: unknown 44071 1727204715.08282: variable 'ansible_module_compression' from source: unknown 44071 1727204715.08284: variable 'ansible_shell_type' from source: unknown 44071 1727204715.08288: variable 'ansible_shell_executable' from source: unknown 44071 1727204715.08290: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.08293: variable 'ansible_pipelining' from source: unknown 44071 1727204715.08298: variable 'ansible_timeout' from source: unknown 44071 1727204715.08302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.08376: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204715.08383: variable 'omit' from source: magic vars 44071 1727204715.08386: starting attempt loop 44071 1727204715.08389: running the handler 44071 1727204715.08408: variable 'lsr_fail_debug' from source: play vars 44071 1727204715.08463: variable 'lsr_fail_debug' from source: play vars 44071 1727204715.08479: handler run complete 44071 1727204715.08491: attempt loop complete, returning result 44071 1727204715.08503: variable 'item' from source: unknown 44071 1727204715.08555: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": 
"lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 44071 1727204715.08657: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.08661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.08663: variable 'omit' from source: magic vars 44071 1727204715.08771: variable 'ansible_distribution_major_version' from source: facts 44071 1727204715.08774: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204715.08784: variable 'omit' from source: magic vars 44071 1727204715.08794: variable 'omit' from source: magic vars 44071 1727204715.08824: variable 'item' from source: unknown 44071 1727204715.08872: variable 'item' from source: unknown 44071 1727204715.08884: variable 'omit' from source: magic vars 44071 1727204715.08903: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204715.08912: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204715.08917: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204715.08928: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204715.08931: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.08936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.08988: Set connection var ansible_connection to ssh 44071 1727204715.08993: Set connection var ansible_timeout to 10 44071 1727204715.09003: Set connection var ansible_pipelining to False 44071 1727204715.09008: Set connection var ansible_shell_type to sh 44071 1727204715.09010: Set connection var ansible_shell_executable to /bin/sh 44071 1727204715.09018: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204715.09035: variable 'ansible_shell_executable' from source: unknown 44071 1727204715.09038: variable 'ansible_connection' from source: unknown 44071 1727204715.09041: variable 'ansible_module_compression' from source: unknown 44071 1727204715.09043: variable 'ansible_shell_type' from source: unknown 44071 1727204715.09045: variable 'ansible_shell_executable' from source: unknown 44071 1727204715.09047: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.09050: variable 'ansible_pipelining' from source: unknown 44071 1727204715.09052: variable 'ansible_timeout' from source: unknown 44071 1727204715.09057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.09132: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204715.09140: variable 'omit' from source: magic vars 44071 1727204715.09144: starting attempt loop 44071 1727204715.09147: running the handler 44071 1727204715.09164: variable 'lsr_cleanup' from source: include params 44071 1727204715.09222: variable 'lsr_cleanup' from source: include params 44071 1727204715.09238: handler run complete 44071 1727204715.09249: attempt loop complete, returning result 44071 1727204715.09262: variable 
'item' from source: unknown 44071 1727204715.09310: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml", "tasks/check_network_dns.yml" ] } 44071 1727204715.09420: dumping result to json 44071 1727204715.09422: done dumping result, returning 44071 1727204715.09424: done running TaskExecutor() for managed-node2/TASK: Show item [127b8e07-fff9-c964-7471-0000000020ae] 44071 1727204715.09426: sending task result for task 127b8e07-fff9-c964-7471-0000000020ae 44071 1727204715.09493: done sending task result for task 127b8e07-fff9-c964-7471-0000000020ae 44071 1727204715.09495: WORKER PROCESS EXITING 44071 1727204715.09550: no more pending results, returning what we have 44071 1727204715.09554: results queue empty 44071 1727204715.09555: checking for any_errors_fatal 44071 1727204715.09565: done checking for any_errors_fatal 44071 1727204715.09567: checking for max_fail_percentage 44071 1727204715.09569: done checking for max_fail_percentage 44071 1727204715.09570: checking to see if all hosts have failed and the running result is not ok 44071 1727204715.09571: done checking to see if all hosts have failed 44071 1727204715.09572: getting the remaining hosts for this loop 44071 1727204715.09573: done getting the remaining hosts for this loop 44071 1727204715.09577: getting the next task for host managed-node2 44071 1727204715.09591: done getting next task for host managed-node2 44071 1727204715.09595: ^ task is: TASK: Include the task 'show_interfaces.yml' 44071 1727204715.09597: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204715.09601: getting variables 44071 1727204715.09602: in VariableManager get_vars() 44071 1727204715.09647: Calling all_inventory to load vars for managed-node2 44071 1727204715.09649: Calling groups_inventory to load vars for managed-node2 44071 1727204715.09653: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204715.09668: Calling all_plugins_play to load vars for managed-node2 44071 1727204715.09671: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204715.09674: Calling groups_plugins_play to load vars for managed-node2 44071 1727204715.11296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204715.12528: done with get_vars() 44071 1727204715.12563: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Tuesday 24 September 2024 15:05:15 -0400 (0:00:00.117) 0:02:07.442 ***** 44071 1727204715.12647: entering _queue_task() for managed-node2/include_tasks 44071 1727204715.12956: worker is 1 (out of 1 available) 44071 1727204715.12978: exiting _queue_task() for managed-node2/include_tasks 44071 1727204715.12993: done queuing things up, now waiting for results queue to drain 44071 1727204715.12995: waiting for pending results... 44071 1727204715.13211: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 44071 1727204715.13292: in run() - task 127b8e07-fff9-c964-7471-0000000020af 44071 1727204715.13306: variable 'ansible_search_path' from source: unknown 44071 1727204715.13310: variable 'ansible_search_path' from source: unknown 44071 1727204715.13347: calling self._execute() 44071 1727204715.13437: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.13445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.13455: variable 'omit' from source: magic vars 44071 1727204715.13787: variable 'ansible_distribution_major_version' from source: facts 44071 1727204715.13800: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204715.13805: _execute() done 44071 1727204715.13811: dumping result to json 44071 1727204715.13813: done dumping result, returning 44071 1727204715.13821: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [127b8e07-fff9-c964-7471-0000000020af] 44071 1727204715.13825: sending task result for task 127b8e07-fff9-c964-7471-0000000020af 44071 1727204715.13931: done sending task result for task 127b8e07-fff9-c964-7471-0000000020af 44071 1727204715.13935: WORKER PROCESS EXITING 44071 1727204715.13971: no more pending results, returning what we have 44071 1727204715.13977: in VariableManager get_vars() 44071 1727204715.14030: Calling all_inventory to load vars for managed-node2 44071 1727204715.14034: Calling groups_inventory to load vars for managed-node2 44071 1727204715.14038: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204715.14055: Calling all_plugins_play to load vars for managed-node2 44071 1727204715.14059: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204715.14062: Calling groups_plugins_play to load vars for managed-node2 44071 1727204715.15311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 44071 1727204715.17680: done with get_vars() 44071 1727204715.17723: variable 'ansible_search_path' from source: unknown 44071 1727204715.17724: variable 'ansible_search_path' from source: unknown 44071 1727204715.17777: we have included files to process 44071 1727204715.17779: generating all_blocks data 44071 1727204715.17780: done generating all_blocks data 44071 1727204715.17787: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44071 1727204715.17789: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44071 1727204715.17792: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 44071 1727204715.17916: in VariableManager get_vars() 44071 1727204715.17943: done with get_vars() 44071 1727204715.18079: done processing included file 44071 1727204715.18081: iterating over new_blocks loaded from include file 44071 1727204715.18083: in VariableManager get_vars() 44071 1727204715.18103: done with get_vars() 44071 1727204715.18105: filtering new block on tags 44071 1727204715.18145: done filtering new block on tags 44071 1727204715.18148: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 44071 1727204715.18154: extending task lists for all hosts with included blocks 44071 1727204715.18653: done extending task lists 44071 1727204715.18654: done processing included files 44071 1727204715.18655: results queue empty 44071 1727204715.18656: checking for any_errors_fatal 44071 1727204715.18663: done checking for any_errors_fatal 44071 1727204715.18664: checking for max_fail_percentage 44071 1727204715.18667: done checking for max_fail_percentage 44071 1727204715.18668: checking to see if all hosts have failed and the running result is not ok 44071 1727204715.18669: done checking to see if all hosts have failed 44071 1727204715.18670: getting the remaining hosts for this loop 44071 1727204715.18671: done getting the remaining hosts for this loop 44071 1727204715.18674: getting the next task for host managed-node2 44071 1727204715.18679: done getting next task for host managed-node2 44071 1727204715.18681: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 44071 1727204715.18685: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204715.18688: getting variables 44071 1727204715.18689: in VariableManager get_vars() 44071 1727204715.18703: Calling all_inventory to load vars for managed-node2 44071 1727204715.18706: Calling groups_inventory to load vars for managed-node2 44071 1727204715.18708: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204715.18716: Calling all_plugins_play to load vars for managed-node2 44071 1727204715.18718: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204715.18721: Calling groups_plugins_play to load vars for managed-node2 44071 1727204715.20401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204715.21609: done with get_vars() 44071 1727204715.21643: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:05:15 -0400 (0:00:00.090) 0:02:07.533 ***** 44071 1727204715.21713: entering _queue_task() for managed-node2/include_tasks 44071 1727204715.22022: worker is 1 (out of 1 available) 44071 1727204715.22036: exiting _queue_task() for managed-node2/include_tasks 44071 1727204715.22052: done queuing things up, now waiting for results queue to drain 44071 1727204715.22054: waiting for pending results... 44071 1727204715.22272: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 44071 1727204715.22364: in run() - task 127b8e07-fff9-c964-7471-0000000020d6 44071 1727204715.22378: variable 'ansible_search_path' from source: unknown 44071 1727204715.22382: variable 'ansible_search_path' from source: unknown 44071 1727204715.22419: calling self._execute() 44071 1727204715.22514: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.22520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.22524: variable 'omit' from source: magic vars 44071 1727204715.22857: variable 'ansible_distribution_major_version' from source: facts 44071 1727204715.22871: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204715.22877: _execute() done 44071 1727204715.22882: dumping result to json 44071 1727204715.22884: done dumping result, returning 44071 1727204715.22892: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [127b8e07-fff9-c964-7471-0000000020d6] 44071 1727204715.22896: sending task result for task 127b8e07-fff9-c964-7471-0000000020d6 44071 1727204715.23002: done sending task result for task 127b8e07-fff9-c964-7471-0000000020d6 44071 1727204715.23005: WORKER PROCESS EXITING 44071 1727204715.23039: no more pending results, returning what we have 44071 1727204715.23044: in VariableManager get_vars() 44071 1727204715.23100: Calling all_inventory to load vars for managed-node2 44071 1727204715.23104: Calling groups_inventory to load vars for managed-node2 44071 1727204715.23108: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204715.23125: Calling all_plugins_play to load vars for managed-node2 44071 1727204715.23128: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204715.23131: Calling groups_plugins_play to load vars for managed-node2 44071 1727204715.24222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 44071 1727204715.25455: done with get_vars() 44071 1727204715.25485: variable 'ansible_search_path' from source: unknown 44071 1727204715.25487: variable 'ansible_search_path' from source: unknown 44071 1727204715.25518: we have included files to process 44071 1727204715.25519: generating all_blocks data 44071 1727204715.25521: done generating all_blocks data 44071 1727204715.25522: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44071 1727204715.25523: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44071 1727204715.25525: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 44071 1727204715.25738: done processing included file 44071 1727204715.25739: iterating over new_blocks loaded from include file 44071 1727204715.25741: in VariableManager get_vars() 44071 1727204715.25755: done with get_vars() 44071 1727204715.25756: filtering new block on tags 44071 1727204715.25786: done filtering new block on tags 44071 1727204715.25788: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node2 44071 1727204715.25792: extending task lists for all hosts with included blocks 44071 1727204715.25903: done extending task lists 44071 1727204715.25904: done processing included files 44071 1727204715.25905: results queue empty 44071 1727204715.25905: checking for any_errors_fatal 44071 1727204715.25908: done checking for any_errors_fatal 44071 1727204715.25908: checking for max_fail_percentage 44071 1727204715.25909: done checking for max_fail_percentage 44071 1727204715.25910: checking to see if all hosts have failed and the running result is not ok 44071 1727204715.25910: done checking to see if all hosts have failed 44071 1727204715.25911: getting the remaining hosts for this loop 44071 1727204715.25912: done getting the remaining hosts for this loop 44071 1727204715.25914: getting the next task for host managed-node2 44071 1727204715.25917: done getting next task for host managed-node2 44071 1727204715.25919: ^ task is: TASK: Gather current interface info 44071 1727204715.25922: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 44071 1727204715.25924: getting variables 44071 1727204715.25924: in VariableManager get_vars() 44071 1727204715.25934: Calling all_inventory to load vars for managed-node2 44071 1727204715.25936: Calling groups_inventory to load vars for managed-node2 44071 1727204715.25937: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204715.25942: Calling all_plugins_play to load vars for managed-node2 44071 1727204715.25944: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204715.25946: Calling groups_plugins_play to load vars for managed-node2 44071 1727204715.26928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204715.28135: done with get_vars() 44071 1727204715.28169: done getting variables 44071 1727204715.28210: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:05:15 -0400 (0:00:00.065) 0:02:07.598 ***** 44071 1727204715.28237: entering _queue_task() for managed-node2/command 44071 1727204715.28546: worker is 1 (out of 1 available) 44071 1727204715.28561: exiting _queue_task() for managed-node2/command 44071 1727204715.28579: done queuing things up, now waiting for results queue to drain 44071 1727204715.28581: waiting for pending results... 
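At this point the trace has descended through two includes (show_interfaces.yml, then get_current_interfaces.yml) and queues "Gather current interface info", the first task in this excerpt that actually opens an SSH connection: below, the controller runs an 'echo ~' probe to find the remote home directory, creates a remote temp directory, and uploads AnsiballZ_command.py before executing it. The task names and file locations come from the TASK headers; the command line and register name are not shown in this excerpt, so the sketch below marks them as purely illustrative.

  # Hedged reconstruction of the include chain and command task seen in this
  # trace.  Names and paths are taken from the TASK headers; the command and
  # the register are hypothetical placeholders, not values from this log.

  # show_interfaces.yml:3
  - name: Include the task 'get_current_interfaces.yml'
    include_tasks: get_current_interfaces.yml

  # get_current_interfaces.yml:3
  - name: Gather current interface info
    command: ls -1 /sys/class/net     # hypothetical; the actual command is not shown in this excerpt
    register: current_interface_info  # hypothetical register name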
44071 1727204715.28794: running TaskExecutor() for managed-node2/TASK: Gather current interface info 44071 1727204715.28906: in run() - task 127b8e07-fff9-c964-7471-000000002111 44071 1727204715.28922: variable 'ansible_search_path' from source: unknown 44071 1727204715.28927: variable 'ansible_search_path' from source: unknown 44071 1727204715.28963: calling self._execute() 44071 1727204715.29053: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.29060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.29070: variable 'omit' from source: magic vars 44071 1727204715.29394: variable 'ansible_distribution_major_version' from source: facts 44071 1727204715.29406: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204715.29413: variable 'omit' from source: magic vars 44071 1727204715.29455: variable 'omit' from source: magic vars 44071 1727204715.29488: variable 'omit' from source: magic vars 44071 1727204715.29525: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204715.29559: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204715.29583: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204715.29597: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204715.29608: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204715.29635: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204715.29641: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.29653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.29754: Set connection var ansible_connection to ssh 44071 1727204715.29759: Set connection var ansible_timeout to 10 44071 1727204715.29767: Set connection var ansible_pipelining to False 44071 1727204715.29776: Set connection var ansible_shell_type to sh 44071 1727204715.29782: Set connection var ansible_shell_executable to /bin/sh 44071 1727204715.29789: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204715.29817: variable 'ansible_shell_executable' from source: unknown 44071 1727204715.29821: variable 'ansible_connection' from source: unknown 44071 1727204715.29824: variable 'ansible_module_compression' from source: unknown 44071 1727204715.29827: variable 'ansible_shell_type' from source: unknown 44071 1727204715.29829: variable 'ansible_shell_executable' from source: unknown 44071 1727204715.29832: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.29837: variable 'ansible_pipelining' from source: unknown 44071 1727204715.29840: variable 'ansible_timeout' from source: unknown 44071 1727204715.29842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.29955: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204715.29967: variable 'omit' from source: magic vars 44071 
1727204715.29972: starting attempt loop 44071 1727204715.29975: running the handler 44071 1727204715.29990: _low_level_execute_command(): starting 44071 1727204715.29997: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204715.30584: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204715.30590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44071 1727204715.30594: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204715.30597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204715.30649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204715.30653: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204715.30655: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204715.30742: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204715.32513: stdout chunk (state=3): >>>/root <<< 44071 1727204715.32623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204715.32686: stderr chunk (state=3): >>><<< 44071 1727204715.32690: stdout chunk (state=3): >>><<< 44071 1727204715.32716: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204715.32729: _low_level_execute_command(): starting 44071 1727204715.32738: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204715.327125-51128-106531279022966 `" && echo ansible-tmp-1727204715.327125-51128-106531279022966="` echo /root/.ansible/tmp/ansible-tmp-1727204715.327125-51128-106531279022966 `" ) && sleep 0' 44071 1727204715.33228: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204715.33232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204715.33245: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204715.33247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204715.33300: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204715.33304: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204715.33311: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204715.33384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204715.35368: stdout chunk (state=3): >>>ansible-tmp-1727204715.327125-51128-106531279022966=/root/.ansible/tmp/ansible-tmp-1727204715.327125-51128-106531279022966 <<< 44071 1727204715.35479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204715.35574: stderr chunk (state=3): >>><<< 44071 1727204715.35577: stdout chunk (state=3): >>><<< 44071 1727204715.35771: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204715.327125-51128-106531279022966=/root/.ansible/tmp/ansible-tmp-1727204715.327125-51128-106531279022966 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 
1727204715.35775: variable 'ansible_module_compression' from source: unknown 44071 1727204715.35778: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44071 1727204715.35780: variable 'ansible_facts' from source: unknown 44071 1727204715.35845: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204715.327125-51128-106531279022966/AnsiballZ_command.py 44071 1727204715.36020: Sending initial data 44071 1727204715.36029: Sent initial data (155 bytes) 44071 1727204715.36782: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204715.36852: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204715.36877: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204715.36891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204715.36999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204715.38622: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204715.38728: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204715.38799: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpc5f8tncb /root/.ansible/tmp/ansible-tmp-1727204715.327125-51128-106531279022966/AnsiballZ_command.py <<< 44071 1727204715.38829: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204715.327125-51128-106531279022966/AnsiballZ_command.py" <<< 44071 1727204715.38899: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpc5f8tncb" to remote "/root/.ansible/tmp/ansible-tmp-1727204715.327125-51128-106531279022966/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204715.327125-51128-106531279022966/AnsiballZ_command.py" <<< 44071 1727204715.40050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204715.40054: stderr chunk (state=3): >>><<< 44071 1727204715.40057: stdout chunk (state=3): >>><<< 44071 1727204715.40059: done transferring module to remote 44071 1727204715.40061: _low_level_execute_command(): starting 44071 1727204715.40064: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204715.327125-51128-106531279022966/ /root/.ansible/tmp/ansible-tmp-1727204715.327125-51128-106531279022966/AnsiballZ_command.py && sleep 0' 44071 1727204715.40743: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204715.40747: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204715.40751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204715.40841: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204715.40886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204715.40907: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204715.40936: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204715.41089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204715.42953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204715.42980: stdout chunk (state=3): >>><<< 44071 1727204715.43102: stderr chunk (state=3): >>><<< 44071 1727204715.43106: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204715.43110: _low_level_execute_command(): starting 44071 1727204715.43112: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204715.327125-51128-106531279022966/AnsiballZ_command.py && sleep 0' 44071 1727204715.43861: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204715.43892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204715.44020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204715.61031: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:05:15.605188", "end": "2024-09-24 15:05:15.608718", "delta": "0:00:00.003530", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44071 1727204715.62711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204715.62716: stdout chunk (state=3): >>><<< 44071 1727204715.62718: stderr chunk (state=3): >>><<< 44071 1727204715.62876: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:05:15.605188", "end": "2024-09-24 15:05:15.608718", "delta": "0:00:00.003530", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
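The module result above shows ansible.legacy.command being invoked with chdir=/sys/class/net and _raw_params="ls -1", returning the interface names bonding_masters, eth0 and lo. A minimal sketch of a task that would produce this call is given below; the register name _current_interfaces is borrowed from the later "Set current_interfaces" step, and changed_when: false is an assumption that would explain why the rendered task result reports changed: false even though the raw module payload says "changed": true.

    # Hypothetical reconstruction of the "Gather current interface info" task
    # in tests/network/playbooks/tasks/get_current_interfaces.yml; the command
    # and chdir come from the logged module_args, everything else is assumed.
    - name: Gather current interface info
      command: ls -1
      args:
        chdir: /sys/class/net
      register: _current_interfaces   # read by the following set_fact task
      changed_when: false             # assumption, consistent with the ok/changed=false result
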
44071 1727204715.62881: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204715.327125-51128-106531279022966/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204715.62885: _low_level_execute_command(): starting 44071 1727204715.62888: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204715.327125-51128-106531279022966/ > /dev/null 2>&1 && sleep 0' 44071 1727204715.63492: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204715.63511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204715.63529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204715.63555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204715.63578: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204715.63591: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204715.63606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204715.63626: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204715.63726: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204715.63760: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204715.63882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204715.65814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204715.65939: stderr chunk (state=3): >>><<< 44071 1727204715.65959: stdout chunk (state=3): >>><<< 44071 1727204715.65995: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204715.66009: handler run complete 44071 1727204715.66049: Evaluated conditional (False): False 44071 1727204715.66093: attempt loop complete, returning result 44071 1727204715.66096: _execute() done 44071 1727204715.66104: dumping result to json 44071 1727204715.66106: done dumping result, returning 44071 1727204715.66116: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [127b8e07-fff9-c964-7471-000000002111] 44071 1727204715.66172: sending task result for task 127b8e07-fff9-c964-7471-000000002111 ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003530", "end": "2024-09-24 15:05:15.608718", "rc": 0, "start": "2024-09-24 15:05:15.605188" } STDOUT: bonding_masters eth0 lo 44071 1727204715.66578: no more pending results, returning what we have 44071 1727204715.66582: results queue empty 44071 1727204715.66583: checking for any_errors_fatal 44071 1727204715.66585: done checking for any_errors_fatal 44071 1727204715.66586: checking for max_fail_percentage 44071 1727204715.66588: done checking for max_fail_percentage 44071 1727204715.66589: checking to see if all hosts have failed and the running result is not ok 44071 1727204715.66590: done checking to see if all hosts have failed 44071 1727204715.66591: getting the remaining hosts for this loop 44071 1727204715.66593: done getting the remaining hosts for this loop 44071 1727204715.66599: getting the next task for host managed-node2 44071 1727204715.66608: done getting next task for host managed-node2 44071 1727204715.66610: ^ task is: TASK: Set current_interfaces 44071 1727204715.66618: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204715.66624: getting variables 44071 1727204715.66626: in VariableManager get_vars() 44071 1727204715.66785: Calling all_inventory to load vars for managed-node2 44071 1727204715.66789: Calling groups_inventory to load vars for managed-node2 44071 1727204715.66798: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204715.66807: done sending task result for task 127b8e07-fff9-c964-7471-000000002111 44071 1727204715.66810: WORKER PROCESS EXITING 44071 1727204715.66824: Calling all_plugins_play to load vars for managed-node2 44071 1727204715.66828: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204715.66831: Calling groups_plugins_play to load vars for managed-node2 44071 1727204715.69158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204715.70600: done with get_vars() 44071 1727204715.70633: done getting variables 44071 1727204715.70689: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:05:15 -0400 (0:00:00.424) 0:02:08.023 ***** 44071 1727204715.70716: entering _queue_task() for managed-node2/set_fact 44071 1727204715.71050: worker is 1 (out of 1 available) 44071 1727204715.71069: exiting _queue_task() for managed-node2/set_fact 44071 1727204715.71086: done queuing things up, now waiting for results queue to drain 44071 1727204715.71088: waiting for pending results... 
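The "Set current_interfaces" task queued above (get_current_interfaces.yml:9) converts the registered command output into a fact; the result a few entries below shows ansible_facts.current_interfaces == ["bonding_masters", "eth0", "lo"]. A hedged sketch of what that set_fact likely looks like; the exact stdout_lines expression is an assumption inferred from the logged fact value:

    # Hypothetical sketch of the "Set current_interfaces" task; the expression is
    # inferred from the logged fact value, not copied from the test playbook.
    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"

A paired debug task ("Show current_interfaces", show_interfaces.yml:5) then prints the same list, which is what the later "MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo']" entry reflects.
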
44071 1727204715.71342: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 44071 1727204715.71472: in run() - task 127b8e07-fff9-c964-7471-000000002112 44071 1727204715.71486: variable 'ansible_search_path' from source: unknown 44071 1727204715.71497: variable 'ansible_search_path' from source: unknown 44071 1727204715.71528: calling self._execute() 44071 1727204715.71623: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.71629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.71642: variable 'omit' from source: magic vars 44071 1727204715.71966: variable 'ansible_distribution_major_version' from source: facts 44071 1727204715.71978: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204715.71985: variable 'omit' from source: magic vars 44071 1727204715.72024: variable 'omit' from source: magic vars 44071 1727204715.72114: variable '_current_interfaces' from source: set_fact 44071 1727204715.72371: variable 'omit' from source: magic vars 44071 1727204715.72375: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204715.72383: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204715.72385: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204715.72387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204715.72389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204715.72391: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204715.72393: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.72395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.72484: Set connection var ansible_connection to ssh 44071 1727204715.72497: Set connection var ansible_timeout to 10 44071 1727204715.72507: Set connection var ansible_pipelining to False 44071 1727204715.72516: Set connection var ansible_shell_type to sh 44071 1727204715.72524: Set connection var ansible_shell_executable to /bin/sh 44071 1727204715.72537: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204715.72568: variable 'ansible_shell_executable' from source: unknown 44071 1727204715.72576: variable 'ansible_connection' from source: unknown 44071 1727204715.72582: variable 'ansible_module_compression' from source: unknown 44071 1727204715.72588: variable 'ansible_shell_type' from source: unknown 44071 1727204715.72594: variable 'ansible_shell_executable' from source: unknown 44071 1727204715.72600: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.72607: variable 'ansible_pipelining' from source: unknown 44071 1727204715.72612: variable 'ansible_timeout' from source: unknown 44071 1727204715.72619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.72774: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 44071 1727204715.72784: variable 'omit' from source: magic vars 44071 1727204715.72790: starting attempt loop 44071 1727204715.72793: running the handler 44071 1727204715.72808: handler run complete 44071 1727204715.72818: attempt loop complete, returning result 44071 1727204715.72820: _execute() done 44071 1727204715.72823: dumping result to json 44071 1727204715.72826: done dumping result, returning 44071 1727204715.72834: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [127b8e07-fff9-c964-7471-000000002112] 44071 1727204715.72841: sending task result for task 127b8e07-fff9-c964-7471-000000002112 44071 1727204715.72941: done sending task result for task 127b8e07-fff9-c964-7471-000000002112 44071 1727204715.72944: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 44071 1727204715.73008: no more pending results, returning what we have 44071 1727204715.73012: results queue empty 44071 1727204715.73013: checking for any_errors_fatal 44071 1727204715.73025: done checking for any_errors_fatal 44071 1727204715.73025: checking for max_fail_percentage 44071 1727204715.73027: done checking for max_fail_percentage 44071 1727204715.73028: checking to see if all hosts have failed and the running result is not ok 44071 1727204715.73029: done checking to see if all hosts have failed 44071 1727204715.73029: getting the remaining hosts for this loop 44071 1727204715.73031: done getting the remaining hosts for this loop 44071 1727204715.73036: getting the next task for host managed-node2 44071 1727204715.73045: done getting next task for host managed-node2 44071 1727204715.73048: ^ task is: TASK: Show current_interfaces 44071 1727204715.73057: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204715.73061: getting variables 44071 1727204715.73062: in VariableManager get_vars() 44071 1727204715.73108: Calling all_inventory to load vars for managed-node2 44071 1727204715.73111: Calling groups_inventory to load vars for managed-node2 44071 1727204715.73114: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204715.73126: Calling all_plugins_play to load vars for managed-node2 44071 1727204715.73129: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204715.73132: Calling groups_plugins_play to load vars for managed-node2 44071 1727204715.74208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204715.75538: done with get_vars() 44071 1727204715.75568: done getting variables 44071 1727204715.75621: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:05:15 -0400 (0:00:00.049) 0:02:08.072 ***** 44071 1727204715.75648: entering _queue_task() for managed-node2/debug 44071 1727204715.75950: worker is 1 (out of 1 available) 44071 1727204715.75967: exiting _queue_task() for managed-node2/debug 44071 1727204715.75982: done queuing things up, now waiting for results queue to drain 44071 1727204715.75985: waiting for pending results... 44071 1727204715.76198: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 44071 1727204715.76290: in run() - task 127b8e07-fff9-c964-7471-0000000020d7 44071 1727204715.76305: variable 'ansible_search_path' from source: unknown 44071 1727204715.76308: variable 'ansible_search_path' from source: unknown 44071 1727204715.76346: calling self._execute() 44071 1727204715.76429: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.76442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.76452: variable 'omit' from source: magic vars 44071 1727204715.76767: variable 'ansible_distribution_major_version' from source: facts 44071 1727204715.76779: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204715.76785: variable 'omit' from source: magic vars 44071 1727204715.76821: variable 'omit' from source: magic vars 44071 1727204715.76898: variable 'current_interfaces' from source: set_fact 44071 1727204715.76921: variable 'omit' from source: magic vars 44071 1727204715.76959: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204715.76993: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204715.77012: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204715.77027: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204715.77039: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204715.77075: 
variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204715.77083: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.77087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.77169: Set connection var ansible_connection to ssh 44071 1727204715.77173: Set connection var ansible_timeout to 10 44071 1727204715.77180: Set connection var ansible_pipelining to False 44071 1727204715.77185: Set connection var ansible_shell_type to sh 44071 1727204715.77191: Set connection var ansible_shell_executable to /bin/sh 44071 1727204715.77204: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204715.77221: variable 'ansible_shell_executable' from source: unknown 44071 1727204715.77225: variable 'ansible_connection' from source: unknown 44071 1727204715.77227: variable 'ansible_module_compression' from source: unknown 44071 1727204715.77230: variable 'ansible_shell_type' from source: unknown 44071 1727204715.77233: variable 'ansible_shell_executable' from source: unknown 44071 1727204715.77235: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.77241: variable 'ansible_pipelining' from source: unknown 44071 1727204715.77244: variable 'ansible_timeout' from source: unknown 44071 1727204715.77248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.77380: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204715.77389: variable 'omit' from source: magic vars 44071 1727204715.77394: starting attempt loop 44071 1727204715.77397: running the handler 44071 1727204715.77443: handler run complete 44071 1727204715.77456: attempt loop complete, returning result 44071 1727204715.77459: _execute() done 44071 1727204715.77462: dumping result to json 44071 1727204715.77464: done dumping result, returning 44071 1727204715.77474: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [127b8e07-fff9-c964-7471-0000000020d7] 44071 1727204715.77479: sending task result for task 127b8e07-fff9-c964-7471-0000000020d7 44071 1727204715.77578: done sending task result for task 127b8e07-fff9-c964-7471-0000000020d7 44071 1727204715.77581: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 44071 1727204715.77637: no more pending results, returning what we have 44071 1727204715.77640: results queue empty 44071 1727204715.77641: checking for any_errors_fatal 44071 1727204715.77649: done checking for any_errors_fatal 44071 1727204715.77650: checking for max_fail_percentage 44071 1727204715.77657: done checking for max_fail_percentage 44071 1727204715.77658: checking to see if all hosts have failed and the running result is not ok 44071 1727204715.77659: done checking to see if all hosts have failed 44071 1727204715.77660: getting the remaining hosts for this loop 44071 1727204715.77661: done getting the remaining hosts for this loop 44071 1727204715.77669: getting the next task for host managed-node2 44071 1727204715.77679: done getting next task for host managed-node2 44071 1727204715.77682: ^ task is: TASK: Setup 44071 1727204715.77685: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204715.77691: getting variables 44071 1727204715.77693: in VariableManager get_vars() 44071 1727204715.77739: Calling all_inventory to load vars for managed-node2 44071 1727204715.77742: Calling groups_inventory to load vars for managed-node2 44071 1727204715.77745: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204715.77758: Calling all_plugins_play to load vars for managed-node2 44071 1727204715.77760: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204715.77763: Calling groups_plugins_play to load vars for managed-node2 44071 1727204715.78864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204715.81565: done with get_vars() 44071 1727204715.81609: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Tuesday 24 September 2024 15:05:15 -0400 (0:00:00.060) 0:02:08.133 ***** 44071 1727204715.81693: entering _queue_task() for managed-node2/include_tasks 44071 1727204715.81999: worker is 1 (out of 1 available) 44071 1727204715.82014: exiting _queue_task() for managed-node2/include_tasks 44071 1727204715.82031: done queuing things up, now waiting for results queue to drain 44071 1727204715.82032: waiting for pending results... 
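The "Setup" task queued above (run_test.yml:24) is an include_tasks that iterates over the lsr_setup list; the three item values can be read from the "included: ... => (item=...)" entries further down. A hedged sketch of the pattern, with the loop keyword and variable layout assumed:

    # Hypothetical sketch of the Setup include in run_test.yml; the lsr_setup
    # contents are taken from the "included ... (item=...)" log entries, the
    # loop form itself is assumed.
    - name: Setup
      include_tasks: "{{ item }}"
      loop: "{{ lsr_setup }}"
      # lsr_setup for this test run resolves to:
      #   - tasks/create_bridge_profile.yml
      #   - tasks/activate_profile.yml
      #   - tasks/remove+down_profile.yml
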
44071 1727204715.82241: running TaskExecutor() for managed-node2/TASK: Setup 44071 1727204715.82330: in run() - task 127b8e07-fff9-c964-7471-0000000020b0 44071 1727204715.82346: variable 'ansible_search_path' from source: unknown 44071 1727204715.82350: variable 'ansible_search_path' from source: unknown 44071 1727204715.82393: variable 'lsr_setup' from source: include params 44071 1727204715.82575: variable 'lsr_setup' from source: include params 44071 1727204715.82635: variable 'omit' from source: magic vars 44071 1727204715.82755: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.82764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.82775: variable 'omit' from source: magic vars 44071 1727204715.82980: variable 'ansible_distribution_major_version' from source: facts 44071 1727204715.82989: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204715.82996: variable 'item' from source: unknown 44071 1727204715.83051: variable 'item' from source: unknown 44071 1727204715.83079: variable 'item' from source: unknown 44071 1727204715.83134: variable 'item' from source: unknown 44071 1727204715.83414: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.83418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.83871: variable 'omit' from source: magic vars 44071 1727204715.83876: variable 'ansible_distribution_major_version' from source: facts 44071 1727204715.83879: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204715.83881: variable 'item' from source: unknown 44071 1727204715.83887: variable 'item' from source: unknown 44071 1727204715.83889: variable 'item' from source: unknown 44071 1727204715.83918: variable 'item' from source: unknown 44071 1727204715.84063: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204715.84078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204715.84092: variable 'omit' from source: magic vars 44071 1727204715.84259: variable 'ansible_distribution_major_version' from source: facts 44071 1727204715.84272: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204715.84281: variable 'item' from source: unknown 44071 1727204715.84349: variable 'item' from source: unknown 44071 1727204715.84392: variable 'item' from source: unknown 44071 1727204715.84459: variable 'item' from source: unknown 44071 1727204715.84670: dumping result to json 44071 1727204715.84674: done dumping result, returning 44071 1727204715.84677: done running TaskExecutor() for managed-node2/TASK: Setup [127b8e07-fff9-c964-7471-0000000020b0] 44071 1727204715.84680: sending task result for task 127b8e07-fff9-c964-7471-0000000020b0 44071 1727204715.84728: done sending task result for task 127b8e07-fff9-c964-7471-0000000020b0 44071 1727204715.84731: WORKER PROCESS EXITING 44071 1727204715.84797: no more pending results, returning what we have 44071 1727204715.84802: in VariableManager get_vars() 44071 1727204715.84855: Calling all_inventory to load vars for managed-node2 44071 1727204715.84858: Calling groups_inventory to load vars for managed-node2 44071 1727204715.84862: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204715.84879: Calling all_plugins_play to load vars for managed-node2 44071 1727204715.84882: Calling groups_plugins_inventory to load vars for 
managed-node2 44071 1727204715.84884: Calling groups_plugins_play to load vars for managed-node2 44071 1727204715.94113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204715.96357: done with get_vars() 44071 1727204715.96397: variable 'ansible_search_path' from source: unknown 44071 1727204715.96398: variable 'ansible_search_path' from source: unknown 44071 1727204715.96444: variable 'ansible_search_path' from source: unknown 44071 1727204715.96445: variable 'ansible_search_path' from source: unknown 44071 1727204715.96478: variable 'ansible_search_path' from source: unknown 44071 1727204715.96479: variable 'ansible_search_path' from source: unknown 44071 1727204715.96512: we have included files to process 44071 1727204715.96513: generating all_blocks data 44071 1727204715.96514: done generating all_blocks data 44071 1727204715.96518: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 44071 1727204715.96519: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 44071 1727204715.96521: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 44071 1727204715.96788: done processing included file 44071 1727204715.96791: iterating over new_blocks loaded from include file 44071 1727204715.96792: in VariableManager get_vars() 44071 1727204715.96813: done with get_vars() 44071 1727204715.96814: filtering new block on tags 44071 1727204715.96857: done filtering new block on tags 44071 1727204715.96860: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed-node2 => (item=tasks/create_bridge_profile.yml) 44071 1727204715.96867: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 44071 1727204715.96868: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 44071 1727204715.96872: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 44071 1727204715.96980: done processing included file 44071 1727204715.96982: iterating over new_blocks loaded from include file 44071 1727204715.96983: in VariableManager get_vars() 44071 1727204715.97003: done with get_vars() 44071 1727204715.97005: filtering new block on tags 44071 1727204715.97029: done filtering new block on tags 44071 1727204715.97031: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed-node2 => (item=tasks/activate_profile.yml) 44071 1727204715.97039: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 44071 1727204715.97040: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 44071 1727204715.97043: Loading data from 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 44071 1727204715.97280: done processing included file 44071 1727204715.97282: iterating over new_blocks loaded from include file 44071 1727204715.97284: in VariableManager get_vars() 44071 1727204715.97301: done with get_vars() 44071 1727204715.97303: filtering new block on tags 44071 1727204715.97329: done filtering new block on tags 44071 1727204715.97331: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml for managed-node2 => (item=tasks/remove+down_profile.yml) 44071 1727204715.97339: extending task lists for all hosts with included blocks 44071 1727204715.98614: done extending task lists 44071 1727204715.98616: done processing included files 44071 1727204715.98617: results queue empty 44071 1727204715.98617: checking for any_errors_fatal 44071 1727204715.98622: done checking for any_errors_fatal 44071 1727204715.98623: checking for max_fail_percentage 44071 1727204715.98624: done checking for max_fail_percentage 44071 1727204715.98625: checking to see if all hosts have failed and the running result is not ok 44071 1727204715.98626: done checking to see if all hosts have failed 44071 1727204715.98627: getting the remaining hosts for this loop 44071 1727204715.98628: done getting the remaining hosts for this loop 44071 1727204715.98631: getting the next task for host managed-node2 44071 1727204715.98638: done getting next task for host managed-node2 44071 1727204715.98641: ^ task is: TASK: Include network role 44071 1727204715.98644: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204715.98647: getting variables 44071 1727204715.98648: in VariableManager get_vars() 44071 1727204715.98664: Calling all_inventory to load vars for managed-node2 44071 1727204715.98669: Calling groups_inventory to load vars for managed-node2 44071 1727204715.98672: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204715.98678: Calling all_plugins_play to load vars for managed-node2 44071 1727204715.98681: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204715.98684: Calling groups_plugins_play to load vars for managed-node2 44071 1727204716.01832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204716.04831: done with get_vars() 44071 1727204716.04878: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Tuesday 24 September 2024 15:05:16 -0400 (0:00:00.232) 0:02:08.366 ***** 44071 1727204716.05175: entering _queue_task() for managed-node2/include_role 44071 1727204716.05832: worker is 1 (out of 1 available) 44071 1727204716.05850: exiting _queue_task() for managed-node2/include_role 44071 1727204716.05869: done queuing things up, now waiting for results queue to drain 44071 1727204716.05871: waiting for pending results... 44071 1727204716.06204: running TaskExecutor() for managed-node2/TASK: Include network role 44071 1727204716.06363: in run() - task 127b8e07-fff9-c964-7471-000000002139 44071 1727204716.06472: variable 'ansible_search_path' from source: unknown 44071 1727204716.06475: variable 'ansible_search_path' from source: unknown 44071 1727204716.06479: calling self._execute() 44071 1727204716.06557: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204716.06572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204716.06587: variable 'omit' from source: magic vars 44071 1727204716.07008: variable 'ansible_distribution_major_version' from source: facts 44071 1727204716.07027: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204716.07040: _execute() done 44071 1727204716.07050: dumping result to json 44071 1727204716.07058: done dumping result, returning 44071 1727204716.07070: done running TaskExecutor() for managed-node2/TASK: Include network role [127b8e07-fff9-c964-7471-000000002139] 44071 1727204716.07081: sending task result for task 127b8e07-fff9-c964-7471-000000002139 44071 1727204716.07401: no more pending results, returning what we have 44071 1727204716.07408: in VariableManager get_vars() 44071 1727204716.07472: Calling all_inventory to load vars for managed-node2 44071 1727204716.07476: Calling groups_inventory to load vars for managed-node2 44071 1727204716.07480: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204716.07497: Calling all_plugins_play to load vars for managed-node2 44071 1727204716.07500: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204716.07504: Calling groups_plugins_play to load vars for managed-node2 44071 1727204716.08086: done sending task result for task 127b8e07-fff9-c964-7471-000000002139 44071 1727204716.08090: WORKER PROCESS EXITING 44071 1727204716.09637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 
1727204716.11998: done with get_vars() 44071 1727204716.12039: variable 'ansible_search_path' from source: unknown 44071 1727204716.12041: variable 'ansible_search_path' from source: unknown 44071 1727204716.12286: variable 'omit' from source: magic vars 44071 1727204716.12337: variable 'omit' from source: magic vars 44071 1727204716.12356: variable 'omit' from source: magic vars 44071 1727204716.12362: we have included files to process 44071 1727204716.12362: generating all_blocks data 44071 1727204716.12365: done generating all_blocks data 44071 1727204716.12368: processing included file: fedora.linux_system_roles.network 44071 1727204716.12397: in VariableManager get_vars() 44071 1727204716.12419: done with get_vars() 44071 1727204716.12454: in VariableManager get_vars() 44071 1727204716.12478: done with get_vars() 44071 1727204716.12526: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 44071 1727204716.12679: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 44071 1727204716.12782: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 44071 1727204716.13390: in VariableManager get_vars() 44071 1727204716.13418: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204716.15830: iterating over new_blocks loaded from include file 44071 1727204716.15836: in VariableManager get_vars() 44071 1727204716.15863: done with get_vars() 44071 1727204716.15867: filtering new block on tags 44071 1727204716.16342: done filtering new block on tags 44071 1727204716.16347: in VariableManager get_vars() 44071 1727204716.16372: done with get_vars() 44071 1727204716.16374: filtering new block on tags 44071 1727204716.16396: done filtering new block on tags 44071 1727204716.16398: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 44071 1727204716.16404: extending task lists for all hosts with included blocks 44071 1727204716.16602: done extending task lists 44071 1727204716.16603: done processing included files 44071 1727204716.16604: results queue empty 44071 1727204716.16605: checking for any_errors_fatal 44071 1727204716.16610: done checking for any_errors_fatal 44071 1727204716.16611: checking for max_fail_percentage 44071 1727204716.16612: done checking for max_fail_percentage 44071 1727204716.16613: checking to see if all hosts have failed and the running result is not ok 44071 1727204716.16613: done checking to see if all hosts have failed 44071 1727204716.16614: getting the remaining hosts for this loop 44071 1727204716.16616: done getting the remaining hosts for this loop 44071 1727204716.16619: getting the next task for host managed-node2 44071 1727204716.16624: done getting next task for host managed-node2 44071 1727204716.16627: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204716.16631: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204716.16651: getting variables 44071 1727204716.16652: in VariableManager get_vars() 44071 1727204716.16672: Calling all_inventory to load vars for managed-node2 44071 1727204716.16675: Calling groups_inventory to load vars for managed-node2 44071 1727204716.16677: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204716.16684: Calling all_plugins_play to load vars for managed-node2 44071 1727204716.16686: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204716.16689: Calling groups_plugins_play to load vars for managed-node2 44071 1727204716.18304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204716.20520: done with get_vars() 44071 1727204716.20719: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:05:16 -0400 (0:00:00.157) 0:02:08.525 ***** 44071 1727204716.20945: entering _queue_task() for managed-node2/include_tasks 44071 1727204716.21917: worker is 1 (out of 1 available) 44071 1727204716.21937: exiting _queue_task() for managed-node2/include_tasks 44071 1727204716.21952: done queuing things up, now waiting for results queue to drain 44071 1727204716.22172: waiting for pending results... 
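The role entry point queued above (roles/network/tasks/main.yml:4) is an include_tasks that pulls in set_facts.yml, and set_facts.yml:3 then runs a setup task ("Ensure ansible_facts used by role are present", visible below as _queue_task() for managed-node2/setup). A hedged sketch of that guard pattern follows; the gather_subset value, the __network_required_facts variable, and the when expression are assumptions about a typical "gather only what is missing" check, not copied from the role:

    # Hypothetical sketch; only the task name and the setup module are confirmed
    # by the log, the subset and condition below are assumptions.
    - name: Ensure ansible_facts used by role are present
      setup:
        gather_subset: min
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
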
44071 1727204716.22527: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204716.22735: in run() - task 127b8e07-fff9-c964-7471-0000000021a3 44071 1727204716.22775: variable 'ansible_search_path' from source: unknown 44071 1727204716.22784: variable 'ansible_search_path' from source: unknown 44071 1727204716.22838: calling self._execute() 44071 1727204716.23063: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204716.23070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204716.23087: variable 'omit' from source: magic vars 44071 1727204716.23753: variable 'ansible_distribution_major_version' from source: facts 44071 1727204716.23779: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204716.23792: _execute() done 44071 1727204716.23804: dumping result to json 44071 1727204716.23815: done dumping result, returning 44071 1727204716.23873: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-c964-7471-0000000021a3] 44071 1727204716.23877: sending task result for task 127b8e07-fff9-c964-7471-0000000021a3 44071 1727204716.24219: done sending task result for task 127b8e07-fff9-c964-7471-0000000021a3 44071 1727204716.24222: WORKER PROCESS EXITING 44071 1727204716.24280: no more pending results, returning what we have 44071 1727204716.24286: in VariableManager get_vars() 44071 1727204716.24348: Calling all_inventory to load vars for managed-node2 44071 1727204716.24352: Calling groups_inventory to load vars for managed-node2 44071 1727204716.24355: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204716.24371: Calling all_plugins_play to load vars for managed-node2 44071 1727204716.24375: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204716.24379: Calling groups_plugins_play to load vars for managed-node2 44071 1727204716.27355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204716.31125: done with get_vars() 44071 1727204716.31169: variable 'ansible_search_path' from source: unknown 44071 1727204716.31171: variable 'ansible_search_path' from source: unknown 44071 1727204716.31227: we have included files to process 44071 1727204716.31228: generating all_blocks data 44071 1727204716.31230: done generating all_blocks data 44071 1727204716.31234: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204716.31236: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204716.31238: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204716.31925: done processing included file 44071 1727204716.31927: iterating over new_blocks loaded from include file 44071 1727204716.31929: in VariableManager get_vars() 44071 1727204716.31961: done with get_vars() 44071 1727204716.31963: filtering new block on tags 44071 1727204716.32007: done filtering new block on tags 44071 1727204716.32011: in VariableManager get_vars() 44071 1727204716.32042: done with get_vars() 44071 1727204716.32044: filtering new block on tags 44071 1727204716.32147: done filtering new block on tags 44071 1727204716.32151: in 
VariableManager get_vars() 44071 1727204716.32192: done with get_vars() 44071 1727204716.32198: filtering new block on tags 44071 1727204716.32250: done filtering new block on tags 44071 1727204716.32253: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 44071 1727204716.32259: extending task lists for all hosts with included blocks 44071 1727204716.34841: done extending task lists 44071 1727204716.34844: done processing included files 44071 1727204716.34845: results queue empty 44071 1727204716.34846: checking for any_errors_fatal 44071 1727204716.34851: done checking for any_errors_fatal 44071 1727204716.34851: checking for max_fail_percentage 44071 1727204716.34853: done checking for max_fail_percentage 44071 1727204716.34854: checking to see if all hosts have failed and the running result is not ok 44071 1727204716.34855: done checking to see if all hosts have failed 44071 1727204716.34856: getting the remaining hosts for this loop 44071 1727204716.34857: done getting the remaining hosts for this loop 44071 1727204716.34860: getting the next task for host managed-node2 44071 1727204716.34870: done getting next task for host managed-node2 44071 1727204716.34873: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204716.34879: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204716.34893: getting variables 44071 1727204716.34894: in VariableManager get_vars() 44071 1727204716.34922: Calling all_inventory to load vars for managed-node2 44071 1727204716.34925: Calling groups_inventory to load vars for managed-node2 44071 1727204716.34927: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204716.34934: Calling all_plugins_play to load vars for managed-node2 44071 1727204716.34936: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204716.34939: Calling groups_plugins_play to load vars for managed-node2 44071 1727204716.36784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204716.39755: done with get_vars() 44071 1727204716.39809: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:05:16 -0400 (0:00:00.189) 0:02:08.715 ***** 44071 1727204716.39919: entering _queue_task() for managed-node2/setup 44071 1727204716.40508: worker is 1 (out of 1 available) 44071 1727204716.40521: exiting _queue_task() for managed-node2/setup 44071 1727204716.40537: done queuing things up, now waiting for results queue to drain 44071 1727204716.40539: waiting for pending results... 44071 1727204716.40782: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204716.40993: in run() - task 127b8e07-fff9-c964-7471-000000002200 44071 1727204716.41013: variable 'ansible_search_path' from source: unknown 44071 1727204716.41017: variable 'ansible_search_path' from source: unknown 44071 1727204716.41062: calling self._execute() 44071 1727204716.41181: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204716.41220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204716.41250: variable 'omit' from source: magic vars 44071 1727204716.42189: variable 'ansible_distribution_major_version' from source: facts 44071 1727204716.42204: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204716.42883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204716.45891: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204716.45980: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204716.46031: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204716.46074: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204716.46276: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204716.46280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204716.46283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 44071 1727204716.46286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204716.46330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204716.46354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204716.46413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204716.46451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204716.46481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204716.46524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204716.46547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204716.46744: variable '__network_required_facts' from source: role '' defaults 44071 1727204716.46760: variable 'ansible_facts' from source: unknown 44071 1727204716.48381: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 44071 1727204716.48386: when evaluation is False, skipping this task 44071 1727204716.48389: _execute() done 44071 1727204716.48392: dumping result to json 44071 1727204716.48394: done dumping result, returning 44071 1727204716.48402: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-c964-7471-000000002200] 44071 1727204716.48476: sending task result for task 127b8e07-fff9-c964-7471-000000002200 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204716.48647: no more pending results, returning what we have 44071 1727204716.48652: results queue empty 44071 1727204716.48654: checking for any_errors_fatal 44071 1727204716.48657: done checking for any_errors_fatal 44071 1727204716.48658: checking for max_fail_percentage 44071 1727204716.48660: done checking for max_fail_percentage 44071 1727204716.48661: checking to see if all hosts have failed and the running result is not ok 44071 1727204716.48662: done checking to see if all hosts have failed 44071 1727204716.48663: getting the remaining hosts for this loop 44071 1727204716.48664: done getting the remaining hosts for this loop 44071 1727204716.48677: getting the next task for host managed-node2 44071 1727204716.48691: done getting next task for host 
managed-node2 44071 1727204716.48696: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204716.48703: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204716.48729: getting variables 44071 1727204716.48732: in VariableManager get_vars() 44071 1727204716.48919: Calling all_inventory to load vars for managed-node2 44071 1727204716.48923: Calling groups_inventory to load vars for managed-node2 44071 1727204716.48925: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204716.48943: Calling all_plugins_play to load vars for managed-node2 44071 1727204716.48947: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204716.48952: Calling groups_plugins_play to load vars for managed-node2 44071 1727204716.49493: done sending task result for task 127b8e07-fff9-c964-7471-000000002200 44071 1727204716.49505: WORKER PROCESS EXITING 44071 1727204716.51564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204716.54041: done with get_vars() 44071 1727204716.54088: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:05:16 -0400 (0:00:00.142) 0:02:08.858 ***** 44071 1727204716.54220: entering _queue_task() for managed-node2/stat 44071 1727204716.54775: worker is 1 (out of 1 available) 44071 1727204716.54789: exiting _queue_task() for managed-node2/stat 44071 1727204716.54802: done queuing things up, now waiting for results queue to drain 44071 1727204716.54804: waiting for pending results... 
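The setup task above ("Ensure ansible_facts used by role are present") was skipped because every fact named in __network_required_facts was already present, so the difference filter produced an empty list. A minimal sketch of a task with that guard, where only the task name, the setup action, and the when expression are taken from the trace (the gather_subset value is an assumption):

    - name: Ensure ansible_facts used by role are present
      setup:
        gather_subset: min   # assumed value, not shown in this trace
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0

Its skipped result is censored in the output above because no_log: true is set for that task.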
44071 1727204716.55074: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204716.55254: in run() - task 127b8e07-fff9-c964-7471-000000002202 44071 1727204716.55275: variable 'ansible_search_path' from source: unknown 44071 1727204716.55281: variable 'ansible_search_path' from source: unknown 44071 1727204716.55326: calling self._execute() 44071 1727204716.55453: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204716.55460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204716.55472: variable 'omit' from source: magic vars 44071 1727204716.55945: variable 'ansible_distribution_major_version' from source: facts 44071 1727204716.55963: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204716.56171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204716.56497: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204716.56601: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204716.56614: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204716.56655: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204716.56755: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204716.56784: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204716.56816: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204716.56927: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204716.56960: variable '__network_is_ostree' from source: set_fact 44071 1727204716.56972: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204716.56975: when evaluation is False, skipping this task 44071 1727204716.56998: _execute() done 44071 1727204716.57002: dumping result to json 44071 1727204716.57004: done dumping result, returning 44071 1727204716.57014: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-c964-7471-000000002202] 44071 1727204716.57019: sending task result for task 127b8e07-fff9-c964-7471-000000002202 44071 1727204716.57133: done sending task result for task 127b8e07-fff9-c964-7471-000000002202 44071 1727204716.57138: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204716.57328: no more pending results, returning what we have 44071 1727204716.57332: results queue empty 44071 1727204716.57336: checking for any_errors_fatal 44071 1727204716.57342: done checking for any_errors_fatal 44071 1727204716.57343: checking for 
max_fail_percentage 44071 1727204716.57345: done checking for max_fail_percentage 44071 1727204716.57346: checking to see if all hosts have failed and the running result is not ok 44071 1727204716.57347: done checking to see if all hosts have failed 44071 1727204716.57348: getting the remaining hosts for this loop 44071 1727204716.57349: done getting the remaining hosts for this loop 44071 1727204716.57354: getting the next task for host managed-node2 44071 1727204716.57363: done getting next task for host managed-node2 44071 1727204716.57369: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204716.57375: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204716.57396: getting variables 44071 1727204716.57398: in VariableManager get_vars() 44071 1727204716.57451: Calling all_inventory to load vars for managed-node2 44071 1727204716.57454: Calling groups_inventory to load vars for managed-node2 44071 1727204716.57456: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204716.57487: Calling all_plugins_play to load vars for managed-node2 44071 1727204716.57492: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204716.57496: Calling groups_plugins_play to load vars for managed-node2 44071 1727204716.60586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204716.65368: done with get_vars() 44071 1727204716.65413: done getting variables 44071 1727204716.65484: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:05:16 -0400 (0:00:00.113) 0:02:08.971 ***** 44071 1727204716.65528: entering _queue_task() for managed-node2/set_fact 44071 1727204716.65956: worker is 1 (out of 1 available) 44071 1727204716.65976: exiting _queue_task() for managed-node2/set_fact 44071 1727204716.65995: done queuing things up, now waiting for results queue to drain 44071 1727204716.65997: waiting for pending results... 44071 1727204716.66488: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204716.66494: in run() - task 127b8e07-fff9-c964-7471-000000002203 44071 1727204716.66509: variable 'ansible_search_path' from source: unknown 44071 1727204716.66517: variable 'ansible_search_path' from source: unknown 44071 1727204716.66570: calling self._execute() 44071 1727204716.66726: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204716.66744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204716.66764: variable 'omit' from source: magic vars 44071 1727204716.67212: variable 'ansible_distribution_major_version' from source: facts 44071 1727204716.67240: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204716.67441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204716.67792: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204716.67856: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204716.68004: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204716.68008: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204716.68074: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204716.68108: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204716.68148: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204716.68187: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204716.68309: variable '__network_is_ostree' from source: set_fact 44071 1727204716.68324: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204716.68342: when evaluation is False, skipping this task 44071 1727204716.68351: _execute() done 44071 1727204716.68359: dumping result to json 44071 1727204716.68369: done dumping result, returning 44071 1727204716.68382: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-c964-7471-000000002203] 44071 1727204716.68443: sending task result for task 127b8e07-fff9-c964-7471-000000002203 44071 1727204716.68535: done sending task result for task 127b8e07-fff9-c964-7471-000000002203 44071 1727204716.68539: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204716.68599: no more pending results, returning what we have 44071 1727204716.68603: results queue empty 44071 1727204716.68604: checking for any_errors_fatal 44071 1727204716.68615: done checking for any_errors_fatal 44071 1727204716.68615: checking for max_fail_percentage 44071 1727204716.68617: done checking for max_fail_percentage 44071 1727204716.68618: checking to see if all hosts have failed and the running result is not ok 44071 1727204716.68619: done checking to see if all hosts have failed 44071 1727204716.68620: getting the remaining hosts for this loop 44071 1727204716.68621: done getting the remaining hosts for this loop 44071 1727204716.68626: getting the next task for host managed-node2 44071 1727204716.68638: done getting next task for host managed-node2 44071 1727204716.68642: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204716.68649: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204716.68873: getting variables 44071 1727204716.68875: in VariableManager get_vars() 44071 1727204716.68921: Calling all_inventory to load vars for managed-node2 44071 1727204716.68924: Calling groups_inventory to load vars for managed-node2 44071 1727204716.68927: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204716.68937: Calling all_plugins_play to load vars for managed-node2 44071 1727204716.68940: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204716.68943: Calling groups_plugins_play to load vars for managed-node2 44071 1727204716.72422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204716.76054: done with get_vars() 44071 1727204716.76105: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:05:16 -0400 (0:00:00.106) 0:02:09.078 ***** 44071 1727204716.76225: entering _queue_task() for managed-node2/service_facts 44071 1727204716.76763: worker is 1 (out of 1 available) 44071 1727204716.76882: exiting _queue_task() for managed-node2/service_facts 44071 1727204716.76896: done queuing things up, now waiting for results queue to drain 44071 1727204716.76898: waiting for pending results... 
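Both ostree tasks above were skipped because __network_is_ostree was already set earlier in this run (its source is shown as set_fact), so the guard not __network_is_ostree is defined evaluates to False. A minimal sketch of that detection pair, assuming a stat-based check; the module names, the fact name, and the when expression come from the trace, while the stat path and the register name are assumptions:

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted   # assumed path
      register: __ostree_booted_stat   # assumed register name
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined

The service_facts task queued above then populates ansible_facts.services on the managed node, which is what the large JSON result later in this trace contains.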
44071 1727204716.77510: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204716.77517: in run() - task 127b8e07-fff9-c964-7471-000000002205 44071 1727204716.77520: variable 'ansible_search_path' from source: unknown 44071 1727204716.77523: variable 'ansible_search_path' from source: unknown 44071 1727204716.77773: calling self._execute() 44071 1727204716.77778: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204716.77781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204716.77785: variable 'omit' from source: magic vars 44071 1727204716.78561: variable 'ansible_distribution_major_version' from source: facts 44071 1727204716.78569: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204716.78572: variable 'omit' from source: magic vars 44071 1727204716.78575: variable 'omit' from source: magic vars 44071 1727204716.78673: variable 'omit' from source: magic vars 44071 1727204716.78678: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204716.78736: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204716.78741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204716.78744: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204716.78746: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204716.78872: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204716.78877: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204716.78880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204716.79173: Set connection var ansible_connection to ssh 44071 1727204716.79176: Set connection var ansible_timeout to 10 44071 1727204716.79179: Set connection var ansible_pipelining to False 44071 1727204716.79182: Set connection var ansible_shell_type to sh 44071 1727204716.79184: Set connection var ansible_shell_executable to /bin/sh 44071 1727204716.79187: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204716.79189: variable 'ansible_shell_executable' from source: unknown 44071 1727204716.79191: variable 'ansible_connection' from source: unknown 44071 1727204716.79194: variable 'ansible_module_compression' from source: unknown 44071 1727204716.79196: variable 'ansible_shell_type' from source: unknown 44071 1727204716.79198: variable 'ansible_shell_executable' from source: unknown 44071 1727204716.79200: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204716.79201: variable 'ansible_pipelining' from source: unknown 44071 1727204716.79203: variable 'ansible_timeout' from source: unknown 44071 1727204716.79205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204716.79734: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204716.79739: variable 'omit' from source: magic vars 44071 
1727204716.79742: starting attempt loop 44071 1727204716.79745: running the handler 44071 1727204716.79748: _low_level_execute_command(): starting 44071 1727204716.79750: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204716.81091: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204716.81115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204716.81294: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204716.81484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204716.83291: stdout chunk (state=3): >>>/root <<< 44071 1727204716.83397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204716.83401: stdout chunk (state=3): >>><<< 44071 1727204716.83412: stderr chunk (state=3): >>><<< 44071 1727204716.83559: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204716.83581: _low_level_execute_command(): starting 44071 1727204716.83590: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204716.8356197-51283-140015010238316 `" && echo ansible-tmp-1727204716.8356197-51283-140015010238316="` echo /root/.ansible/tmp/ansible-tmp-1727204716.8356197-51283-140015010238316 `" ) && sleep 0' 
44071 1727204716.85075: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204716.85081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44071 1727204716.85163: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204716.85178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204716.85204: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204716.85313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204716.87320: stdout chunk (state=3): >>>ansible-tmp-1727204716.8356197-51283-140015010238316=/root/.ansible/tmp/ansible-tmp-1727204716.8356197-51283-140015010238316 <<< 44071 1727204716.87500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204716.87526: stderr chunk (state=3): >>><<< 44071 1727204716.87530: stdout chunk (state=3): >>><<< 44071 1727204716.87554: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204716.8356197-51283-140015010238316=/root/.ansible/tmp/ansible-tmp-1727204716.8356197-51283-140015010238316 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204716.87653: variable 'ansible_module_compression' from source: unknown 44071 1727204716.87706: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 44071 1727204716.87911: variable 'ansible_facts' from source: unknown 44071 1727204716.88077: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204716.8356197-51283-140015010238316/AnsiballZ_service_facts.py 44071 1727204716.88472: Sending initial data 44071 1727204716.88476: Sent initial data (162 bytes) 44071 1727204716.90084: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204716.90150: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204716.90177: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204716.90255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204716.91882: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204716.91939: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204716.92019: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpbwbikomu /root/.ansible/tmp/ansible-tmp-1727204716.8356197-51283-140015010238316/AnsiballZ_service_facts.py <<< 44071 1727204716.92023: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204716.8356197-51283-140015010238316/AnsiballZ_service_facts.py" <<< 44071 1727204716.92110: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpbwbikomu" to remote "/root/.ansible/tmp/ansible-tmp-1727204716.8356197-51283-140015010238316/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204716.8356197-51283-140015010238316/AnsiballZ_service_facts.py" <<< 44071 1727204716.93844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204716.93852: stderr chunk (state=3): >>><<< 44071 1727204716.93855: stdout chunk (state=3): >>><<< 44071 1727204716.93883: done transferring module to remote 44071 1727204716.93895: _low_level_execute_command(): starting 44071 1727204716.93901: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204716.8356197-51283-140015010238316/ /root/.ansible/tmp/ansible-tmp-1727204716.8356197-51283-140015010238316/AnsiballZ_service_facts.py && sleep 0' 44071 1727204716.95345: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204716.95352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204716.95355: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204716.95358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204716.95361: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204716.95415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204716.95474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204716.97403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204716.97488: stderr chunk (state=3): >>><<< 44071 1727204716.97492: stdout chunk (state=3): >>><<< 44071 1727204716.97510: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204716.97514: _low_level_execute_command(): starting 44071 1727204716.97519: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204716.8356197-51283-140015010238316/AnsiballZ_service_facts.py && sleep 0' 44071 1727204716.98863: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204716.98873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204716.98877: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204716.98879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204716.98918: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204716.98986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204716.99103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204719.21591: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": 
{"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": 
"restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", 
"status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, 
"blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root<<< 44071 1727204719.21636: stdout chunk (state=3): >>>.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": 
{"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 44071 1727204719.23348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204719.23373: stderr chunk (state=3): >>>Shared connection to 10.31.47.73 closed. <<< 44071 1727204719.23673: stdout chunk (state=3): >>><<< 44071 1727204719.23677: stderr chunk (state=3): >>><<< 44071 1727204719.23682: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", 
"state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": 
"systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, 
"systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
44071 1727204719.26075: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204716.8356197-51283-140015010238316/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204719.26080: _low_level_execute_command(): starting 44071 1727204719.26083: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204716.8356197-51283-140015010238316/ > /dev/null 2>&1 && sleep 0' 44071 1727204719.27574: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204719.27578: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204719.27581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204719.27583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204719.27625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204719.27702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204719.27727: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204719.28059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204719.29946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204719.29951: stdout chunk (state=3): >>><<< 44071 1727204719.29953: stderr chunk (state=3): >>><<< 44071 1727204719.29975: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204719.29981: handler run complete 44071 1727204719.30243: variable 'ansible_facts' from source: unknown 44071 1727204719.30452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204719.31877: variable 'ansible_facts' from source: unknown 44071 1727204719.32224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204719.32657: attempt loop complete, returning result 44071 1727204719.32664: _execute() done 44071 1727204719.32669: dumping result to json 44071 1727204719.32748: done dumping result, returning 44071 1727204719.32759: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-c964-7471-000000002205] 44071 1727204719.32762: sending task result for task 127b8e07-fff9-c964-7471-000000002205 44071 1727204719.34593: done sending task result for task 127b8e07-fff9-c964-7471-000000002205 44071 1727204719.34597: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204719.34770: no more pending results, returning what we have 44071 1727204719.34774: results queue empty 44071 1727204719.34775: checking for any_errors_fatal 44071 1727204719.34780: done checking for any_errors_fatal 44071 1727204719.34781: checking for max_fail_percentage 44071 1727204719.34783: done checking for max_fail_percentage 44071 1727204719.34784: checking to see if all hosts have failed and the running result is not ok 44071 1727204719.34784: done checking to see if all hosts have failed 44071 1727204719.34785: getting the remaining hosts for this loop 44071 1727204719.34786: done getting the remaining hosts for this loop 44071 1727204719.34791: getting the next task for host managed-node2 44071 1727204719.34799: done getting next task for host managed-node2 44071 1727204719.34802: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204719.34810: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204719.34824: getting variables 44071 1727204719.34825: in VariableManager get_vars() 44071 1727204719.35176: Calling all_inventory to load vars for managed-node2 44071 1727204719.35180: Calling groups_inventory to load vars for managed-node2 44071 1727204719.35183: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204719.35200: Calling all_plugins_play to load vars for managed-node2 44071 1727204719.35204: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204719.35213: Calling groups_plugins_play to load vars for managed-node2 44071 1727204719.41830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204719.47691: done with get_vars() 44071 1727204719.47754: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:05:19 -0400 (0:00:02.717) 0:02:11.795 ***** 44071 1727204719.47954: entering _queue_task() for managed-node2/package_facts 44071 1727204719.48522: worker is 1 (out of 1 available) 44071 1727204719.48542: exiting _queue_task() for managed-node2/package_facts 44071 1727204719.48559: done queuing things up, now waiting for results queue to drain 44071 1727204719.48561: waiting for pending results... 
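[Editor's note] The next task queued is "Check which packages are installed" at set_facts.yml:26, run on managed-node2 via the package_facts module. A minimal sketch of such a task follows; whether the distribution check is attached to this task or inherited from an enclosing block is not visible in this log, so treat the placement as an assumption:

    - name: Check which packages are installed      # illustrative; exact task text not shown in this log
      ansible.builtin.package_facts:
      when: ansible_distribution_major_version != '6'   # conditional evaluated to True for this task in the log that follows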
44071 1727204719.49027: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204719.49228: in run() - task 127b8e07-fff9-c964-7471-000000002206 44071 1727204719.49262: variable 'ansible_search_path' from source: unknown 44071 1727204719.49335: variable 'ansible_search_path' from source: unknown 44071 1727204719.49339: calling self._execute() 44071 1727204719.49477: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204719.49490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204719.49503: variable 'omit' from source: magic vars 44071 1727204719.50052: variable 'ansible_distribution_major_version' from source: facts 44071 1727204719.50077: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204719.50095: variable 'omit' from source: magic vars 44071 1727204719.50201: variable 'omit' from source: magic vars 44071 1727204719.50270: variable 'omit' from source: magic vars 44071 1727204719.50336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204719.50418: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204719.50422: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204719.50448: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204719.50469: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204719.50509: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204719.50553: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204719.50557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204719.50680: Set connection var ansible_connection to ssh 44071 1727204719.50694: Set connection var ansible_timeout to 10 44071 1727204719.50741: Set connection var ansible_pipelining to False 44071 1727204719.50745: Set connection var ansible_shell_type to sh 44071 1727204719.50747: Set connection var ansible_shell_executable to /bin/sh 44071 1727204719.50759: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204719.50799: variable 'ansible_shell_executable' from source: unknown 44071 1727204719.50850: variable 'ansible_connection' from source: unknown 44071 1727204719.50854: variable 'ansible_module_compression' from source: unknown 44071 1727204719.50856: variable 'ansible_shell_type' from source: unknown 44071 1727204719.50858: variable 'ansible_shell_executable' from source: unknown 44071 1727204719.50860: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204719.50863: variable 'ansible_pipelining' from source: unknown 44071 1727204719.50871: variable 'ansible_timeout' from source: unknown 44071 1727204719.50874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204719.51133: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204719.51154: variable 'omit' from source: magic vars 44071 
1727204719.51200: starting attempt loop 44071 1727204719.51205: running the handler 44071 1727204719.51212: _low_level_execute_command(): starting 44071 1727204719.51227: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204719.52196: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204719.52236: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204719.52369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204719.52376: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204719.52406: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204719.52421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204719.52443: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204719.52609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204719.54402: stdout chunk (state=3): >>>/root <<< 44071 1727204719.54840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204719.54845: stdout chunk (state=3): >>><<< 44071 1727204719.54859: stderr chunk (state=3): >>><<< 44071 1727204719.54928: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204719.55189: _low_level_execute_command(): starting 44071 1727204719.55200: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204719.5493886-51381-101859437013208 `" && echo ansible-tmp-1727204719.5493886-51381-101859437013208="` echo /root/.ansible/tmp/ansible-tmp-1727204719.5493886-51381-101859437013208 `" ) && sleep 0' 44071 1727204719.57058: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204719.57199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204719.57271: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204719.57297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204719.57505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204719.59486: stdout chunk (state=3): >>>ansible-tmp-1727204719.5493886-51381-101859437013208=/root/.ansible/tmp/ansible-tmp-1727204719.5493886-51381-101859437013208 <<< 44071 1727204719.59638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204719.59774: stderr chunk (state=3): >>><<< 44071 1727204719.59789: stdout chunk (state=3): >>><<< 44071 1727204719.59836: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204719.5493886-51381-101859437013208=/root/.ansible/tmp/ansible-tmp-1727204719.5493886-51381-101859437013208 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204719.59943: variable 'ansible_module_compression' from source: unknown 44071 1727204719.59985: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 44071 1727204719.60069: variable 'ansible_facts' from source: unknown 44071 1727204719.60310: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204719.5493886-51381-101859437013208/AnsiballZ_package_facts.py 44071 1727204719.60498: Sending initial data 44071 1727204719.60545: Sent initial data (162 bytes) 44071 1727204719.61501: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204719.61510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204719.61558: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204719.61591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204719.61659: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204719.61771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204719.63430: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204719.63549: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204719.63656: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpu89axq0_ /root/.ansible/tmp/ansible-tmp-1727204719.5493886-51381-101859437013208/AnsiballZ_package_facts.py <<< 44071 1727204719.63660: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204719.5493886-51381-101859437013208/AnsiballZ_package_facts.py" <<< 44071 1727204719.63759: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpu89axq0_" to remote "/root/.ansible/tmp/ansible-tmp-1727204719.5493886-51381-101859437013208/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204719.5493886-51381-101859437013208/AnsiballZ_package_facts.py" <<< 44071 1727204719.65890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204719.65974: stderr chunk (state=3): >>><<< 44071 1727204719.65977: stdout chunk (state=3): >>><<< 44071 1727204719.65983: done transferring module to remote 44071 1727204719.66001: _low_level_execute_command(): starting 44071 1727204719.66010: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204719.5493886-51381-101859437013208/ /root/.ansible/tmp/ansible-tmp-1727204719.5493886-51381-101859437013208/AnsiballZ_package_facts.py && sleep 0' 44071 1727204719.67039: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204719.67160: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204719.67200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204719.67290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204719.67307: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204719.67362: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204719.67606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204719.69791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204719.69796: stderr chunk (state=3): >>><<< 44071 1727204719.69798: stdout chunk (state=3): >>><<< 44071 1727204719.69845: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204719.69850: _low_level_execute_command(): starting 44071 1727204719.69852: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204719.5493886-51381-101859437013208/AnsiballZ_package_facts.py && sleep 0' 44071 1727204719.71398: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204719.71402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204719.71487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204719.71510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204719.71523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204719.71623: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204719.71738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204720.34084: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, <<< 44071 1727204720.34153: stdout chunk (state=3): >>>"arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": 
"20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", 
"release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.<<< 44071 1727204720.34281: stdout chunk (state=3): >>>fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", 
"release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", 
"version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": 
"6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1"<<< 44071 1727204720.34304: stdout chunk (state=3): >>>, "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "<<< 44071 1727204720.34318: stdout chunk (state=3): >>>rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarc<<< 44071 1727204720.34326: stdout chunk (state=3): >>>h", "source": "rpm"}], "perl-Symbol": 
[{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": 
"perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", 
"version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", 
"version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoc<<< 44071 1727204720.34378: stdout chunk (state=3): >>>h": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", 
"version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": 
"24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 44071 1727204720.36284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204720.36343: stderr chunk (state=3): >>><<< 44071 1727204720.36347: stdout chunk (state=3): >>><<< 44071 1727204720.36389: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": 
"libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", 
"release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": 
[{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": 
"3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", 
"version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", 
"version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": 
"1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": 
"wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": 
"python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204720.38502: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204719.5493886-51381-101859437013208/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204720.38517: _low_level_execute_command(): starting 44071 1727204720.38521: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204719.5493886-51381-101859437013208/ > /dev/null 2>&1 && sleep 0' 44071 1727204720.39049: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204720.39055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204720.39058: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204720.39060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 44071 1727204720.39117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204720.39121: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204720.39202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204720.41154: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204720.41215: stderr chunk (state=3): >>><<< 44071 1727204720.41219: stdout chunk (state=3): >>><<< 44071 1727204720.41236: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204720.41244: handler run complete 44071 1727204720.41922: variable 'ansible_facts' from source: unknown 44071 1727204720.42313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204720.44527: variable 'ansible_facts' from source: unknown 44071 1727204720.44979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204720.45689: attempt loop complete, returning result 44071 1727204720.45710: _execute() done 44071 1727204720.45714: dumping result to json 44071 1727204720.45898: done dumping result, returning 44071 1727204720.45908: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-c964-7471-000000002206] 44071 1727204720.45913: sending task result for task 127b8e07-fff9-c964-7471-000000002206 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204720.49445: no more pending results, returning what we have 44071 1727204720.49450: results queue empty 44071 1727204720.49451: checking for any_errors_fatal 44071 1727204720.49459: done checking for any_errors_fatal 44071 1727204720.49460: checking for max_fail_percentage 44071 1727204720.49462: done checking for max_fail_percentage 44071 1727204720.49463: checking to see if all hosts have failed and the running result is not ok 44071 1727204720.49464: done checking to see if all hosts have failed 44071 1727204720.49464: getting the remaining hosts for this loop 44071 1727204720.49479: done getting the remaining 
hosts for this loop 44071 1727204720.49484: getting the next task for host managed-node2 44071 1727204720.49494: done getting next task for host managed-node2 44071 1727204720.49498: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204720.49511: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204720.49525: done sending task result for task 127b8e07-fff9-c964-7471-000000002206 44071 1727204720.49530: WORKER PROCESS EXITING 44071 1727204720.49541: getting variables 44071 1727204720.49548: in VariableManager get_vars() 44071 1727204720.49594: Calling all_inventory to load vars for managed-node2 44071 1727204720.49598: Calling groups_inventory to load vars for managed-node2 44071 1727204720.49601: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204720.49621: Calling all_plugins_play to load vars for managed-node2 44071 1727204720.49626: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204720.49630: Calling groups_plugins_play to load vars for managed-node2 44071 1727204720.51982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204720.53437: done with get_vars() 44071 1727204720.53482: done getting variables 44071 1727204720.53557: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:05:20 -0400 (0:00:01.056) 0:02:12.852 ***** 44071 1727204720.53596: entering _queue_task() for managed-node2/debug 44071 1727204720.53953: worker is 1 (out of 1 available) 44071 1727204720.53972: exiting _queue_task() for managed-node2/debug 44071 1727204720.53986: done queuing things up, now waiting for results queue to drain 44071 1727204720.53988: waiting for pending results... 
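Editor's note: the package_facts payload dumped above (visible at the debug level even though the task result itself is censored because of no_log: true) shows how ansible_facts.packages is structured: a dictionary keyed by package name, each value a list of entries carrying name, version, release, epoch, arch and source. Below is a minimal sketch of a follow-up task that reads this structure; the choice of NetworkManager as the package to look up is purely illustrative and not something this run does.

- name: Report NetworkManager version from the gathered package facts
  ansible.builtin.debug:
    msg: "NetworkManager {{ ansible_facts.packages['NetworkManager'][0].version }} is installed"
  when: "'NetworkManager' in ansible_facts.packages"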
44071 1727204720.54240: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204720.54356: in run() - task 127b8e07-fff9-c964-7471-0000000021a4 44071 1727204720.54371: variable 'ansible_search_path' from source: unknown 44071 1727204720.54375: variable 'ansible_search_path' from source: unknown 44071 1727204720.54416: calling self._execute() 44071 1727204720.54509: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204720.54513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204720.54522: variable 'omit' from source: magic vars 44071 1727204720.54865: variable 'ansible_distribution_major_version' from source: facts 44071 1727204720.54878: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204720.54884: variable 'omit' from source: magic vars 44071 1727204720.54944: variable 'omit' from source: magic vars 44071 1727204720.55023: variable 'network_provider' from source: set_fact 44071 1727204720.55054: variable 'omit' from source: magic vars 44071 1727204720.55092: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204720.55123: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204720.55147: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204720.55171: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204720.55191: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204720.55220: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204720.55224: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204720.55227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204720.55340: Set connection var ansible_connection to ssh 44071 1727204720.55346: Set connection var ansible_timeout to 10 44071 1727204720.55348: Set connection var ansible_pipelining to False 44071 1727204720.55351: Set connection var ansible_shell_type to sh 44071 1727204720.55353: Set connection var ansible_shell_executable to /bin/sh 44071 1727204720.55382: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204720.55387: variable 'ansible_shell_executable' from source: unknown 44071 1727204720.55391: variable 'ansible_connection' from source: unknown 44071 1727204720.55393: variable 'ansible_module_compression' from source: unknown 44071 1727204720.55395: variable 'ansible_shell_type' from source: unknown 44071 1727204720.55400: variable 'ansible_shell_executable' from source: unknown 44071 1727204720.55402: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204720.55407: variable 'ansible_pipelining' from source: unknown 44071 1727204720.55409: variable 'ansible_timeout' from source: unknown 44071 1727204720.55414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204720.55555: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 44071 1727204720.55569: variable 'omit' from source: magic vars 44071 1727204720.55574: starting attempt loop 44071 1727204720.55577: running the handler 44071 1727204720.55649: handler run complete 44071 1727204720.55663: attempt loop complete, returning result 44071 1727204720.55669: _execute() done 44071 1727204720.55672: dumping result to json 44071 1727204720.55674: done dumping result, returning 44071 1727204720.55687: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-c964-7471-0000000021a4] 44071 1727204720.55690: sending task result for task 127b8e07-fff9-c964-7471-0000000021a4 44071 1727204720.55792: done sending task result for task 127b8e07-fff9-c964-7471-0000000021a4 44071 1727204720.55795: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 44071 1727204720.55884: no more pending results, returning what we have 44071 1727204720.55893: results queue empty 44071 1727204720.55894: checking for any_errors_fatal 44071 1727204720.55904: done checking for any_errors_fatal 44071 1727204720.55904: checking for max_fail_percentage 44071 1727204720.55906: done checking for max_fail_percentage 44071 1727204720.55907: checking to see if all hosts have failed and the running result is not ok 44071 1727204720.55908: done checking to see if all hosts have failed 44071 1727204720.55908: getting the remaining hosts for this loop 44071 1727204720.55911: done getting the remaining hosts for this loop 44071 1727204720.55917: getting the next task for host managed-node2 44071 1727204720.55926: done getting next task for host managed-node2 44071 1727204720.55930: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204720.55936: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204720.55948: getting variables 44071 1727204720.55950: in VariableManager get_vars() 44071 1727204720.56055: Calling all_inventory to load vars for managed-node2 44071 1727204720.56058: Calling groups_inventory to load vars for managed-node2 44071 1727204720.56060: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204720.56073: Calling all_plugins_play to load vars for managed-node2 44071 1727204720.56075: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204720.56085: Calling groups_plugins_play to load vars for managed-node2 44071 1727204720.57670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204720.59081: done with get_vars() 44071 1727204720.59124: done getting variables 44071 1727204720.59189: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:05:20 -0400 (0:00:00.056) 0:02:12.908 ***** 44071 1727204720.59225: entering _queue_task() for managed-node2/fail 44071 1727204720.59563: worker is 1 (out of 1 available) 44071 1727204720.59582: exiting _queue_task() for managed-node2/fail 44071 1727204720.59596: done queuing things up, now waiting for results queue to drain 44071 1727204720.59598: waiting for pending results... 
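Editor's note: the "Print network provider" task traced above (tasks/main.yml:7) reads network_provider, which was set earlier via set_fact, and reports "Using network provider: nm" on this host. A minimal sketch of what such a debug task presumably looks like; only the exact message wording is an assumption, the variable and the debug action are taken from the trace.

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"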
44071 1727204720.59817: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204720.59958: in run() - task 127b8e07-fff9-c964-7471-0000000021a5 44071 1727204720.59974: variable 'ansible_search_path' from source: unknown 44071 1727204720.59978: variable 'ansible_search_path' from source: unknown 44071 1727204720.60026: calling self._execute() 44071 1727204720.60153: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204720.60171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204720.60182: variable 'omit' from source: magic vars 44071 1727204720.60683: variable 'ansible_distribution_major_version' from source: facts 44071 1727204720.60693: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204720.60801: variable 'network_state' from source: role '' defaults 44071 1727204720.60812: Evaluated conditional (network_state != {}): False 44071 1727204720.60817: when evaluation is False, skipping this task 44071 1727204720.60820: _execute() done 44071 1727204720.60824: dumping result to json 44071 1727204720.60833: done dumping result, returning 44071 1727204720.60841: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-c964-7471-0000000021a5] 44071 1727204720.60845: sending task result for task 127b8e07-fff9-c964-7471-0000000021a5 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204720.61018: no more pending results, returning what we have 44071 1727204720.61023: results queue empty 44071 1727204720.61024: checking for any_errors_fatal 44071 1727204720.61035: done checking for any_errors_fatal 44071 1727204720.61036: checking for max_fail_percentage 44071 1727204720.61039: done checking for max_fail_percentage 44071 1727204720.61040: checking to see if all hosts have failed and the running result is not ok 44071 1727204720.61042: done checking to see if all hosts have failed 44071 1727204720.61045: getting the remaining hosts for this loop 44071 1727204720.61047: done getting the remaining hosts for this loop 44071 1727204720.61054: getting the next task for host managed-node2 44071 1727204720.61069: done getting next task for host managed-node2 44071 1727204720.61073: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204720.61079: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204720.61109: getting variables 44071 1727204720.61111: in VariableManager get_vars() 44071 1727204720.61209: Calling all_inventory to load vars for managed-node2 44071 1727204720.61213: Calling groups_inventory to load vars for managed-node2 44071 1727204720.61216: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204720.61223: done sending task result for task 127b8e07-fff9-c964-7471-0000000021a5 44071 1727204720.61225: WORKER PROCESS EXITING 44071 1727204720.61344: Calling all_plugins_play to load vars for managed-node2 44071 1727204720.61349: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204720.61354: Calling groups_plugins_play to load vars for managed-node2 44071 1727204720.63188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204720.65194: done with get_vars() 44071 1727204720.65247: done getting variables 44071 1727204720.65321: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:05:20 -0400 (0:00:00.061) 0:02:12.970 ***** 44071 1727204720.65368: entering _queue_task() for managed-node2/fail 44071 1727204720.65782: worker is 1 (out of 1 available) 44071 1727204720.65798: exiting _queue_task() for managed-node2/fail 44071 1727204720.65814: done queuing things up, now waiting for results queue to drain 44071 1727204720.65815: waiting for pending results... 
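Editor's note: the "Abort applying the network state configuration if using the `network_state` variable with the initscripts provider" task above is a fail action that was skipped because its first conditional, network_state != {}, evaluated to False (network_state comes from the role defaults and is empty in this run). A hedged sketch of the shape of such a guard; the failure message and the second condition on the provider are inferred from the task name, not from this trace.

- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: The network_state variable is not supported with the initscripts provider  # assumed wording
  when:
    - network_state != {}
    - network_provider == "initscripts"  # assumed second guard, implied by the task name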
44071 1727204720.66044: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204720.66238: in run() - task 127b8e07-fff9-c964-7471-0000000021a6 44071 1727204720.66242: variable 'ansible_search_path' from source: unknown 44071 1727204720.66245: variable 'ansible_search_path' from source: unknown 44071 1727204720.66285: calling self._execute() 44071 1727204720.66391: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204720.66396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204720.66406: variable 'omit' from source: magic vars 44071 1727204720.66718: variable 'ansible_distribution_major_version' from source: facts 44071 1727204720.66730: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204720.66831: variable 'network_state' from source: role '' defaults 44071 1727204720.66840: Evaluated conditional (network_state != {}): False 44071 1727204720.66844: when evaluation is False, skipping this task 44071 1727204720.66847: _execute() done 44071 1727204720.66850: dumping result to json 44071 1727204720.66853: done dumping result, returning 44071 1727204720.66862: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-c964-7471-0000000021a6] 44071 1727204720.66866: sending task result for task 127b8e07-fff9-c964-7471-0000000021a6 44071 1727204720.66979: done sending task result for task 127b8e07-fff9-c964-7471-0000000021a6 44071 1727204720.66983: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204720.67044: no more pending results, returning what we have 44071 1727204720.67048: results queue empty 44071 1727204720.67049: checking for any_errors_fatal 44071 1727204720.67063: done checking for any_errors_fatal 44071 1727204720.67064: checking for max_fail_percentage 44071 1727204720.67068: done checking for max_fail_percentage 44071 1727204720.67070: checking to see if all hosts have failed and the running result is not ok 44071 1727204720.67070: done checking to see if all hosts have failed 44071 1727204720.67071: getting the remaining hosts for this loop 44071 1727204720.67073: done getting the remaining hosts for this loop 44071 1727204720.67077: getting the next task for host managed-node2 44071 1727204720.67087: done getting next task for host managed-node2 44071 1727204720.67091: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204720.67098: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204720.67120: getting variables 44071 1727204720.67121: in VariableManager get_vars() 44071 1727204720.67176: Calling all_inventory to load vars for managed-node2 44071 1727204720.67180: Calling groups_inventory to load vars for managed-node2 44071 1727204720.67182: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204720.67192: Calling all_plugins_play to load vars for managed-node2 44071 1727204720.67195: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204720.67197: Calling groups_plugins_play to load vars for managed-node2 44071 1727204720.78663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204720.81611: done with get_vars() 44071 1727204720.81661: done getting variables 44071 1727204720.81723: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:05:20 -0400 (0:00:00.163) 0:02:13.134 ***** 44071 1727204720.81759: entering _queue_task() for managed-node2/fail 44071 1727204720.82679: worker is 1 (out of 1 available) 44071 1727204720.82695: exiting _queue_task() for managed-node2/fail 44071 1727204720.82710: done queuing things up, now waiting for results queue to drain 44071 1727204720.82713: waiting for pending results... 
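Editor's note: both network_state guards above were skipped for the same reason: network_state is the empty role default, while the configuration for this run arrives through network_connections, which the trace shows coming from include params and referencing an interface play variable. A hedged sketch of how a play might feed the role that way; the interface name and the profile body are hypothetical, only the variable names come from the trace.

- name: Configure the test interface via the network role
  hosts: managed-node2
  vars:
    interface: testnic1                 # hypothetical name; the real value is not visible in this log
  tasks:
    - name: Apply connection profiles
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: "{{ interface }}"
            type: ethernet              # illustrative profile body
            state: up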
44071 1727204720.83540: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204720.83777: in run() - task 127b8e07-fff9-c964-7471-0000000021a7 44071 1727204720.83783: variable 'ansible_search_path' from source: unknown 44071 1727204720.83787: variable 'ansible_search_path' from source: unknown 44071 1727204720.83897: calling self._execute() 44071 1727204720.84153: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204720.84222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204720.84238: variable 'omit' from source: magic vars 44071 1727204720.85209: variable 'ansible_distribution_major_version' from source: facts 44071 1727204720.85360: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204720.85746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204720.89717: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204720.89849: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204720.89903: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204720.89974: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204720.89996: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204720.90136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204720.90155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204720.90201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204720.90261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204720.90409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204720.90441: variable 'ansible_distribution_major_version' from source: facts 44071 1727204720.90467: Evaluated conditional (ansible_distribution_major_version | int > 9): True 44071 1727204720.90637: variable 'ansible_distribution' from source: facts 44071 1727204720.90650: variable '__network_rh_distros' from source: role '' defaults 44071 1727204720.90664: Evaluated conditional (ansible_distribution in __network_rh_distros): False 44071 1727204720.90677: when evaluation is False, skipping this task 44071 1727204720.90738: _execute() done 44071 1727204720.90742: dumping result to json 44071 1727204720.90745: done dumping result, returning 44071 1727204720.90752: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-c964-7471-0000000021a7] 44071 1727204720.90755: sending task result for task 127b8e07-fff9-c964-7471-0000000021a7 44071 1727204720.90993: done sending task result for task 127b8e07-fff9-c964-7471-0000000021a7 44071 1727204720.90997: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 44071 1727204720.91049: no more pending results, returning what we have 44071 1727204720.91053: results queue empty 44071 1727204720.91054: checking for any_errors_fatal 44071 1727204720.91070: done checking for any_errors_fatal 44071 1727204720.91072: checking for max_fail_percentage 44071 1727204720.91073: done checking for max_fail_percentage 44071 1727204720.91074: checking to see if all hosts have failed and the running result is not ok 44071 1727204720.91075: done checking to see if all hosts have failed 44071 1727204720.91076: getting the remaining hosts for this loop 44071 1727204720.91078: done getting the remaining hosts for this loop 44071 1727204720.91083: getting the next task for host managed-node2 44071 1727204720.91092: done getting next task for host managed-node2 44071 1727204720.91097: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204720.91103: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204720.91126: getting variables 44071 1727204720.91128: in VariableManager get_vars() 44071 1727204720.91278: Calling all_inventory to load vars for managed-node2 44071 1727204720.91281: Calling groups_inventory to load vars for managed-node2 44071 1727204720.91283: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204720.91294: Calling all_plugins_play to load vars for managed-node2 44071 1727204720.91296: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204720.91299: Calling groups_plugins_play to load vars for managed-node2 44071 1727204720.93651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204720.97512: done with get_vars() 44071 1727204720.97554: done getting variables 44071 1727204720.97634: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:05:20 -0400 (0:00:00.160) 0:02:13.294 ***** 44071 1727204720.97831: entering _queue_task() for managed-node2/dnf 44071 1727204720.98733: worker is 1 (out of 1 available) 44071 1727204720.98751: exiting _queue_task() for managed-node2/dnf 44071 1727204720.98769: done queuing things up, now waiting for results queue to drain 44071 1727204720.98771: waiting for pending results... 
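Editor's note: the teaming guard traced above evaluated ansible_distribution_major_version | int > 9 to True (this is a Fedora 40 node, as the .fc40 package builds earlier indicate) and ansible_distribution in __network_rh_distros to False, so the task was skipped before any further checks ran. A hedged sketch of such a guard; the message wording and the extra check for team connections are assumptions suggested by the task name.

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later  # assumed wording
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
    - __network_team_connections_defined  # assumed extra guard, implied by the task name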
44071 1727204720.99547: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204720.99739: in run() - task 127b8e07-fff9-c964-7471-0000000021a8 44071 1727204720.99753: variable 'ansible_search_path' from source: unknown 44071 1727204720.99764: variable 'ansible_search_path' from source: unknown 44071 1727204720.99974: calling self._execute() 44071 1727204721.00117: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204721.00124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204721.00137: variable 'omit' from source: magic vars 44071 1727204721.01051: variable 'ansible_distribution_major_version' from source: facts 44071 1727204721.01070: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204721.01758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204721.06539: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204721.06660: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204721.06703: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204721.06743: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204721.06773: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204721.06867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204721.06921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204721.06976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204721.07049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204721.07064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204721.07219: variable 'ansible_distribution' from source: facts 44071 1727204721.07224: variable 'ansible_distribution_major_version' from source: facts 44071 1727204721.07236: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 44071 1727204721.07369: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204721.07512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204721.07539: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204721.07575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204721.07672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204721.07683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204721.07687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204721.07691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204721.07716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204721.07752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204721.07764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204721.07809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204721.07831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204721.07854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204721.07892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204721.07977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204721.08160: variable 'network_connections' from source: include params 44071 1727204721.08175: variable 'interface' from source: play vars 44071 1727204721.08250: variable 'interface' from source: play vars 44071 1727204721.08340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204721.08538: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204721.08594: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204721.08632: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204721.08669: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204721.08719: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204721.08742: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204721.08775: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204721.08824: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204721.08942: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204721.09271: variable 'network_connections' from source: include params 44071 1727204721.09275: variable 'interface' from source: play vars 44071 1727204721.09299: variable 'interface' from source: play vars 44071 1727204721.09380: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204721.09386: when evaluation is False, skipping this task 44071 1727204721.09393: _execute() done 44071 1727204721.09395: dumping result to json 44071 1727204721.09398: done dumping result, returning 44071 1727204721.09400: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-0000000021a8] 44071 1727204721.09403: sending task result for task 127b8e07-fff9-c964-7471-0000000021a8 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204721.09730: no more pending results, returning what we have 44071 1727204721.09734: results queue empty 44071 1727204721.09736: checking for any_errors_fatal 44071 1727204721.09742: done checking for any_errors_fatal 44071 1727204721.09743: checking for max_fail_percentage 44071 1727204721.09745: done checking for max_fail_percentage 44071 1727204721.09746: checking to see if all hosts have failed and the running result is not ok 44071 1727204721.09746: done checking to see if all hosts have failed 44071 1727204721.09747: getting the remaining hosts for this loop 44071 1727204721.09750: done getting the remaining hosts for this loop 44071 1727204721.09755: getting the next task for host managed-node2 44071 1727204721.09764: done getting next task for host managed-node2 44071 1727204721.09770: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204721.09776: ^ state is: 
HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204721.09801: getting variables 44071 1727204721.09803: in VariableManager get_vars() 44071 1727204721.09855: Calling all_inventory to load vars for managed-node2 44071 1727204721.09858: Calling groups_inventory to load vars for managed-node2 44071 1727204721.09860: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204721.09870: done sending task result for task 127b8e07-fff9-c964-7471-0000000021a8 44071 1727204721.09874: WORKER PROCESS EXITING 44071 1727204721.10198: Calling all_plugins_play to load vars for managed-node2 44071 1727204721.10202: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204721.10205: Calling groups_plugins_play to load vars for managed-node2 44071 1727204721.12602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204721.16006: done with get_vars() 44071 1727204721.16052: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204721.16135: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.183) 0:02:13.479 ***** 44071 1727204721.16334: entering _queue_task() for managed-node2/yum 44071 1727204721.16955: worker is 1 (out of 1 available) 44071 1727204721.17189: exiting _queue_task() for managed-node2/yum 44071 1727204721.17203: done queuing things up, now waiting for results queue to drain 44071 1727204721.17205: waiting for pending results... 
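Editor's note: the DNF check traced above passed its distribution gate (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7) but was skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined held for the profiles in network_connections. A rough sketch of how such a check could be written; the package list variable and the use of check mode are assumptions, only the dnf action and the two conditionals come from the trace.

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"      # hypothetical variable holding the provider packages
    state: latest
  check_mode: true                      # assumed: only report whether updates would be applied
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined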
44071 1727204721.17356: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204721.17513: in run() - task 127b8e07-fff9-c964-7471-0000000021a9 44071 1727204721.17529: variable 'ansible_search_path' from source: unknown 44071 1727204721.17538: variable 'ansible_search_path' from source: unknown 44071 1727204721.17603: calling self._execute() 44071 1727204721.17710: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204721.17772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204721.17779: variable 'omit' from source: magic vars 44071 1727204721.18157: variable 'ansible_distribution_major_version' from source: facts 44071 1727204721.18173: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204721.18374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204721.23377: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204721.23382: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204721.23448: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204721.23488: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204721.23516: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204721.23728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204721.23768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204721.23795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204721.23835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204721.23864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204721.23974: variable 'ansible_distribution_major_version' from source: facts 44071 1727204721.23993: Evaluated conditional (ansible_distribution_major_version | int < 8): False 44071 1727204721.23997: when evaluation is False, skipping this task 44071 1727204721.24000: _execute() done 44071 1727204721.24003: dumping result to json 44071 1727204721.24005: done dumping result, returning 44071 1727204721.24014: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-0000000021a9] 44071 
1727204721.24025: sending task result for task 127b8e07-fff9-c964-7471-0000000021a9 44071 1727204721.24196: done sending task result for task 127b8e07-fff9-c964-7471-0000000021a9 44071 1727204721.24201: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 44071 1727204721.24261: no more pending results, returning what we have 44071 1727204721.24264: results queue empty 44071 1727204721.24268: checking for any_errors_fatal 44071 1727204721.24276: done checking for any_errors_fatal 44071 1727204721.24276: checking for max_fail_percentage 44071 1727204721.24278: done checking for max_fail_percentage 44071 1727204721.24279: checking to see if all hosts have failed and the running result is not ok 44071 1727204721.24280: done checking to see if all hosts have failed 44071 1727204721.24280: getting the remaining hosts for this loop 44071 1727204721.24282: done getting the remaining hosts for this loop 44071 1727204721.24291: getting the next task for host managed-node2 44071 1727204721.24301: done getting next task for host managed-node2 44071 1727204721.24306: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204721.24311: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204721.24332: getting variables 44071 1727204721.24333: in VariableManager get_vars() 44071 1727204721.24525: Calling all_inventory to load vars for managed-node2 44071 1727204721.24528: Calling groups_inventory to load vars for managed-node2 44071 1727204721.24530: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204721.24541: Calling all_plugins_play to load vars for managed-node2 44071 1727204721.24544: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204721.24547: Calling groups_plugins_play to load vars for managed-node2 44071 1727204721.27910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204721.30165: done with get_vars() 44071 1727204721.30208: done getting variables 44071 1727204721.30286: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.139) 0:02:13.619 ***** 44071 1727204721.30330: entering _queue_task() for managed-node2/fail 44071 1727204721.30788: worker is 1 (out of 1 available) 44071 1727204721.30804: exiting _queue_task() for managed-node2/fail 44071 1727204721.30933: done queuing things up, now waiting for results queue to drain 44071 1727204721.30935: waiting for pending results... 
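The fail action queued above belongs to the task at roles/network/tasks/main.yml:60. The trace below evaluates __network_wireless_connections_defined or __network_team_connections_defined to False for the connections defined in network_connections, so the task is skipped. As a hedged illustration only (the message text is a placeholder, not the role's wording), such a consent gate can be expressed as:

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: Wireless or team connections require restarting NetworkManager   # placeholder wording
  when: __network_wireless_connections_defined or __network_team_connections_defined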
44071 1727204721.31271: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204721.31387: in run() - task 127b8e07-fff9-c964-7471-0000000021aa 44071 1727204721.31414: variable 'ansible_search_path' from source: unknown 44071 1727204721.31423: variable 'ansible_search_path' from source: unknown 44071 1727204721.31585: calling self._execute() 44071 1727204721.31618: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204721.31633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204721.31650: variable 'omit' from source: magic vars 44071 1727204721.32112: variable 'ansible_distribution_major_version' from source: facts 44071 1727204721.32140: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204721.32292: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204721.32538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204721.35275: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204721.35303: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204721.35353: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204721.35410: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204721.35445: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204721.35555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204721.35596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204721.35639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204721.35693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204721.35772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204721.35787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204721.35818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204721.35857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204721.35908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204721.35929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204721.35990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204721.36022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204721.36071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204721.36112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204721.36160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204721.36350: variable 'network_connections' from source: include params 44071 1727204721.36374: variable 'interface' from source: play vars 44071 1727204721.36454: variable 'interface' from source: play vars 44071 1727204721.36592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204721.36760: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204721.36824: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204721.36857: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204721.36892: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204721.36956: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204721.36984: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204721.37025: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204721.37051: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204721.37134: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204721.37671: variable 'network_connections' 
from source: include params 44071 1727204721.37675: variable 'interface' from source: play vars 44071 1727204721.37678: variable 'interface' from source: play vars 44071 1727204721.37680: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204721.37683: when evaluation is False, skipping this task 44071 1727204721.37685: _execute() done 44071 1727204721.37687: dumping result to json 44071 1727204721.37689: done dumping result, returning 44071 1727204721.37691: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-0000000021aa] 44071 1727204721.37693: sending task result for task 127b8e07-fff9-c964-7471-0000000021aa 44071 1727204721.37793: done sending task result for task 127b8e07-fff9-c964-7471-0000000021aa 44071 1727204721.37796: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204721.37856: no more pending results, returning what we have 44071 1727204721.37860: results queue empty 44071 1727204721.37861: checking for any_errors_fatal 44071 1727204721.37872: done checking for any_errors_fatal 44071 1727204721.37873: checking for max_fail_percentage 44071 1727204721.37875: done checking for max_fail_percentage 44071 1727204721.37878: checking to see if all hosts have failed and the running result is not ok 44071 1727204721.37879: done checking to see if all hosts have failed 44071 1727204721.37879: getting the remaining hosts for this loop 44071 1727204721.37881: done getting the remaining hosts for this loop 44071 1727204721.37887: getting the next task for host managed-node2 44071 1727204721.37896: done getting next task for host managed-node2 44071 1727204721.37900: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 44071 1727204721.37906: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204721.37927: getting variables 44071 1727204721.37929: in VariableManager get_vars() 44071 1727204721.37983: Calling all_inventory to load vars for managed-node2 44071 1727204721.37986: Calling groups_inventory to load vars for managed-node2 44071 1727204721.37988: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204721.38000: Calling all_plugins_play to load vars for managed-node2 44071 1727204721.38002: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204721.38005: Calling groups_plugins_play to load vars for managed-node2 44071 1727204721.40281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204721.42788: done with get_vars() 44071 1727204721.42831: done getting variables 44071 1727204721.42907: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.126) 0:02:13.745 ***** 44071 1727204721.42950: entering _queue_task() for managed-node2/package 44071 1727204721.43396: worker is 1 (out of 1 available) 44071 1727204721.43409: exiting _queue_task() for managed-node2/package 44071 1727204721.43538: done queuing things up, now waiting for results queue to drain 44071 1727204721.43540: waiting for pending results... 44071 1727204721.43873: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 44071 1727204721.43954: in run() - task 127b8e07-fff9-c964-7471-0000000021ab 44071 1727204721.43986: variable 'ansible_search_path' from source: unknown 44071 1727204721.44000: variable 'ansible_search_path' from source: unknown 44071 1727204721.44046: calling self._execute() 44071 1727204721.44180: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204721.44198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204721.44216: variable 'omit' from source: magic vars 44071 1727204721.44739: variable 'ansible_distribution_major_version' from source: facts 44071 1727204721.44743: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204721.44959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204721.45298: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204721.45356: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204721.45409: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204721.45517: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204721.45720: variable 'network_packages' from source: role '' defaults 44071 1727204721.45800: variable '__network_provider_setup' from source: role '' defaults 44071 1727204721.45819: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204721.45898: variable 
'__network_service_name_default_nm' from source: role '' defaults 44071 1727204721.45912: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204721.45991: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204721.46203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204721.48552: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204721.48651: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204721.48719: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204721.48748: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204721.48867: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204721.48904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204721.48940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204721.48981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204721.49070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204721.49073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204721.49117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204721.49147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204721.49180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204721.49232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204721.49249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204721.49624: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204721.49693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204721.49722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204721.49759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204721.49804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204721.49823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204721.49934: variable 'ansible_python' from source: facts 44071 1727204721.49969: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204721.50076: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204721.50173: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204721.50324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204721.50384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204721.50391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204721.50444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204721.50469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204721.50600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204721.50612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204721.50615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204721.50631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204721.50648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204721.50836: variable 'network_connections' from source: include params 44071 1727204721.50854: variable 'interface' from source: play vars 44071 1727204721.50988: variable 'interface' from source: play vars 44071 1727204721.51091: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204721.51145: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204721.51172: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204721.51216: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204721.51362: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204721.51655: variable 'network_connections' from source: include params 44071 1727204721.51671: variable 'interface' from source: play vars 44071 1727204721.51804: variable 'interface' from source: play vars 44071 1727204721.51881: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204721.51986: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204721.52503: variable 'network_connections' from source: include params 44071 1727204721.52711: variable 'interface' from source: play vars 44071 1727204721.52714: variable 'interface' from source: play vars 44071 1727204721.52746: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204721.52969: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204721.53392: variable 'network_connections' from source: include params 44071 1727204721.53405: variable 'interface' from source: play vars 44071 1727204721.53498: variable 'interface' from source: play vars 44071 1727204721.53595: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204721.53681: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204721.53698: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204721.53774: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204721.54055: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204721.54685: variable 'network_connections' from source: include params 44071 1727204721.54695: variable 'interface' from source: play vars 44071 1727204721.54763: variable 'interface' from source: play vars 44071 1727204721.54792: variable 'ansible_distribution' from source: facts 44071 1727204721.54800: variable '__network_rh_distros' from source: role '' defaults 44071 1727204721.54807: variable 'ansible_distribution_major_version' from source: facts 44071 1727204721.54837: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204721.55371: variable 'ansible_distribution' from source: 
facts 44071 1727204721.55375: variable '__network_rh_distros' from source: role '' defaults 44071 1727204721.55377: variable 'ansible_distribution_major_version' from source: facts 44071 1727204721.55380: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204721.55632: variable 'ansible_distribution' from source: facts 44071 1727204721.55659: variable '__network_rh_distros' from source: role '' defaults 44071 1727204721.55672: variable 'ansible_distribution_major_version' from source: facts 44071 1727204721.55734: variable 'network_provider' from source: set_fact 44071 1727204721.55776: variable 'ansible_facts' from source: unknown 44071 1727204721.56888: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 44071 1727204721.56902: when evaluation is False, skipping this task 44071 1727204721.56912: _execute() done 44071 1727204721.56935: dumping result to json 44071 1727204721.56946: done dumping result, returning 44071 1727204721.56961: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-c964-7471-0000000021ab] 44071 1727204721.56974: sending task result for task 127b8e07-fff9-c964-7471-0000000021ab skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 44071 1727204721.57326: no more pending results, returning what we have 44071 1727204721.57330: results queue empty 44071 1727204721.57331: checking for any_errors_fatal 44071 1727204721.57341: done checking for any_errors_fatal 44071 1727204721.57342: checking for max_fail_percentage 44071 1727204721.57343: done checking for max_fail_percentage 44071 1727204721.57345: checking to see if all hosts have failed and the running result is not ok 44071 1727204721.57345: done checking to see if all hosts have failed 44071 1727204721.57346: getting the remaining hosts for this loop 44071 1727204721.57348: done getting the remaining hosts for this loop 44071 1727204721.57354: getting the next task for host managed-node2 44071 1727204721.57363: done getting next task for host managed-node2 44071 1727204721.57371: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204721.57378: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 44071 1727204721.57403: getting variables 44071 1727204721.57405: in VariableManager get_vars() 44071 1727204721.57460: Calling all_inventory to load vars for managed-node2 44071 1727204721.57464: Calling groups_inventory to load vars for managed-node2 44071 1727204721.57725: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204721.57739: Calling all_plugins_play to load vars for managed-node2 44071 1727204721.57743: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204721.57747: Calling groups_plugins_play to load vars for managed-node2 44071 1727204721.58315: done sending task result for task 127b8e07-fff9-c964-7471-0000000021ab 44071 1727204721.58320: WORKER PROCESS EXITING 44071 1727204721.59936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204721.62867: done with get_vars() 44071 1727204721.62915: done getting variables 44071 1727204721.62991: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.200) 0:02:13.946 ***** 44071 1727204721.63029: entering _queue_task() for managed-node2/package 44071 1727204721.63609: worker is 1 (out of 1 available) 44071 1727204721.63630: exiting _queue_task() for managed-node2/package 44071 1727204721.63643: done queuing things up, now waiting for results queue to drain 44071 1727204721.63645: waiting for pending results... 
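The Install packages skip recorded above (main.yml:73) is driven by the condition not network_packages is subset(ansible_facts.packages.keys()): every package the role wants is already present in the gathered package facts, so nothing is installed. A minimal sketch of a task matching the logged name, module, and guard, with state: present assumed:

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present                     # assumed; not visible in the trace
  when:
    - not network_packages is subset(ansible_facts.packages.keys())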
44071 1727204721.64052: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204721.64643: in run() - task 127b8e07-fff9-c964-7471-0000000021ac 44071 1727204721.64723: variable 'ansible_search_path' from source: unknown 44071 1727204721.64728: variable 'ansible_search_path' from source: unknown 44071 1727204721.64770: calling self._execute() 44071 1727204721.65008: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204721.65022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204721.65039: variable 'omit' from source: magic vars 44071 1727204721.66140: variable 'ansible_distribution_major_version' from source: facts 44071 1727204721.66146: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204721.66600: variable 'network_state' from source: role '' defaults 44071 1727204721.66689: Evaluated conditional (network_state != {}): False 44071 1727204721.66697: when evaluation is False, skipping this task 44071 1727204721.66703: _execute() done 44071 1727204721.66743: dumping result to json 44071 1727204721.66771: done dumping result, returning 44071 1727204721.66795: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-c964-7471-0000000021ac] 44071 1727204721.66802: sending task result for task 127b8e07-fff9-c964-7471-0000000021ac skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204721.67064: no more pending results, returning what we have 44071 1727204721.67068: results queue empty 44071 1727204721.67069: checking for any_errors_fatal 44071 1727204721.67080: done checking for any_errors_fatal 44071 1727204721.67081: checking for max_fail_percentage 44071 1727204721.67082: done checking for max_fail_percentage 44071 1727204721.67083: checking to see if all hosts have failed and the running result is not ok 44071 1727204721.67084: done checking to see if all hosts have failed 44071 1727204721.67085: getting the remaining hosts for this loop 44071 1727204721.67087: done getting the remaining hosts for this loop 44071 1727204721.67092: getting the next task for host managed-node2 44071 1727204721.67102: done getting next task for host managed-node2 44071 1727204721.67107: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204721.67116: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204721.67140: getting variables 44071 1727204721.67142: in VariableManager get_vars() 44071 1727204721.67317: Calling all_inventory to load vars for managed-node2 44071 1727204721.67321: Calling groups_inventory to load vars for managed-node2 44071 1727204721.67324: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204721.67341: Calling all_plugins_play to load vars for managed-node2 44071 1727204721.67345: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204721.67349: Calling groups_plugins_play to load vars for managed-node2 44071 1727204721.67985: done sending task result for task 127b8e07-fff9-c964-7471-0000000021ac 44071 1727204721.67990: WORKER PROCESS EXITING 44071 1727204721.70978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204721.75157: done with get_vars() 44071 1727204721.75209: done getting variables 44071 1727204721.75280: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.122) 0:02:14.069 ***** 44071 1727204721.75322: entering _queue_task() for managed-node2/package 44071 1727204721.75800: worker is 1 (out of 1 available) 44071 1727204721.75815: exiting _queue_task() for managed-node2/package 44071 1727204721.75944: done queuing things up, now waiting for results queue to drain 44071 1727204721.75947: waiting for pending results... 
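The skip just above (main.yml:85) shows the nmstate-related install being bypassed because network_state is empty in this run; the role only pulls in NetworkManager and nmstate when a declarative network_state is supplied. A rough sketch under that reading, with the exact package list inferred from the task name rather than from the trace:

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager        # package names inferred from the task name, not from the trace
      - nmstate
    state: present
  when: network_state != {}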
44071 1727204721.76162: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204721.76472: in run() - task 127b8e07-fff9-c964-7471-0000000021ad 44071 1727204721.76476: variable 'ansible_search_path' from source: unknown 44071 1727204721.76479: variable 'ansible_search_path' from source: unknown 44071 1727204721.76482: calling self._execute() 44071 1727204721.76597: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204721.76620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204721.76635: variable 'omit' from source: magic vars 44071 1727204721.77092: variable 'ansible_distribution_major_version' from source: facts 44071 1727204721.77111: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204721.77251: variable 'network_state' from source: role '' defaults 44071 1727204721.77280: Evaluated conditional (network_state != {}): False 44071 1727204721.77287: when evaluation is False, skipping this task 44071 1727204721.77293: _execute() done 44071 1727204721.77299: dumping result to json 44071 1727204721.77307: done dumping result, returning 44071 1727204721.77319: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-c964-7471-0000000021ad] 44071 1727204721.77374: sending task result for task 127b8e07-fff9-c964-7471-0000000021ad skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204721.77542: no more pending results, returning what we have 44071 1727204721.77547: results queue empty 44071 1727204721.77548: checking for any_errors_fatal 44071 1727204721.77559: done checking for any_errors_fatal 44071 1727204721.77560: checking for max_fail_percentage 44071 1727204721.77562: done checking for max_fail_percentage 44071 1727204721.77563: checking to see if all hosts have failed and the running result is not ok 44071 1727204721.77564: done checking to see if all hosts have failed 44071 1727204721.77565: getting the remaining hosts for this loop 44071 1727204721.77569: done getting the remaining hosts for this loop 44071 1727204721.77574: getting the next task for host managed-node2 44071 1727204721.77586: done getting next task for host managed-node2 44071 1727204721.77591: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204721.77599: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204721.77629: getting variables 44071 1727204721.77631: in VariableManager get_vars() 44071 1727204721.78203: Calling all_inventory to load vars for managed-node2 44071 1727204721.78207: Calling groups_inventory to load vars for managed-node2 44071 1727204721.78210: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204721.78222: Calling all_plugins_play to load vars for managed-node2 44071 1727204721.78226: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204721.78229: Calling groups_plugins_play to load vars for managed-node2 44071 1727204721.78904: done sending task result for task 127b8e07-fff9-c964-7471-0000000021ad 44071 1727204721.78909: WORKER PROCESS EXITING 44071 1727204721.82437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204721.89351: done with get_vars() 44071 1727204721.89450: done getting variables 44071 1727204721.89647: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.143) 0:02:14.213 ***** 44071 1727204721.89692: entering _queue_task() for managed-node2/service 44071 1727204721.90732: worker is 1 (out of 1 available) 44071 1727204721.90750: exiting _queue_task() for managed-node2/service 44071 1727204721.90767: done queuing things up, now waiting for results queue to drain 44071 1727204721.90769: waiting for pending results... 
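The same empty network_state also skips the python3-libnmstate install at main.yml:96, as shown above, and the service task queued at the end of this block (Restart NetworkManager due to wireless or team interfaces, main.yml:109) is skipped a little further down for the same wireless/team reason as the earlier consent task. A sketch of the libnmstate install consistent with the logged name, module, and guard:

- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate
    state: present            # assumed
  when: network_state != {}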
44071 1727204721.91506: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204721.91673: in run() - task 127b8e07-fff9-c964-7471-0000000021ae 44071 1727204721.91703: variable 'ansible_search_path' from source: unknown 44071 1727204721.91708: variable 'ansible_search_path' from source: unknown 44071 1727204721.91740: calling self._execute() 44071 1727204721.91858: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204721.91864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204721.91900: variable 'omit' from source: magic vars 44071 1727204721.92395: variable 'ansible_distribution_major_version' from source: facts 44071 1727204721.92400: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204721.92613: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204721.92802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204721.96619: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204721.96877: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204721.97018: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204721.97068: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204721.97096: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204721.97458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204721.97493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204721.97521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204721.97708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204721.97762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204721.97879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204721.97914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204721.97960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 44071 1727204721.98016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204721.98058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204721.98111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204721.98167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204721.98188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204721.98244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204721.98274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204721.98543: variable 'network_connections' from source: include params 44071 1727204721.98546: variable 'interface' from source: play vars 44071 1727204721.98607: variable 'interface' from source: play vars 44071 1727204721.98710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204721.98913: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204721.98988: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204721.99028: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204721.99072: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204721.99126: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204721.99167: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204721.99207: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204721.99252: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204721.99342: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204721.99691: variable 'network_connections' from source: include params 44071 1727204721.99694: variable 'interface' 
from source: play vars 44071 1727204721.99764: variable 'interface' from source: play vars 44071 1727204721.99811: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204721.99840: when evaluation is False, skipping this task 44071 1727204721.99843: _execute() done 44071 1727204721.99845: dumping result to json 44071 1727204721.99908: done dumping result, returning 44071 1727204721.99911: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-0000000021ae] 44071 1727204721.99914: sending task result for task 127b8e07-fff9-c964-7471-0000000021ae 44071 1727204722.00374: done sending task result for task 127b8e07-fff9-c964-7471-0000000021ae 44071 1727204722.00382: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204722.00477: no more pending results, returning what we have 44071 1727204722.00481: results queue empty 44071 1727204722.00482: checking for any_errors_fatal 44071 1727204722.00494: done checking for any_errors_fatal 44071 1727204722.00495: checking for max_fail_percentage 44071 1727204722.00497: done checking for max_fail_percentage 44071 1727204722.00498: checking to see if all hosts have failed and the running result is not ok 44071 1727204722.00499: done checking to see if all hosts have failed 44071 1727204722.00500: getting the remaining hosts for this loop 44071 1727204722.00501: done getting the remaining hosts for this loop 44071 1727204722.00506: getting the next task for host managed-node2 44071 1727204722.00515: done getting next task for host managed-node2 44071 1727204722.00519: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204722.00524: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204722.00548: getting variables 44071 1727204722.00550: in VariableManager get_vars() 44071 1727204722.00750: Calling all_inventory to load vars for managed-node2 44071 1727204722.00753: Calling groups_inventory to load vars for managed-node2 44071 1727204722.00756: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204722.00975: Calling all_plugins_play to load vars for managed-node2 44071 1727204722.00980: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204722.00985: Calling groups_plugins_play to load vars for managed-node2 44071 1727204722.04480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204722.08395: done with get_vars() 44071 1727204722.08453: done getting variables 44071 1727204722.08682: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.190) 0:02:14.403 ***** 44071 1727204722.08722: entering _queue_task() for managed-node2/service 44071 1727204722.09730: worker is 1 (out of 1 available) 44071 1727204722.09745: exiting _queue_task() for managed-node2/service 44071 1727204722.09761: done queuing things up, now waiting for results queue to drain 44071 1727204722.09762: waiting for pending results... 44071 1727204722.10557: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204722.10813: in run() - task 127b8e07-fff9-c964-7471-0000000021af 44071 1727204722.10838: variable 'ansible_search_path' from source: unknown 44071 1727204722.10842: variable 'ansible_search_path' from source: unknown 44071 1727204722.10921: calling self._execute() 44071 1727204722.11027: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204722.11042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204722.11062: variable 'omit' from source: magic vars 44071 1727204722.11574: variable 'ansible_distribution_major_version' from source: facts 44071 1727204722.11578: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204722.11728: variable 'network_provider' from source: set_fact 44071 1727204722.11740: variable 'network_state' from source: role '' defaults 44071 1727204722.11755: Evaluated conditional (network_provider == "nm" or network_state != {}): True 44071 1727204722.11768: variable 'omit' from source: magic vars 44071 1727204722.11858: variable 'omit' from source: magic vars 44071 1727204722.11901: variable 'network_service_name' from source: role '' defaults 44071 1727204722.11984: variable 'network_service_name' from source: role '' defaults 44071 1727204722.12116: variable '__network_provider_setup' from source: role '' defaults 44071 1727204722.12138: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204722.12227: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204722.12230: variable '__network_packages_default_nm' from source: role '' defaults 
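Unlike the preceding tasks, Enable and start NetworkManager (main.yml:122) passes its guard here: network_provider == "nm" or network_state != {} evaluates to True, and since network_state != {} was False in the earlier checks, it is the nm provider set via set_fact that carries the condition. The trace then resolves network_service_name and the __network_provider_setup defaults before running the service action. A hedged sketch of such a task; that network_service_name resolves to NetworkManager.service is an assumption based on the nm defaults being loaded, not something stated in the log:

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"   # assumed to resolve to NetworkManager.service under the nm provider
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}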
44071 1727204722.12297: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204722.12543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204722.17042: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204722.17147: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204722.17159: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204722.17206: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204722.17241: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204722.17349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204722.17397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204722.17430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204722.17490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204722.17511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204722.17571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204722.17670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204722.17674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204722.17683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204722.17714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204722.18019: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204722.18101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204722.18140: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204722.18172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204722.18217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204722.18244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204722.18360: variable 'ansible_python' from source: facts 44071 1727204722.18386: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204722.18490: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204722.18673: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204722.18733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204722.18765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204722.18805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204722.18847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204722.18863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204722.18924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204722.18958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204722.18991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204722.19049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204722.19074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204722.19252: variable 'network_connections' from 
source: include params 44071 1727204722.19268: variable 'interface' from source: play vars 44071 1727204722.19364: variable 'interface' from source: play vars 44071 1727204722.19548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204722.19740: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204722.19830: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204722.19895: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204722.19945: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204722.20035: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204722.20078: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204722.20172: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204722.20175: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204722.20239: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204722.20603: variable 'network_connections' from source: include params 44071 1727204722.20616: variable 'interface' from source: play vars 44071 1727204722.20716: variable 'interface' from source: play vars 44071 1727204722.20787: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204722.20962: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204722.21195: variable 'network_connections' from source: include params 44071 1727204722.21205: variable 'interface' from source: play vars 44071 1727204722.21289: variable 'interface' from source: play vars 44071 1727204722.21326: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204722.21419: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204722.21771: variable 'network_connections' from source: include params 44071 1727204722.21782: variable 'interface' from source: play vars 44071 1727204722.21869: variable 'interface' from source: play vars 44071 1727204722.21940: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204722.22014: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204722.22027: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204722.22101: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204722.22372: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204722.22906: variable 'network_connections' from source: include params 44071 1727204722.22918: variable 'interface' from source: play vars 44071 1727204722.22998: variable 'interface' from 
source: play vars 44071 1727204722.23015: variable 'ansible_distribution' from source: facts 44071 1727204722.23024: variable '__network_rh_distros' from source: role '' defaults 44071 1727204722.23033: variable 'ansible_distribution_major_version' from source: facts 44071 1727204722.23073: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204722.23277: variable 'ansible_distribution' from source: facts 44071 1727204722.23372: variable '__network_rh_distros' from source: role '' defaults 44071 1727204722.23375: variable 'ansible_distribution_major_version' from source: facts 44071 1727204722.23379: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204722.23512: variable 'ansible_distribution' from source: facts 44071 1727204722.23522: variable '__network_rh_distros' from source: role '' defaults 44071 1727204722.23532: variable 'ansible_distribution_major_version' from source: facts 44071 1727204722.23577: variable 'network_provider' from source: set_fact 44071 1727204722.23615: variable 'omit' from source: magic vars 44071 1727204722.23652: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204722.23698: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204722.23731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204722.23755: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204722.23775: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204722.23811: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204722.23827: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204722.23836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204722.24071: Set connection var ansible_connection to ssh 44071 1727204722.24075: Set connection var ansible_timeout to 10 44071 1727204722.24077: Set connection var ansible_pipelining to False 44071 1727204722.24080: Set connection var ansible_shell_type to sh 44071 1727204722.24082: Set connection var ansible_shell_executable to /bin/sh 44071 1727204722.24084: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204722.24086: variable 'ansible_shell_executable' from source: unknown 44071 1727204722.24088: variable 'ansible_connection' from source: unknown 44071 1727204722.24091: variable 'ansible_module_compression' from source: unknown 44071 1727204722.24093: variable 'ansible_shell_type' from source: unknown 44071 1727204722.24095: variable 'ansible_shell_executable' from source: unknown 44071 1727204722.24097: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204722.24099: variable 'ansible_pipelining' from source: unknown 44071 1727204722.24101: variable 'ansible_timeout' from source: unknown 44071 1727204722.24104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204722.24190: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204722.24222: variable 'omit' from source: magic vars 44071 1727204722.24233: starting attempt loop 44071 1727204722.24271: running the handler 44071 1727204722.24337: variable 'ansible_facts' from source: unknown 44071 1727204722.25334: _low_level_execute_command(): starting 44071 1727204722.25348: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204722.26208: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204722.26242: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204722.26267: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204722.26293: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204722.26411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204722.28462: stdout chunk (state=3): >>>/root <<< 44071 1727204722.28485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204722.28685: stdout chunk (state=3): >>><<< 44071 1727204722.28691: stderr chunk (state=3): >>><<< 44071 1727204722.28695: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204722.28698: _low_level_execute_command(): starting 44071 
1727204722.28701: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204722.2862048-51489-272286303415503 `" && echo ansible-tmp-1727204722.2862048-51489-272286303415503="` echo /root/.ansible/tmp/ansible-tmp-1727204722.2862048-51489-272286303415503 `" ) && sleep 0' 44071 1727204722.30135: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204722.30140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204722.30142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204722.30145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204722.30147: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204722.30150: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204722.30238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204722.30242: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204722.30244: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204722.30246: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44071 1727204722.30248: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204722.30250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204722.30252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204722.30254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204722.30256: stderr chunk (state=3): >>>debug2: match found <<< 44071 1727204722.30258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204722.30680: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204722.30684: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204722.32636: stdout chunk (state=3): >>>ansible-tmp-1727204722.2862048-51489-272286303415503=/root/.ansible/tmp/ansible-tmp-1727204722.2862048-51489-272286303415503 <<< 44071 1727204722.32741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204722.32839: stderr chunk (state=3): >>><<< 44071 1727204722.32843: stdout chunk (state=3): >>><<< 44071 1727204722.32871: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204722.2862048-51489-272286303415503=/root/.ansible/tmp/ansible-tmp-1727204722.2862048-51489-272286303415503 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204722.32905: variable 'ansible_module_compression' from source: unknown 44071 1727204722.32966: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 44071 1727204722.33030: variable 'ansible_facts' from source: unknown 44071 1727204722.33377: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204722.2862048-51489-272286303415503/AnsiballZ_systemd.py 44071 1727204722.33553: Sending initial data 44071 1727204722.33557: Sent initial data (156 bytes) 44071 1727204722.34094: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204722.34138: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204722.34142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204722.34181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204722.34217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204722.34279: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204722.34282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204722.34505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204722.36105: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 44071 1727204722.36112: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 44071 1727204722.36129: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 
1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204722.36223: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204722.36326: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpd939i30o /root/.ansible/tmp/ansible-tmp-1727204722.2862048-51489-272286303415503/AnsiballZ_systemd.py <<< 44071 1727204722.36341: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204722.2862048-51489-272286303415503/AnsiballZ_systemd.py" <<< 44071 1727204722.36388: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 44071 1727204722.36414: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpd939i30o" to remote "/root/.ansible/tmp/ansible-tmp-1727204722.2862048-51489-272286303415503/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204722.2862048-51489-272286303415503/AnsiballZ_systemd.py" <<< 44071 1727204722.38688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204722.38692: stdout chunk (state=3): >>><<< 44071 1727204722.38694: stderr chunk (state=3): >>><<< 44071 1727204722.38709: done transferring module to remote 44071 1727204722.38723: _low_level_execute_command(): starting 44071 1727204722.38729: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204722.2862048-51489-272286303415503/ /root/.ansible/tmp/ansible-tmp-1727204722.2862048-51489-272286303415503/AnsiballZ_systemd.py && sleep 0' 44071 1727204722.39692: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204722.39706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204722.39773: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204722.39778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204722.39855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204722.39871: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204722.39932: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204722.42218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204722.42222: stdout chunk (state=3): >>><<< 44071 1727204722.42225: stderr chunk (state=3): >>><<< 44071 1727204722.42228: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204722.42230: _low_level_execute_command(): starting 44071 1727204722.42232: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204722.2862048-51489-272286303415503/AnsiballZ_systemd.py && sleep 0' 44071 1727204722.43340: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204722.43359: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204722.43380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204722.43398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204722.43418: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204722.43486: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204722.43537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204722.43555: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204722.43787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204722.43889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204722.75798: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": 
"terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4526080", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3513876480", "CPUUsageNSec": "1689498000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": 
"infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", 
"WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 44071 1727204722.77763: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204722.77770: stdout chunk (state=3): >>><<< 44071 1727204722.77773: stderr chunk (state=3): >>><<< 44071 1727204722.77794: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4526080", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3513876480", "CPUUsageNSec": "1689498000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": 
"infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204722.78235: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204722.2862048-51489-272286303415503/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204722.78759: _low_level_execute_command(): starting 44071 1727204722.78763: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204722.2862048-51489-272286303415503/ > /dev/null 2>&1 && sleep 0' 44071 1727204722.80077: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204722.80678: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204722.80977: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204722.81136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204722.83298: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 44071 1727204722.83302: stdout chunk (state=3): >>><<< 44071 1727204722.83305: stderr chunk (state=3): >>><<< 44071 1727204722.83325: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204722.83333: handler run complete 44071 1727204722.83442: attempt loop complete, returning result 44071 1727204722.83446: _execute() done 44071 1727204722.83448: dumping result to json 44071 1727204722.83469: done dumping result, returning 44071 1727204722.83578: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-c964-7471-0000000021af] 44071 1727204722.83583: sending task result for task 127b8e07-fff9-c964-7471-0000000021af 44071 1727204722.84371: done sending task result for task 127b8e07-fff9-c964-7471-0000000021af 44071 1727204722.84377: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204722.84438: no more pending results, returning what we have 44071 1727204722.84440: results queue empty 44071 1727204722.84441: checking for any_errors_fatal 44071 1727204722.84447: done checking for any_errors_fatal 44071 1727204722.84448: checking for max_fail_percentage 44071 1727204722.84449: done checking for max_fail_percentage 44071 1727204722.84450: checking to see if all hosts have failed and the running result is not ok 44071 1727204722.84451: done checking to see if all hosts have failed 44071 1727204722.84451: getting the remaining hosts for this loop 44071 1727204722.84453: done getting the remaining hosts for this loop 44071 1727204722.84457: getting the next task for host managed-node2 44071 1727204722.84464: done getting next task for host managed-node2 44071 1727204722.84469: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204722.84474: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204722.84492: getting variables 44071 1727204722.84494: in VariableManager get_vars() 44071 1727204722.84534: Calling all_inventory to load vars for managed-node2 44071 1727204722.84537: Calling groups_inventory to load vars for managed-node2 44071 1727204722.84539: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204722.84550: Calling all_plugins_play to load vars for managed-node2 44071 1727204722.84553: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204722.84556: Calling groups_plugins_play to load vars for managed-node2 44071 1727204722.88185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204722.91040: done with get_vars() 44071 1727204722.91295: done getting variables 44071 1727204722.91364: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.828) 0:02:15.232 ***** 44071 1727204722.91618: entering _queue_task() for managed-node2/service 44071 1727204722.92234: worker is 1 (out of 1 available) 44071 1727204722.92249: exiting _queue_task() for managed-node2/service 44071 1727204722.92668: done queuing things up, now waiting for results queue to drain 44071 1727204722.92671: waiting for pending results... 
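
The censored result above comes from the ansible.legacy.systemd module invoked with name=NetworkManager, state=started, enabled=True while no_log was in effect, which is why only "censored" and "changed": false are printed. A minimal task sketch consistent with those logged module arguments (a reconstruction for illustration, not the role's verbatim source) would be:

  - name: Enable and start NetworkManager
    ansible.builtin.systemd:     # the trace shows the service task resolving to the systemd backend
      name: NetworkManager
      state: started
      enabled: true
    no_log: true                 # matches the "output has been hidden" message in the result above
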
44071 1727204722.93026: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204722.93392: in run() - task 127b8e07-fff9-c964-7471-0000000021b0 44071 1727204722.93411: variable 'ansible_search_path' from source: unknown 44071 1727204722.93415: variable 'ansible_search_path' from source: unknown 44071 1727204722.93463: calling self._execute() 44071 1727204722.93698: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204722.93702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204722.93884: variable 'omit' from source: magic vars 44071 1727204722.94773: variable 'ansible_distribution_major_version' from source: facts 44071 1727204722.94802: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204722.95022: variable 'network_provider' from source: set_fact 44071 1727204722.95026: Evaluated conditional (network_provider == "nm"): True 44071 1727204722.95252: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204722.95460: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204722.95849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204723.01222: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204723.01574: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204723.01592: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204723.01636: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204723.01869: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204723.02112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204723.02212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204723.02249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204723.02372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204723.02449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204723.02648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204723.02651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 44071 1727204723.02653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204723.02798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204723.02819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204723.03071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204723.03074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204723.03086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204723.03142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204723.03210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204723.03740: variable 'network_connections' from source: include params 44071 1727204723.03744: variable 'interface' from source: play vars 44071 1727204723.03869: variable 'interface' from source: play vars 44071 1727204723.04174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204723.04648: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204723.04703: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204723.04807: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204723.04963: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204723.05024: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204723.05192: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204723.05301: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204723.05474: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
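
At this point the task "Enable and start wpa_supplicant" (tasks/main.yml:133) has already passed the network_provider == "nm" check, and the role default __network_wpa_supplicant_required is evaluated next; it comes out False below, so the task is skipped. A hedged sketch of a service task guarded by the conditionals seen in this trace (the module, state, and enabled values are assumptions; the role's actual task may differ):

  - name: Enable and start wpa_supplicant
    ansible.builtin.service:     # assumed module; the trace only shows the 'service' action plugin being loaded
      name: wpa_supplicant       # assumed service name, matching the task title
      state: started
      enabled: true
    when:
      - network_provider == "nm"            # evaluated True above
      - __network_wpa_supplicant_required   # evaluated False below, producing the skip
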
44071 1727204723.05479: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204723.06290: variable 'network_connections' from source: include params 44071 1727204723.06312: variable 'interface' from source: play vars 44071 1727204723.06545: variable 'interface' from source: play vars 44071 1727204723.06740: Evaluated conditional (__network_wpa_supplicant_required): False 44071 1727204723.06743: when evaluation is False, skipping this task 44071 1727204723.06745: _execute() done 44071 1727204723.06747: dumping result to json 44071 1727204723.06749: done dumping result, returning 44071 1727204723.06754: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-c964-7471-0000000021b0] 44071 1727204723.06776: sending task result for task 127b8e07-fff9-c964-7471-0000000021b0 44071 1727204723.06991: done sending task result for task 127b8e07-fff9-c964-7471-0000000021b0 44071 1727204723.06994: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 44071 1727204723.07122: no more pending results, returning what we have 44071 1727204723.07126: results queue empty 44071 1727204723.07127: checking for any_errors_fatal 44071 1727204723.07160: done checking for any_errors_fatal 44071 1727204723.07161: checking for max_fail_percentage 44071 1727204723.07164: done checking for max_fail_percentage 44071 1727204723.07168: checking to see if all hosts have failed and the running result is not ok 44071 1727204723.07169: done checking to see if all hosts have failed 44071 1727204723.07170: getting the remaining hosts for this loop 44071 1727204723.07171: done getting the remaining hosts for this loop 44071 1727204723.07176: getting the next task for host managed-node2 44071 1727204723.07186: done getting next task for host managed-node2 44071 1727204723.07190: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204723.07195: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204723.07216: getting variables 44071 1727204723.07218: in VariableManager get_vars() 44071 1727204723.07411: Calling all_inventory to load vars for managed-node2 44071 1727204723.07415: Calling groups_inventory to load vars for managed-node2 44071 1727204723.07418: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204723.07430: Calling all_plugins_play to load vars for managed-node2 44071 1727204723.07433: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204723.07437: Calling groups_plugins_play to load vars for managed-node2 44071 1727204723.12563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204723.17136: done with get_vars() 44071 1727204723.17189: done getting variables 44071 1727204723.17288: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:05:23 -0400 (0:00:00.257) 0:02:15.489 ***** 44071 1727204723.17333: entering _queue_task() for managed-node2/service 44071 1727204723.17980: worker is 1 (out of 1 available) 44071 1727204723.17993: exiting _queue_task() for managed-node2/service 44071 1727204723.18007: done queuing things up, now waiting for results queue to drain 44071 1727204723.18009: waiting for pending results... 44071 1727204723.18291: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204723.18305: in run() - task 127b8e07-fff9-c964-7471-0000000021b1 44071 1727204723.18503: variable 'ansible_search_path' from source: unknown 44071 1727204723.18507: variable 'ansible_search_path' from source: unknown 44071 1727204723.18511: calling self._execute() 44071 1727204723.18514: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204723.18612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204723.18616: variable 'omit' from source: magic vars 44071 1727204723.19084: variable 'ansible_distribution_major_version' from source: facts 44071 1727204723.19096: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204723.19550: variable 'network_provider' from source: set_fact 44071 1727204723.19557: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204723.19562: when evaluation is False, skipping this task 44071 1727204723.19567: _execute() done 44071 1727204723.19570: dumping result to json 44071 1727204723.19574: done dumping result, returning 44071 1727204723.19583: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-c964-7471-0000000021b1] 44071 1727204723.19591: sending task result for task 127b8e07-fff9-c964-7471-0000000021b1 44071 1727204723.19830: done sending task result for task 127b8e07-fff9-c964-7471-0000000021b1 44071 1727204723.19835: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 
1727204723.20023: no more pending results, returning what we have 44071 1727204723.20027: results queue empty 44071 1727204723.20028: checking for any_errors_fatal 44071 1727204723.20041: done checking for any_errors_fatal 44071 1727204723.20042: checking for max_fail_percentage 44071 1727204723.20044: done checking for max_fail_percentage 44071 1727204723.20045: checking to see if all hosts have failed and the running result is not ok 44071 1727204723.20046: done checking to see if all hosts have failed 44071 1727204723.20046: getting the remaining hosts for this loop 44071 1727204723.20048: done getting the remaining hosts for this loop 44071 1727204723.20058: getting the next task for host managed-node2 44071 1727204723.20071: done getting next task for host managed-node2 44071 1727204723.20079: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204723.20085: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204723.20112: getting variables 44071 1727204723.20114: in VariableManager get_vars() 44071 1727204723.20472: Calling all_inventory to load vars for managed-node2 44071 1727204723.20476: Calling groups_inventory to load vars for managed-node2 44071 1727204723.20479: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204723.20491: Calling all_plugins_play to load vars for managed-node2 44071 1727204723.20494: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204723.20497: Calling groups_plugins_play to load vars for managed-node2 44071 1727204723.24369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204723.27549: done with get_vars() 44071 1727204723.27601: done getting variables 44071 1727204723.27675: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:05:23 -0400 (0:00:00.103) 0:02:15.593 ***** 44071 1727204723.27722: entering _queue_task() for managed-node2/copy 44071 1727204723.28349: worker is 1 (out of 1 available) 44071 1727204723.28369: exiting _queue_task() for managed-node2/copy 44071 1727204723.28382: done queuing things up, now waiting for results queue to drain 44071 1727204723.28384: waiting for pending results... 44071 1727204723.28789: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204723.28901: in run() - task 127b8e07-fff9-c964-7471-0000000021b2 44071 1727204723.28931: variable 'ansible_search_path' from source: unknown 44071 1727204723.29011: variable 'ansible_search_path' from source: unknown 44071 1727204723.29020: calling self._execute() 44071 1727204723.29137: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204723.29152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204723.29168: variable 'omit' from source: magic vars 44071 1727204723.29639: variable 'ansible_distribution_major_version' from source: facts 44071 1727204723.29662: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204723.29809: variable 'network_provider' from source: set_fact 44071 1727204723.29822: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204723.29884: when evaluation is False, skipping this task 44071 1727204723.29890: _execute() done 44071 1727204723.29893: dumping result to json 44071 1727204723.29895: done dumping result, returning 44071 1727204723.29900: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-c964-7471-0000000021b2] 44071 1727204723.29902: sending task result for task 127b8e07-fff9-c964-7471-0000000021b2 44071 1727204723.30095: done sending task result for task 127b8e07-fff9-c964-7471-0000000021b2 44071 1727204723.30100: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 44071 1727204723.30163: no more pending results, returning what we have 44071 1727204723.30169: results queue empty 44071 1727204723.30171: checking for any_errors_fatal 44071 1727204723.30180: done checking for any_errors_fatal 44071 1727204723.30181: checking for max_fail_percentage 44071 1727204723.30183: done checking for max_fail_percentage 44071 1727204723.30184: checking to see if all hosts have failed and the running result is not ok 44071 1727204723.30185: done checking to see if all hosts have failed 44071 1727204723.30186: getting the remaining hosts for this loop 44071 1727204723.30187: done getting the remaining hosts for this loop 44071 1727204723.30193: getting the next task for host managed-node2 44071 1727204723.30204: done getting next task for host managed-node2 44071 1727204723.30209: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204723.30215: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204723.30244: getting variables 44071 1727204723.30247: in VariableManager get_vars() 44071 1727204723.30524: Calling all_inventory to load vars for managed-node2 44071 1727204723.30528: Calling groups_inventory to load vars for managed-node2 44071 1727204723.30530: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204723.30550: Calling all_plugins_play to load vars for managed-node2 44071 1727204723.30553: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204723.30556: Calling groups_plugins_play to load vars for managed-node2 44071 1727204723.32807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204723.35909: done with get_vars() 44071 1727204723.36071: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:05:23 -0400 (0:00:00.085) 0:02:15.679 ***** 44071 1727204723.36286: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204723.37101: worker is 1 (out of 1 available) 44071 1727204723.37230: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204723.37247: done queuing things up, now waiting for results queue to drain 44071 1727204723.37249: waiting for pending results... 44071 1727204723.37887: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204723.38059: in run() - task 127b8e07-fff9-c964-7471-0000000021b3 44071 1727204723.38078: variable 'ansible_search_path' from source: unknown 44071 1727204723.38081: variable 'ansible_search_path' from source: unknown 44071 1727204723.38122: calling self._execute() 44071 1727204723.38459: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204723.38463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204723.38469: variable 'omit' from source: magic vars 44071 1727204723.39576: variable 'ansible_distribution_major_version' from source: facts 44071 1727204723.39580: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204723.39583: variable 'omit' from source: magic vars 44071 1727204723.39586: variable 'omit' from source: magic vars 44071 1727204723.39775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204723.42641: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204723.42821: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204723.42863: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204723.42901: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204723.42930: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204723.43335: variable 'network_provider' from source: set_fact 44071 1727204723.43686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 44071 1727204723.43717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204723.43744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204723.43789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204723.43808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204723.43995: variable 'omit' from source: magic vars 44071 1727204723.44122: variable 'omit' from source: magic vars 44071 1727204723.44710: variable 'network_connections' from source: include params 44071 1727204723.44714: variable 'interface' from source: play vars 44071 1727204723.44717: variable 'interface' from source: play vars 44071 1727204723.44975: variable 'omit' from source: magic vars 44071 1727204723.44979: variable '__lsr_ansible_managed' from source: task vars 44071 1727204723.44981: variable '__lsr_ansible_managed' from source: task vars 44071 1727204723.45232: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 44071 1727204723.45509: Loaded config def from plugin (lookup/template) 44071 1727204723.45526: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 44071 1727204723.45561: File lookup term: get_ansible_managed.j2 44071 1727204723.45571: variable 'ansible_search_path' from source: unknown 44071 1727204723.45583: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 44071 1727204723.45602: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 44071 1727204723.45632: variable 'ansible_search_path' from source: unknown 44071 1727204723.60410: variable 'ansible_managed' from source: unknown 44071 1727204723.60726: variable 'omit' from source: magic vars 44071 1727204723.60761: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204723.60792: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204723.60972: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204723.60976: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204723.60979: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204723.61188: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204723.61192: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204723.61195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204723.61299: Set connection var ansible_connection to ssh 44071 1727204723.61305: Set connection var ansible_timeout to 10 44071 1727204723.61312: Set connection var ansible_pipelining to False 44071 1727204723.61317: Set connection var ansible_shell_type to sh 44071 1727204723.61324: Set connection var ansible_shell_executable to /bin/sh 44071 1727204723.61332: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204723.61485: variable 'ansible_shell_executable' from source: unknown 44071 1727204723.61488: variable 'ansible_connection' from source: unknown 44071 1727204723.61491: variable 'ansible_module_compression' from source: unknown 44071 1727204723.61494: variable 'ansible_shell_type' from source: unknown 44071 1727204723.61496: variable 'ansible_shell_executable' from source: unknown 44071 1727204723.61499: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204723.61504: variable 'ansible_pipelining' from source: unknown 44071 1727204723.61507: variable 'ansible_timeout' from source: unknown 44071 1727204723.61512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204723.61811: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204723.61823: variable 'omit' from source: magic vars 44071 1727204723.61889: starting attempt loop 44071 1727204723.61892: running the handler 44071 1727204723.61895: _low_level_execute_command(): starting 44071 1727204723.61897: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204723.63639: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204723.63645: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204723.63714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204723.63831: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204723.64004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204723.65877: stdout chunk (state=3): >>>/root <<< 44071 1727204723.65880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204723.65971: stderr chunk (state=3): >>><<< 44071 1727204723.65975: stdout chunk (state=3): >>><<< 44071 1727204723.66081: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204723.66100: _low_level_execute_command(): starting 44071 1727204723.66107: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204723.6608121-51627-79686329341052 `" && echo ansible-tmp-1727204723.6608121-51627-79686329341052="` echo /root/.ansible/tmp/ansible-tmp-1727204723.6608121-51627-79686329341052 `" ) && sleep 0' 44071 1727204723.67776: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204723.67781: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204723.67990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204723.68188: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204723.68193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204723.68196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204723.68423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204723.70203: stdout chunk (state=3): >>>ansible-tmp-1727204723.6608121-51627-79686329341052=/root/.ansible/tmp/ansible-tmp-1727204723.6608121-51627-79686329341052 <<< 44071 1727204723.70575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204723.70580: stdout chunk (state=3): >>><<< 44071 1727204723.70582: stderr chunk (state=3): >>><<< 44071 1727204723.70636: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204723.6608121-51627-79686329341052=/root/.ansible/tmp/ansible-tmp-1727204723.6608121-51627-79686329341052 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204723.70872: variable 'ansible_module_compression' from source: unknown 44071 1727204723.70876: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 44071 1727204723.70896: variable 'ansible_facts' from source: unknown 44071 1727204723.71109: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204723.6608121-51627-79686329341052/AnsiballZ_network_connections.py 44071 1727204723.71905: Sending initial data 44071 1727204723.71910: Sent initial data (167 bytes) 44071 1727204723.73678: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204723.73689: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204723.73996: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204723.74099: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204723.74343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204723.76076: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204723.76242: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204723.76282: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp73kjsl3s /root/.ansible/tmp/ansible-tmp-1727204723.6608121-51627-79686329341052/AnsiballZ_network_connections.py <<< 44071 1727204723.76286: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204723.6608121-51627-79686329341052/AnsiballZ_network_connections.py" <<< 44071 1727204723.77025: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp73kjsl3s" to remote "/root/.ansible/tmp/ansible-tmp-1727204723.6608121-51627-79686329341052/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204723.6608121-51627-79686329341052/AnsiballZ_network_connections.py" <<< 44071 1727204723.79675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204723.79720: stderr chunk (state=3): >>><<< 44071 1727204723.79724: stdout chunk (state=3): >>><<< 44071 1727204723.79764: done transferring module to remote 44071 1727204723.79778: _low_level_execute_command(): starting 44071 1727204723.79783: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204723.6608121-51627-79686329341052/ /root/.ansible/tmp/ansible-tmp-1727204723.6608121-51627-79686329341052/AnsiballZ_network_connections.py && sleep 0' 44071 1727204723.82104: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204723.82427: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204723.82912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204723.82983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204723.85051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204723.85055: stdout chunk (state=3): >>><<< 44071 1727204723.85058: stderr chunk (state=3): >>><<< 44071 1727204723.85061: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204723.85063: _low_level_execute_command(): starting 44071 1727204723.85069: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204723.6608121-51627-79686329341052/AnsiballZ_network_connections.py && sleep 0' 44071 1727204723.86286: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204723.86352: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204723.86473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204723.86478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204723.86549: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204723.86670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204723.86691: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204723.86793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204724.16719: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, bc2e78b9-9d7f-4720-aaef-6b1a6ee99c01\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 44071 1727204724.21427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204724.21432: stdout chunk (state=3): >>><<< 44071 1727204724.21441: stderr chunk (state=3): >>><<< 44071 1727204724.21483: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, bc2e78b9-9d7f-4720-aaef-6b1a6ee99c01\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204724.21530: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204723.6608121-51627-79686329341052/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204724.21543: _low_level_execute_command(): starting 44071 1727204724.21552: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204723.6608121-51627-79686329341052/ > /dev/null 2>&1 && sleep 0' 44071 1727204724.22288: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204724.22302: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204724.22316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204724.22634: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204724.25775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204724.25779: stdout chunk (state=3): >>><<< 44071 1727204724.25782: stderr chunk (state=3): >>><<< 44071 1727204724.25786: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204724.25789: handler run complete 44071 1727204724.25791: attempt loop complete, returning result 44071 1727204724.25794: _execute() done 44071 1727204724.25797: dumping result to json 44071 1727204724.25799: done dumping result, returning 44071 1727204724.25802: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-c964-7471-0000000021b3] 44071 1727204724.25805: sending task result for task 127b8e07-fff9-c964-7471-0000000021b3 44071 1727204724.25901: done sending task result for task 127b8e07-fff9-c964-7471-0000000021b3 44071 1727204724.25906: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, bc2e78b9-9d7f-4720-aaef-6b1a6ee99c01 44071 1727204724.26024: no more pending results, returning what we have 44071 1727204724.26028: results queue empty 44071 1727204724.26029: checking for any_errors_fatal 44071 1727204724.26042: done checking for any_errors_fatal 44071 1727204724.26047: checking for max_fail_percentage 44071 1727204724.26049: done checking for max_fail_percentage 44071 1727204724.26050: checking to see if all hosts have failed and the running result is not ok 44071 1727204724.26051: done checking to see if all hosts have failed 44071 1727204724.26051: getting the remaining hosts for this loop 44071 1727204724.26053: done getting the remaining hosts for this loop 44071 1727204724.26058: getting the next task for host managed-node2 44071 1727204724.26289: done getting next task for host managed-node2 44071 1727204724.26294: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204724.26300: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204724.26313: getting variables 44071 1727204724.26315: in VariableManager get_vars() 44071 1727204724.26368: Calling all_inventory to load vars for managed-node2 44071 1727204724.26372: Calling groups_inventory to load vars for managed-node2 44071 1727204724.26374: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204724.26392: Calling all_plugins_play to load vars for managed-node2 44071 1727204724.26396: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204724.26399: Calling groups_plugins_play to load vars for managed-node2 44071 1727204724.28900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204724.31828: done with get_vars() 44071 1727204724.32084: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.959) 0:02:16.638 ***** 44071 1727204724.32208: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204724.32637: worker is 1 (out of 1 available) 44071 1727204724.32652: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204724.32869: done queuing things up, now waiting for results queue to drain 44071 1727204724.32872: waiting for pending results... 
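For context, the module_args recorded above for fedora.linux_system_roles.network_connections (provider "nm"; one connection named "statebr" of type bridge with dhcp4 and auto6 disabled) are the kind of arguments the network role builds from its role variables. A minimal playbook sketch that would produce those arguments is shown below; the play name, host pattern, and overall layout are assumptions, since the playbook source itself does not appear in this log — only the profile values and provider do.

- name: Manage the statebr bridge profile        # play name is a placeholder
  hosts: managed-node2
  vars:
    network_provider: nm                         # corresponds to 'provider': 'nm' in the logged module_args
    network_connections:
      - name: statebr
        persistent_state: present
        type: bridge
        ip:
          dhcp4: false
          auto6: false
  roles:
    - fedora.linux_system_roles.network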
44071 1727204724.33033: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204724.33213: in run() - task 127b8e07-fff9-c964-7471-0000000021b4 44071 1727204724.33238: variable 'ansible_search_path' from source: unknown 44071 1727204724.33244: variable 'ansible_search_path' from source: unknown 44071 1727204724.33290: calling self._execute() 44071 1727204724.33415: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204724.33472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204724.33476: variable 'omit' from source: magic vars 44071 1727204724.33895: variable 'ansible_distribution_major_version' from source: facts 44071 1727204724.33914: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204724.34055: variable 'network_state' from source: role '' defaults 44071 1727204724.34074: Evaluated conditional (network_state != {}): False 44071 1727204724.34086: when evaluation is False, skipping this task 44071 1727204724.34092: _execute() done 44071 1727204724.34100: dumping result to json 44071 1727204724.34107: done dumping result, returning 44071 1727204724.34192: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-c964-7471-0000000021b4] 44071 1727204724.34195: sending task result for task 127b8e07-fff9-c964-7471-0000000021b4 44071 1727204724.34281: done sending task result for task 127b8e07-fff9-c964-7471-0000000021b4 44071 1727204724.34284: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204724.34358: no more pending results, returning what we have 44071 1727204724.34362: results queue empty 44071 1727204724.34363: checking for any_errors_fatal 44071 1727204724.34380: done checking for any_errors_fatal 44071 1727204724.34381: checking for max_fail_percentage 44071 1727204724.34383: done checking for max_fail_percentage 44071 1727204724.34384: checking to see if all hosts have failed and the running result is not ok 44071 1727204724.34385: done checking to see if all hosts have failed 44071 1727204724.34386: getting the remaining hosts for this loop 44071 1727204724.34388: done getting the remaining hosts for this loop 44071 1727204724.34394: getting the next task for host managed-node2 44071 1727204724.34404: done getting next task for host managed-node2 44071 1727204724.34410: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204724.34417: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204724.34444: getting variables 44071 1727204724.34446: in VariableManager get_vars() 44071 1727204724.34804: Calling all_inventory to load vars for managed-node2 44071 1727204724.34808: Calling groups_inventory to load vars for managed-node2 44071 1727204724.34811: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204724.34823: Calling all_plugins_play to load vars for managed-node2 44071 1727204724.34826: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204724.34830: Calling groups_plugins_play to load vars for managed-node2 44071 1727204724.36657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204724.38946: done with get_vars() 44071 1727204724.38979: done getting variables 44071 1727204724.39047: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.068) 0:02:16.707 ***** 44071 1727204724.39093: entering _queue_task() for managed-node2/debug 44071 1727204724.39544: worker is 1 (out of 1 available) 44071 1727204724.39558: exiting _queue_task() for managed-node2/debug 44071 1727204724.39575: done queuing things up, now waiting for results queue to drain 44071 1727204724.39576: waiting for pending results... 
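The skip above follows from the guard on the "Configure networking state" task at roles/network/tasks/main.yml:171: the log shows network_state coming from the role defaults as an empty dict, so the conditional network_state != {} evaluates to False. The same condition later skips the "Show debug messages for the network_state" task. A sketch of the guard pattern, assuming the usual shape of such a task; the module argument name is an assumption, since only the action plugin name and the condition appear in this log:

- name: Configure networking state
  fedora.linux_system_roles.network_state:
    # argument name below is assumed; the log only records the action plugin
    # (fedora.linux_system_roles.network_state) and the when-condition
    desired_state: "{{ network_state }}"
  when: network_state != {}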
44071 1727204724.39960: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204724.40155: in run() - task 127b8e07-fff9-c964-7471-0000000021b5 44071 1727204724.40184: variable 'ansible_search_path' from source: unknown 44071 1727204724.40193: variable 'ansible_search_path' from source: unknown 44071 1727204724.40241: calling self._execute() 44071 1727204724.40361: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204724.40378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204724.40393: variable 'omit' from source: magic vars 44071 1727204724.40852: variable 'ansible_distribution_major_version' from source: facts 44071 1727204724.40929: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204724.40934: variable 'omit' from source: magic vars 44071 1727204724.40955: variable 'omit' from source: magic vars 44071 1727204724.41000: variable 'omit' from source: magic vars 44071 1727204724.41076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204724.41121: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204724.41151: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204724.41183: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204724.41202: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204724.41262: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204724.41371: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204724.41374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204724.41416: Set connection var ansible_connection to ssh 44071 1727204724.41429: Set connection var ansible_timeout to 10 44071 1727204724.41442: Set connection var ansible_pipelining to False 44071 1727204724.41451: Set connection var ansible_shell_type to sh 44071 1727204724.41460: Set connection var ansible_shell_executable to /bin/sh 44071 1727204724.41474: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204724.41508: variable 'ansible_shell_executable' from source: unknown 44071 1727204724.41515: variable 'ansible_connection' from source: unknown 44071 1727204724.41522: variable 'ansible_module_compression' from source: unknown 44071 1727204724.41529: variable 'ansible_shell_type' from source: unknown 44071 1727204724.41539: variable 'ansible_shell_executable' from source: unknown 44071 1727204724.41546: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204724.41575: variable 'ansible_pipelining' from source: unknown 44071 1727204724.41582: variable 'ansible_timeout' from source: unknown 44071 1727204724.41589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204724.41745: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 
1727204724.41770: variable 'omit' from source: magic vars 44071 1727204724.41794: starting attempt loop 44071 1727204724.41800: running the handler 44071 1727204724.41961: variable '__network_connections_result' from source: set_fact 44071 1727204724.42051: handler run complete 44071 1727204724.42142: attempt loop complete, returning result 44071 1727204724.42145: _execute() done 44071 1727204724.42148: dumping result to json 44071 1727204724.42150: done dumping result, returning 44071 1727204724.42153: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-c964-7471-0000000021b5] 44071 1727204724.42155: sending task result for task 127b8e07-fff9-c964-7471-0000000021b5 44071 1727204724.42239: done sending task result for task 127b8e07-fff9-c964-7471-0000000021b5 44071 1727204724.42242: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, bc2e78b9-9d7f-4720-aaef-6b1a6ee99c01" ] } 44071 1727204724.42349: no more pending results, returning what we have 44071 1727204724.42352: results queue empty 44071 1727204724.42353: checking for any_errors_fatal 44071 1727204724.42364: done checking for any_errors_fatal 44071 1727204724.42366: checking for max_fail_percentage 44071 1727204724.42368: done checking for max_fail_percentage 44071 1727204724.42369: checking to see if all hosts have failed and the running result is not ok 44071 1727204724.42370: done checking to see if all hosts have failed 44071 1727204724.42371: getting the remaining hosts for this loop 44071 1727204724.42372: done getting the remaining hosts for this loop 44071 1727204724.42377: getting the next task for host managed-node2 44071 1727204724.42386: done getting next task for host managed-node2 44071 1727204724.42391: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204724.42398: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204724.42412: getting variables 44071 1727204724.42414: in VariableManager get_vars() 44071 1727204724.42776: Calling all_inventory to load vars for managed-node2 44071 1727204724.42780: Calling groups_inventory to load vars for managed-node2 44071 1727204724.42782: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204724.42794: Calling all_plugins_play to load vars for managed-node2 44071 1727204724.42798: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204724.42801: Calling groups_plugins_play to load vars for managed-node2 44071 1727204724.46543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204724.51547: done with get_vars() 44071 1727204724.51600: done getting variables 44071 1727204724.51876: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.128) 0:02:16.835 ***** 44071 1727204724.51924: entering _queue_task() for managed-node2/debug 44071 1727204724.52772: worker is 1 (out of 1 available) 44071 1727204724.52786: exiting _queue_task() for managed-node2/debug 44071 1727204724.52801: done queuing things up, now waiting for results queue to drain 44071 1727204724.52803: waiting for pending results... 44071 1727204724.53717: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204724.54075: in run() - task 127b8e07-fff9-c964-7471-0000000021b6 44071 1727204724.54079: variable 'ansible_search_path' from source: unknown 44071 1727204724.54084: variable 'ansible_search_path' from source: unknown 44071 1727204724.54298: calling self._execute() 44071 1727204724.54676: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204724.54681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204724.54684: variable 'omit' from source: magic vars 44071 1727204724.55485: variable 'ansible_distribution_major_version' from source: facts 44071 1727204724.55509: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204724.55561: variable 'omit' from source: magic vars 44071 1727204724.55769: variable 'omit' from source: magic vars 44071 1727204724.55824: variable 'omit' from source: magic vars 44071 1727204724.56377: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204724.56382: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204724.56384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204724.56387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204724.56398: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204724.56441: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204724.56452: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204724.56491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204724.56760: Set connection var ansible_connection to ssh 44071 1727204724.56781: Set connection var ansible_timeout to 10 44071 1727204724.56818: Set connection var ansible_pipelining to False 44071 1727204724.56830: Set connection var ansible_shell_type to sh 44071 1727204724.56927: Set connection var ansible_shell_executable to /bin/sh 44071 1727204724.56945: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204724.56984: variable 'ansible_shell_executable' from source: unknown 44071 1727204724.56997: variable 'ansible_connection' from source: unknown 44071 1727204724.57005: variable 'ansible_module_compression' from source: unknown 44071 1727204724.57011: variable 'ansible_shell_type' from source: unknown 44071 1727204724.57020: variable 'ansible_shell_executable' from source: unknown 44071 1727204724.57030: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204724.57174: variable 'ansible_pipelining' from source: unknown 44071 1727204724.57177: variable 'ansible_timeout' from source: unknown 44071 1727204724.57180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204724.57450: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204724.57516: variable 'omit' from source: magic vars 44071 1727204724.57527: starting attempt loop 44071 1727204724.57538: running the handler 44071 1727204724.57644: variable '__network_connections_result' from source: set_fact 44071 1727204724.57939: variable '__network_connections_result' from source: set_fact 44071 1727204724.58194: handler run complete 44071 1727204724.58302: attempt loop complete, returning result 44071 1727204724.58521: _execute() done 44071 1727204724.58529: dumping result to json 44071 1727204724.58542: done dumping result, returning 44071 1727204724.58558: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-c964-7471-0000000021b6] 44071 1727204724.58570: sending task result for task 127b8e07-fff9-c964-7471-0000000021b6 ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, bc2e78b9-9d7f-4720-aaef-6b1a6ee99c01\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, bc2e78b9-9d7f-4720-aaef-6b1a6ee99c01" ] } } 44071 1727204724.58984: no more pending results, returning what we have 44071 1727204724.58989: results queue empty 44071 1727204724.58989: checking for any_errors_fatal 44071 1727204724.58998: done checking for any_errors_fatal 44071 
1727204724.58999: checking for max_fail_percentage 44071 1727204724.59000: done checking for max_fail_percentage 44071 1727204724.59002: checking to see if all hosts have failed and the running result is not ok 44071 1727204724.59002: done checking to see if all hosts have failed 44071 1727204724.59003: getting the remaining hosts for this loop 44071 1727204724.59005: done getting the remaining hosts for this loop 44071 1727204724.59011: getting the next task for host managed-node2 44071 1727204724.59021: done getting next task for host managed-node2 44071 1727204724.59026: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204724.59035: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204724.59051: getting variables 44071 1727204724.59053: in VariableManager get_vars() 44071 1727204724.59511: Calling all_inventory to load vars for managed-node2 44071 1727204724.59514: Calling groups_inventory to load vars for managed-node2 44071 1727204724.59525: done sending task result for task 127b8e07-fff9-c964-7471-0000000021b6 44071 1727204724.59538: WORKER PROCESS EXITING 44071 1727204724.59531: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204724.59552: Calling all_plugins_play to load vars for managed-node2 44071 1727204724.59555: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204724.59558: Calling groups_plugins_play to load vars for managed-node2 44071 1727204724.80370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204724.82709: done with get_vars() 44071 1727204724.82756: done getting variables 44071 1727204724.82819: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.309) 0:02:17.145 ***** 44071 1727204724.82858: entering _queue_task() for managed-node2/debug 44071 1727204724.83502: worker is 1 (out of 1 available) 44071 1727204724.83514: exiting _queue_task() for managed-node2/debug 44071 1727204724.83526: done queuing things up, now waiting for results queue to drain 44071 1727204724.83527: waiting for pending results... 
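The two debug tasks just executed ("Show stderr messages for the network_connections" and "Show debug messages for the network_connections") print fields of the __network_connections_result fact that was registered when the connection profile was configured. Judging from the output keys in the results above (__network_connections_result.stderr_lines and the full __network_connections_result mapping), they amount to debug tasks of roughly the following form; any additional conditions or verbosity settings in the real role tasks are not visible in this log:

- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result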
44071 1727204724.83690: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204724.83919: in run() - task 127b8e07-fff9-c964-7471-0000000021b7 44071 1727204724.83948: variable 'ansible_search_path' from source: unknown 44071 1727204724.83959: variable 'ansible_search_path' from source: unknown 44071 1727204724.84014: calling self._execute() 44071 1727204724.84143: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204724.84160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204724.84180: variable 'omit' from source: magic vars 44071 1727204724.85274: variable 'ansible_distribution_major_version' from source: facts 44071 1727204724.85279: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204724.85572: variable 'network_state' from source: role '' defaults 44071 1727204724.85577: Evaluated conditional (network_state != {}): False 44071 1727204724.85579: when evaluation is False, skipping this task 44071 1727204724.85582: _execute() done 44071 1727204724.85585: dumping result to json 44071 1727204724.85591: done dumping result, returning 44071 1727204724.85594: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-c964-7471-0000000021b7] 44071 1727204724.85597: sending task result for task 127b8e07-fff9-c964-7471-0000000021b7 44071 1727204724.86311: done sending task result for task 127b8e07-fff9-c964-7471-0000000021b7 44071 1727204724.86316: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 44071 1727204724.86373: no more pending results, returning what we have 44071 1727204724.86377: results queue empty 44071 1727204724.86378: checking for any_errors_fatal 44071 1727204724.86388: done checking for any_errors_fatal 44071 1727204724.86389: checking for max_fail_percentage 44071 1727204724.86390: done checking for max_fail_percentage 44071 1727204724.86391: checking to see if all hosts have failed and the running result is not ok 44071 1727204724.86392: done checking to see if all hosts have failed 44071 1727204724.86393: getting the remaining hosts for this loop 44071 1727204724.86395: done getting the remaining hosts for this loop 44071 1727204724.86399: getting the next task for host managed-node2 44071 1727204724.86408: done getting next task for host managed-node2 44071 1727204724.86413: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204724.86419: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204724.86443: getting variables 44071 1727204724.86445: in VariableManager get_vars() 44071 1727204724.86495: Calling all_inventory to load vars for managed-node2 44071 1727204724.86498: Calling groups_inventory to load vars for managed-node2 44071 1727204724.86501: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204724.86515: Calling all_plugins_play to load vars for managed-node2 44071 1727204724.86518: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204724.86522: Calling groups_plugins_play to load vars for managed-node2 44071 1727204724.89826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204724.91996: done with get_vars() 44071 1727204724.92044: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.092) 0:02:17.237 ***** 44071 1727204724.92155: entering _queue_task() for managed-node2/ping 44071 1727204724.93102: worker is 1 (out of 1 available) 44071 1727204724.93122: exiting _queue_task() for managed-node2/ping 44071 1727204724.93141: done queuing things up, now waiting for results queue to drain 44071 1727204724.93143: waiting for pending results... 44071 1727204724.94032: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204724.94536: in run() - task 127b8e07-fff9-c964-7471-0000000021b8 44071 1727204724.94561: variable 'ansible_search_path' from source: unknown 44071 1727204724.94589: variable 'ansible_search_path' from source: unknown 44071 1727204724.94619: calling self._execute() 44071 1727204724.94773: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204724.94777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204724.94789: variable 'omit' from source: magic vars 44071 1727204724.95373: variable 'ansible_distribution_major_version' from source: facts 44071 1727204724.95377: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204724.95380: variable 'omit' from source: magic vars 44071 1727204724.95383: variable 'omit' from source: magic vars 44071 1727204724.95423: variable 'omit' from source: magic vars 44071 1727204724.95478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204724.95524: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204724.95554: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204724.95581: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204724.95604: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204724.95645: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204724.95656: variable 'ansible_host' from source: host vars for 
'managed-node2' 44071 1727204724.95664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204724.95811: Set connection var ansible_connection to ssh 44071 1727204724.95815: Set connection var ansible_timeout to 10 44071 1727204724.95818: Set connection var ansible_pipelining to False 44071 1727204724.95820: Set connection var ansible_shell_type to sh 44071 1727204724.95830: Set connection var ansible_shell_executable to /bin/sh 44071 1727204724.95846: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204724.95920: variable 'ansible_shell_executable' from source: unknown 44071 1727204724.95923: variable 'ansible_connection' from source: unknown 44071 1727204724.95926: variable 'ansible_module_compression' from source: unknown 44071 1727204724.95929: variable 'ansible_shell_type' from source: unknown 44071 1727204724.95931: variable 'ansible_shell_executable' from source: unknown 44071 1727204724.95935: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204724.95938: variable 'ansible_pipelining' from source: unknown 44071 1727204724.95939: variable 'ansible_timeout' from source: unknown 44071 1727204724.95941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204724.96161: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204724.96180: variable 'omit' from source: magic vars 44071 1727204724.96248: starting attempt loop 44071 1727204724.96251: running the handler 44071 1727204724.96253: _low_level_execute_command(): starting 44071 1727204724.96255: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204724.97445: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204724.97686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204724.97690: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204724.97718: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204724.97763: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204724.97839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204724.99632: stdout chunk (state=3): >>>/root <<< 44071 1727204724.99970: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204724.99974: stdout chunk (state=3): 
>>><<< 44071 1727204724.99976: stderr chunk (state=3): >>><<< 44071 1727204724.99979: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204724.99982: _low_level_execute_command(): starting 44071 1727204724.99986: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204724.9985785-51861-257254151068903 `" && echo ansible-tmp-1727204724.9985785-51861-257254151068903="` echo /root/.ansible/tmp/ansible-tmp-1727204724.9985785-51861-257254151068903 `" ) && sleep 0' 44071 1727204725.00844: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204725.00862: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204725.00880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204725.00931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204725.01012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204725.01042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204725.01068: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204725.01170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204725.03187: stdout chunk (state=3): >>>ansible-tmp-1727204724.9985785-51861-257254151068903=/root/.ansible/tmp/ansible-tmp-1727204724.9985785-51861-257254151068903 <<< 44071 1727204725.03428: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 44071 1727204725.03432: stdout chunk (state=3): >>><<< 44071 1727204725.03434: stderr chunk (state=3): >>><<< 44071 1727204725.03437: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204724.9985785-51861-257254151068903=/root/.ansible/tmp/ansible-tmp-1727204724.9985785-51861-257254151068903 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204725.03675: variable 'ansible_module_compression' from source: unknown 44071 1727204725.03679: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 44071 1727204725.03727: variable 'ansible_facts' from source: unknown 44071 1727204725.03845: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204724.9985785-51861-257254151068903/AnsiballZ_ping.py 44071 1727204725.04238: Sending initial data 44071 1727204725.04248: Sent initial data (153 bytes) 44071 1727204725.05770: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204725.05883: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204725.05917: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204725.05987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204725.07660: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204725.07724: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204725.07855: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpqp0bv7rf /root/.ansible/tmp/ansible-tmp-1727204724.9985785-51861-257254151068903/AnsiballZ_ping.py <<< 44071 1727204725.07859: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204724.9985785-51861-257254151068903/AnsiballZ_ping.py" <<< 44071 1727204725.07976: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpqp0bv7rf" to remote "/root/.ansible/tmp/ansible-tmp-1727204724.9985785-51861-257254151068903/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204724.9985785-51861-257254151068903/AnsiballZ_ping.py" <<< 44071 1727204725.09981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204725.09985: stdout chunk (state=3): >>><<< 44071 1727204725.10116: stderr chunk (state=3): >>><<< 44071 1727204725.10121: done transferring module to remote 44071 1727204725.10123: _low_level_execute_command(): starting 44071 1727204725.10126: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204724.9985785-51861-257254151068903/ /root/.ansible/tmp/ansible-tmp-1727204724.9985785-51861-257254151068903/AnsiballZ_ping.py && sleep 0' 44071 1727204725.11191: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204725.11349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204725.11411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204725.11483: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204725.11595: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 44071 1727204725.13625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204725.13630: stdout chunk (state=3): >>><<< 44071 1727204725.13633: stderr chunk (state=3): >>><<< 44071 1727204725.13673: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204725.13681: _low_level_execute_command(): starting 44071 1727204725.13684: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204724.9985785-51861-257254151068903/AnsiballZ_ping.py && sleep 0' 44071 1727204725.14976: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204725.15217: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204725.15221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204725.15224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204725.15227: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204725.15229: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204725.15231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204725.15233: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204725.15235: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204725.15239: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204725.15434: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204725.15759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
44071 1727204725.32535: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 44071 1727204725.33921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204725.33928: stderr chunk (state=3): >>>Shared connection to 10.31.47.73 closed. <<< 44071 1727204725.34104: stderr chunk (state=3): >>><<< 44071 1727204725.34109: stdout chunk (state=3): >>><<< 44071 1727204725.34112: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
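The {"ping": "pong"} payload that just came back on stdout belongs to the role's "Re-test connectivity" task at roles/network/tasks/main.yml:192, which the log runs through the plain ping module. As a task it is essentially the sketch below (illustrative, not copied from the role source):

- name: Re-test connectivity
  ping:

The surrounding entries show the standard per-module lifecycle for this one task: _low_level_execute_command() creates a remote temp directory under ~/.ansible/tmp, the AnsiballZ_ping.py payload is uploaded over SFTP, marked executable with chmod u+x, run with /usr/bin/python3.12, and finally the temp directory is removed, all multiplexed over the existing SSH ControlMaster socket at /root/.ansible/cp/7ef5e35320.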
44071 1727204725.34115: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204724.9985785-51861-257254151068903/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204725.34118: _low_level_execute_command(): starting 44071 1727204725.34121: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204724.9985785-51861-257254151068903/ > /dev/null 2>&1 && sleep 0' 44071 1727204725.34704: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204725.34708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204725.34720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204725.34736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204725.34756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204725.34760: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204725.34816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204725.34819: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204725.34822: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204725.34825: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44071 1727204725.34827: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204725.34830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204725.34833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204725.34835: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204725.34837: stderr chunk (state=3): >>>debug2: match found <<< 44071 1727204725.34857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204725.34950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204725.34954: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204725.34983: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204725.35081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204725.37171: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204725.37175: stdout chunk (state=3): >>><<< 44071 1727204725.37274: stderr chunk (state=3): >>><<< 44071 1727204725.37278: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204725.37286: handler run complete 44071 1727204725.37289: attempt loop complete, returning result 44071 1727204725.37291: _execute() done 44071 1727204725.37293: dumping result to json 44071 1727204725.37295: done dumping result, returning 44071 1727204725.37297: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-c964-7471-0000000021b8] 44071 1727204725.37300: sending task result for task 127b8e07-fff9-c964-7471-0000000021b8 44071 1727204725.37479: done sending task result for task 127b8e07-fff9-c964-7471-0000000021b8 44071 1727204725.37482: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 44071 1727204725.37549: no more pending results, returning what we have 44071 1727204725.37552: results queue empty 44071 1727204725.37553: checking for any_errors_fatal 44071 1727204725.37558: done checking for any_errors_fatal 44071 1727204725.37558: checking for max_fail_percentage 44071 1727204725.37560: done checking for max_fail_percentage 44071 1727204725.37561: checking to see if all hosts have failed and the running result is not ok 44071 1727204725.37562: done checking to see if all hosts have failed 44071 1727204725.37562: getting the remaining hosts for this loop 44071 1727204725.37564: done getting the remaining hosts for this loop 44071 1727204725.37571: getting the next task for host managed-node2 44071 1727204725.37581: done getting next task for host managed-node2 44071 1727204725.37584: ^ task is: TASK: meta (role_complete) 44071 1727204725.37589: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204725.37601: getting variables 44071 1727204725.37603: in VariableManager get_vars() 44071 1727204725.37647: Calling all_inventory to load vars for managed-node2 44071 1727204725.37650: Calling groups_inventory to load vars for managed-node2 44071 1727204725.37652: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204725.37662: Calling all_plugins_play to load vars for managed-node2 44071 1727204725.37664: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204725.37743: Calling groups_plugins_play to load vars for managed-node2 44071 1727204725.39423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204725.40651: done with get_vars() 44071 1727204725.40683: done getting variables 44071 1727204725.40755: done queuing things up, now waiting for results queue to drain 44071 1727204725.40757: results queue empty 44071 1727204725.40757: checking for any_errors_fatal 44071 1727204725.40760: done checking for any_errors_fatal 44071 1727204725.40760: checking for max_fail_percentage 44071 1727204725.40761: done checking for max_fail_percentage 44071 1727204725.40762: checking to see if all hosts have failed and the running result is not ok 44071 1727204725.40762: done checking to see if all hosts have failed 44071 1727204725.40763: getting the remaining hosts for this loop 44071 1727204725.40763: done getting the remaining hosts for this loop 44071 1727204725.40767: getting the next task for host managed-node2 44071 1727204725.40771: done getting next task for host managed-node2 44071 1727204725.40773: ^ task is: TASK: Show result 44071 1727204725.40775: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204725.40777: getting variables 44071 1727204725.40778: in VariableManager get_vars() 44071 1727204725.40788: Calling all_inventory to load vars for managed-node2 44071 1727204725.40790: Calling groups_inventory to load vars for managed-node2 44071 1727204725.40792: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204725.40796: Calling all_plugins_play to load vars for managed-node2 44071 1727204725.40797: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204725.40799: Calling groups_plugins_play to load vars for managed-node2 44071 1727204725.42401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204725.44911: done with get_vars() 44071 1727204725.44957: done getting variables 44071 1727204725.45047: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Tuesday 24 September 2024 15:05:25 -0400 (0:00:00.529) 0:02:17.767 ***** 44071 1727204725.45086: entering _queue_task() for managed-node2/debug 44071 1727204725.45553: worker is 1 (out of 1 available) 44071 1727204725.45571: exiting _queue_task() for managed-node2/debug 44071 1727204725.45584: done queuing things up, now waiting for results queue to drain 44071 1727204725.45586: waiting for pending results... 44071 1727204725.45907: running TaskExecutor() for managed-node2/TASK: Show result 44071 1727204725.46103: in run() - task 127b8e07-fff9-c964-7471-00000000213a 44071 1727204725.46107: variable 'ansible_search_path' from source: unknown 44071 1727204725.46110: variable 'ansible_search_path' from source: unknown 44071 1727204725.46202: calling self._execute() 44071 1727204725.46286: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204725.46298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204725.46322: variable 'omit' from source: magic vars 44071 1727204725.46718: variable 'ansible_distribution_major_version' from source: facts 44071 1727204725.46729: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204725.46735: variable 'omit' from source: magic vars 44071 1727204725.46782: variable 'omit' from source: magic vars 44071 1727204725.46810: variable 'omit' from source: magic vars 44071 1727204725.46873: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204725.46893: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204725.46911: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204725.46925: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204725.46937: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204725.46962: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 
1727204725.46967: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204725.46970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204725.47050: Set connection var ansible_connection to ssh 44071 1727204725.47056: Set connection var ansible_timeout to 10 44071 1727204725.47062: Set connection var ansible_pipelining to False 44071 1727204725.47068: Set connection var ansible_shell_type to sh 44071 1727204725.47080: Set connection var ansible_shell_executable to /bin/sh 44071 1727204725.47083: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204725.47103: variable 'ansible_shell_executable' from source: unknown 44071 1727204725.47106: variable 'ansible_connection' from source: unknown 44071 1727204725.47109: variable 'ansible_module_compression' from source: unknown 44071 1727204725.47112: variable 'ansible_shell_type' from source: unknown 44071 1727204725.47114: variable 'ansible_shell_executable' from source: unknown 44071 1727204725.47117: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204725.47121: variable 'ansible_pipelining' from source: unknown 44071 1727204725.47123: variable 'ansible_timeout' from source: unknown 44071 1727204725.47128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204725.47250: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204725.47259: variable 'omit' from source: magic vars 44071 1727204725.47264: starting attempt loop 44071 1727204725.47270: running the handler 44071 1727204725.47311: variable '__network_connections_result' from source: set_fact 44071 1727204725.47379: variable '__network_connections_result' from source: set_fact 44071 1727204725.47474: handler run complete 44071 1727204725.47494: attempt loop complete, returning result 44071 1727204725.47498: _execute() done 44071 1727204725.47501: dumping result to json 44071 1727204725.47506: done dumping result, returning 44071 1727204725.47514: done running TaskExecutor() for managed-node2/TASK: Show result [127b8e07-fff9-c964-7471-00000000213a] 44071 1727204725.47517: sending task result for task 127b8e07-fff9-c964-7471-00000000213a 44071 1727204725.47629: done sending task result for task 127b8e07-fff9-c964-7471-00000000213a 44071 1727204725.47632: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, bc2e78b9-9d7f-4720-aaef-6b1a6ee99c01\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, bc2e78b9-9d7f-4720-aaef-6b1a6ee99c01" ] } } 44071 1727204725.47718: no more pending results, returning what we have 44071 1727204725.47721: results queue empty 44071 1727204725.47722: checking for any_errors_fatal 44071 1727204725.47724: done checking for any_errors_fatal 44071 
1727204725.47725: checking for max_fail_percentage 44071 1727204725.47726: done checking for max_fail_percentage 44071 1727204725.47727: checking to see if all hosts have failed and the running result is not ok 44071 1727204725.47728: done checking to see if all hosts have failed 44071 1727204725.47729: getting the remaining hosts for this loop 44071 1727204725.47730: done getting the remaining hosts for this loop 44071 1727204725.47735: getting the next task for host managed-node2 44071 1727204725.47758: done getting next task for host managed-node2 44071 1727204725.47762: ^ task is: TASK: Include network role 44071 1727204725.47767: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204725.47771: getting variables 44071 1727204725.47772: in VariableManager get_vars() 44071 1727204725.47812: Calling all_inventory to load vars for managed-node2 44071 1727204725.47814: Calling groups_inventory to load vars for managed-node2 44071 1727204725.47818: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204725.47830: Calling all_plugins_play to load vars for managed-node2 44071 1727204725.47833: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204725.47836: Calling groups_plugins_play to load vars for managed-node2 44071 1727204725.49025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204725.50691: done with get_vars() 44071 1727204725.50711: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Tuesday 24 September 2024 15:05:25 -0400 (0:00:00.057) 0:02:17.824 ***** 44071 1727204725.50795: entering _queue_task() for managed-node2/include_role 44071 1727204725.51095: worker is 1 (out of 1 available) 44071 1727204725.51109: exiting _queue_task() for managed-node2/include_role 44071 1727204725.51123: done queuing things up, now waiting for results queue to drain 44071 1727204725.51125: waiting for pending results... 
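The two tasks traced just above can be reconstructed from the log itself: "Show result" at create_bridge_profile.yml:14 is a debug of the registered role output (__network_connections_result, set by set_fact), and "Include network role" at activate_profile.yml:3 pulls fedora.linux_system_roles.network back in. A minimal sketch of what those task files plausibly contain, not the verbatim test sources:

# create_bridge_profile.yml (sketch)
- name: Show result
  ansible.builtin.debug:
    var: __network_connections_result

# activate_profile.yml (sketch)
- name: Include network role
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network
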
44071 1727204725.51329: running TaskExecutor() for managed-node2/TASK: Include network role 44071 1727204725.51443: in run() - task 127b8e07-fff9-c964-7471-00000000213e 44071 1727204725.51457: variable 'ansible_search_path' from source: unknown 44071 1727204725.51461: variable 'ansible_search_path' from source: unknown 44071 1727204725.51495: calling self._execute() 44071 1727204725.51586: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204725.51592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204725.51601: variable 'omit' from source: magic vars 44071 1727204725.51925: variable 'ansible_distribution_major_version' from source: facts 44071 1727204725.51940: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204725.51946: _execute() done 44071 1727204725.51949: dumping result to json 44071 1727204725.51954: done dumping result, returning 44071 1727204725.51961: done running TaskExecutor() for managed-node2/TASK: Include network role [127b8e07-fff9-c964-7471-00000000213e] 44071 1727204725.51964: sending task result for task 127b8e07-fff9-c964-7471-00000000213e 44071 1727204725.52091: done sending task result for task 127b8e07-fff9-c964-7471-00000000213e 44071 1727204725.52095: WORKER PROCESS EXITING 44071 1727204725.52128: no more pending results, returning what we have 44071 1727204725.52134: in VariableManager get_vars() 44071 1727204725.52189: Calling all_inventory to load vars for managed-node2 44071 1727204725.52193: Calling groups_inventory to load vars for managed-node2 44071 1727204725.52196: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204725.52216: Calling all_plugins_play to load vars for managed-node2 44071 1727204725.52219: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204725.52222: Calling groups_plugins_play to load vars for managed-node2 44071 1727204725.53283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204725.54520: done with get_vars() 44071 1727204725.54550: variable 'ansible_search_path' from source: unknown 44071 1727204725.54551: variable 'ansible_search_path' from source: unknown 44071 1727204725.54660: variable 'omit' from source: magic vars 44071 1727204725.54694: variable 'omit' from source: magic vars 44071 1727204725.54705: variable 'omit' from source: magic vars 44071 1727204725.54708: we have included files to process 44071 1727204725.54709: generating all_blocks data 44071 1727204725.54711: done generating all_blocks data 44071 1727204725.54716: processing included file: fedora.linux_system_roles.network 44071 1727204725.54732: in VariableManager get_vars() 44071 1727204725.54745: done with get_vars() 44071 1727204725.54768: in VariableManager get_vars() 44071 1727204725.54782: done with get_vars() 44071 1727204725.54813: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 44071 1727204725.54911: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 44071 1727204725.54969: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 44071 1727204725.55287: in VariableManager get_vars() 44071 1727204725.55303: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204725.56943: iterating over new_blocks loaded from 
include file 44071 1727204725.56946: in VariableManager get_vars() 44071 1727204725.56969: done with get_vars() 44071 1727204725.56970: filtering new block on tags 44071 1727204725.57241: done filtering new block on tags 44071 1727204725.57245: in VariableManager get_vars() 44071 1727204725.57262: done with get_vars() 44071 1727204725.57264: filtering new block on tags 44071 1727204725.57282: done filtering new block on tags 44071 1727204725.57284: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 44071 1727204725.57290: extending task lists for all hosts with included blocks 44071 1727204725.57397: done extending task lists 44071 1727204725.57398: done processing included files 44071 1727204725.57399: results queue empty 44071 1727204725.57400: checking for any_errors_fatal 44071 1727204725.57405: done checking for any_errors_fatal 44071 1727204725.57406: checking for max_fail_percentage 44071 1727204725.57407: done checking for max_fail_percentage 44071 1727204725.57408: checking to see if all hosts have failed and the running result is not ok 44071 1727204725.57408: done checking to see if all hosts have failed 44071 1727204725.57409: getting the remaining hosts for this loop 44071 1727204725.57410: done getting the remaining hosts for this loop 44071 1727204725.57413: getting the next task for host managed-node2 44071 1727204725.57417: done getting next task for host managed-node2 44071 1727204725.57419: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204725.57423: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204725.57434: getting variables 44071 1727204725.57435: in VariableManager get_vars() 44071 1727204725.57448: Calling all_inventory to load vars for managed-node2 44071 1727204725.57451: Calling groups_inventory to load vars for managed-node2 44071 1727204725.57453: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204725.57458: Calling all_plugins_play to load vars for managed-node2 44071 1727204725.57461: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204725.57463: Calling groups_plugins_play to load vars for managed-node2 44071 1727204725.58615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204725.59850: done with get_vars() 44071 1727204725.59881: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:05:25 -0400 (0:00:00.091) 0:02:17.915 ***** 44071 1727204725.59949: entering _queue_task() for managed-node2/include_tasks 44071 1727204725.60256: worker is 1 (out of 1 available) 44071 1727204725.60273: exiting _queue_task() for managed-node2/include_tasks 44071 1727204725.60289: done queuing things up, now waiting for results queue to drain 44071 1727204725.60291: waiting for pending results... 44071 1727204725.60504: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204725.60605: in run() - task 127b8e07-fff9-c964-7471-000000002328 44071 1727204725.60621: variable 'ansible_search_path' from source: unknown 44071 1727204725.60624: variable 'ansible_search_path' from source: unknown 44071 1727204725.60659: calling self._execute() 44071 1727204725.60746: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204725.60753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204725.60769: variable 'omit' from source: magic vars 44071 1727204725.61372: variable 'ansible_distribution_major_version' from source: facts 44071 1727204725.61376: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204725.61379: _execute() done 44071 1727204725.61381: dumping result to json 44071 1727204725.61384: done dumping result, returning 44071 1727204725.61387: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-c964-7471-000000002328] 44071 1727204725.61390: sending task result for task 127b8e07-fff9-c964-7471-000000002328 44071 1727204725.61541: no more pending results, returning what we have 44071 1727204725.61547: in VariableManager get_vars() 44071 1727204725.61806: Calling all_inventory to load vars for managed-node2 44071 1727204725.61810: Calling groups_inventory to load vars for managed-node2 44071 1727204725.61812: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204725.61829: Calling all_plugins_play to load vars for managed-node2 44071 1727204725.61833: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204725.61838: Calling groups_plugins_play to load vars for managed-node2 44071 1727204725.62384: done sending task result for task 127b8e07-fff9-c964-7471-000000002328 44071 1727204725.62387: WORKER PROCESS EXITING 44071 1727204725.63178: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204725.64649: done with get_vars() 44071 1727204725.64845: variable 'ansible_search_path' from source: unknown 44071 1727204725.64847: variable 'ansible_search_path' from source: unknown 44071 1727204725.64896: we have included files to process 44071 1727204725.64897: generating all_blocks data 44071 1727204725.64899: done generating all_blocks data 44071 1727204725.64902: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204725.64904: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204725.64907: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204725.65526: done processing included file 44071 1727204725.65528: iterating over new_blocks loaded from include file 44071 1727204725.65530: in VariableManager get_vars() 44071 1727204725.65562: done with get_vars() 44071 1727204725.65564: filtering new block on tags 44071 1727204725.65600: done filtering new block on tags 44071 1727204725.65603: in VariableManager get_vars() 44071 1727204725.65630: done with get_vars() 44071 1727204725.65632: filtering new block on tags 44071 1727204725.65691: done filtering new block on tags 44071 1727204725.65694: in VariableManager get_vars() 44071 1727204725.65721: done with get_vars() 44071 1727204725.65723: filtering new block on tags 44071 1727204725.65770: done filtering new block on tags 44071 1727204725.65772: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 44071 1727204725.65778: extending task lists for all hosts with included blocks 44071 1727204725.67672: done extending task lists 44071 1727204725.67675: done processing included files 44071 1727204725.67676: results queue empty 44071 1727204725.67676: checking for any_errors_fatal 44071 1727204725.67680: done checking for any_errors_fatal 44071 1727204725.67681: checking for max_fail_percentage 44071 1727204725.67682: done checking for max_fail_percentage 44071 1727204725.67683: checking to see if all hosts have failed and the running result is not ok 44071 1727204725.67684: done checking to see if all hosts have failed 44071 1727204725.67685: getting the remaining hosts for this loop 44071 1727204725.67686: done getting the remaining hosts for this loop 44071 1727204725.67689: getting the next task for host managed-node2 44071 1727204725.67695: done getting next task for host managed-node2 44071 1727204725.67698: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204725.67703: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204725.67717: getting variables 44071 1727204725.67718: in VariableManager get_vars() 44071 1727204725.67738: Calling all_inventory to load vars for managed-node2 44071 1727204725.67741: Calling groups_inventory to load vars for managed-node2 44071 1727204725.67743: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204725.67749: Calling all_plugins_play to load vars for managed-node2 44071 1727204725.67752: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204725.67755: Calling groups_plugins_play to load vars for managed-node2 44071 1727204725.69373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204725.71715: done with get_vars() 44071 1727204725.71751: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:05:25 -0400 (0:00:00.118) 0:02:18.034 ***** 44071 1727204725.71847: entering _queue_task() for managed-node2/setup 44071 1727204725.72299: worker is 1 (out of 1 available) 44071 1727204725.72316: exiting _queue_task() for managed-node2/setup 44071 1727204725.72331: done queuing things up, now waiting for results queue to drain 44071 1727204725.72333: waiting for pending results... 
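The "Ensure ansible_facts used by role are present" task queued here is a fact-gathering guard: as the conditional evaluated a few entries further down shows, it only runs setup when __network_required_facts names facts that are missing from ansible_facts. A hedged sketch of such a task; the gather_subset value is an assumption, while the when expression and no_log behaviour are taken from the log:

- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
    gather_subset: min   # assumption: a minimal subset is typical for this kind of guard
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true           # matches the censored skip result reported below

In this run the difference is empty, so the task is skipped and no extra fact gathering happens.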
44071 1727204725.72593: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204725.72872: in run() - task 127b8e07-fff9-c964-7471-00000000237f 44071 1727204725.72877: variable 'ansible_search_path' from source: unknown 44071 1727204725.72880: variable 'ansible_search_path' from source: unknown 44071 1727204725.72883: calling self._execute() 44071 1727204725.72959: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204725.72976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204725.72991: variable 'omit' from source: magic vars 44071 1727204725.73410: variable 'ansible_distribution_major_version' from source: facts 44071 1727204725.73435: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204725.73677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204725.78273: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204725.78352: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204725.78471: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204725.78475: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204725.78478: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204725.78576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204725.78611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204725.78640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204725.78684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204725.78703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204725.78764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204725.78788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204725.78846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204725.78889: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204725.79068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204725.79115: variable '__network_required_facts' from source: role '' defaults 44071 1727204725.79128: variable 'ansible_facts' from source: unknown 44071 1727204725.80511: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 44071 1727204725.80517: when evaluation is False, skipping this task 44071 1727204725.80520: _execute() done 44071 1727204725.80522: dumping result to json 44071 1727204725.80525: done dumping result, returning 44071 1727204725.80527: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-c964-7471-00000000237f] 44071 1727204725.80572: sending task result for task 127b8e07-fff9-c964-7471-00000000237f 44071 1727204725.80647: done sending task result for task 127b8e07-fff9-c964-7471-00000000237f 44071 1727204725.80651: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204725.80713: no more pending results, returning what we have 44071 1727204725.80717: results queue empty 44071 1727204725.80718: checking for any_errors_fatal 44071 1727204725.80720: done checking for any_errors_fatal 44071 1727204725.80721: checking for max_fail_percentage 44071 1727204725.80723: done checking for max_fail_percentage 44071 1727204725.80724: checking to see if all hosts have failed and the running result is not ok 44071 1727204725.80725: done checking to see if all hosts have failed 44071 1727204725.80725: getting the remaining hosts for this loop 44071 1727204725.80727: done getting the remaining hosts for this loop 44071 1727204725.80735: getting the next task for host managed-node2 44071 1727204725.80747: done getting next task for host managed-node2 44071 1727204725.80752: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204725.80759: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204725.80799: getting variables 44071 1727204725.80802: in VariableManager get_vars() 44071 1727204725.80858: Calling all_inventory to load vars for managed-node2 44071 1727204725.80862: Calling groups_inventory to load vars for managed-node2 44071 1727204725.80864: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204725.81082: Calling all_plugins_play to load vars for managed-node2 44071 1727204725.81086: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204725.81096: Calling groups_plugins_play to load vars for managed-node2 44071 1727204725.83723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204725.87372: done with get_vars() 44071 1727204725.87416: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:05:25 -0400 (0:00:00.156) 0:02:18.191 ***** 44071 1727204725.87539: entering _queue_task() for managed-node2/stat 44071 1727204725.88145: worker is 1 (out of 1 available) 44071 1727204725.88163: exiting _queue_task() for managed-node2/stat 44071 1727204725.88182: done queuing things up, now waiting for results queue to drain 44071 1727204725.88184: waiting for pending results... 44071 1727204725.88910: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204725.89276: in run() - task 127b8e07-fff9-c964-7471-000000002381 44071 1727204725.89375: variable 'ansible_search_path' from source: unknown 44071 1727204725.89379: variable 'ansible_search_path' from source: unknown 44071 1727204725.89387: calling self._execute() 44071 1727204725.89482: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204725.89573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204725.89577: variable 'omit' from source: magic vars 44071 1727204725.90108: variable 'ansible_distribution_major_version' from source: facts 44071 1727204725.90120: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204725.90476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204725.90811: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204725.90873: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204725.90908: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204725.90945: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204725.91049: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204725.91077: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204725.91104: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204725.91140: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204725.91346: variable '__network_is_ostree' from source: set_fact 44071 1727204725.91349: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204725.91353: when evaluation is False, skipping this task 44071 1727204725.91356: _execute() done 44071 1727204725.91358: dumping result to json 44071 1727204725.91361: done dumping result, returning 44071 1727204725.91370: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-c964-7471-000000002381] 44071 1727204725.91374: sending task result for task 127b8e07-fff9-c964-7471-000000002381 44071 1727204725.91460: done sending task result for task 127b8e07-fff9-c964-7471-000000002381 44071 1727204725.91464: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204725.91529: no more pending results, returning what we have 44071 1727204725.91535: results queue empty 44071 1727204725.91537: checking for any_errors_fatal 44071 1727204725.91549: done checking for any_errors_fatal 44071 1727204725.91550: checking for max_fail_percentage 44071 1727204725.91552: done checking for max_fail_percentage 44071 1727204725.91553: checking to see if all hosts have failed and the running result is not ok 44071 1727204725.91554: done checking to see if all hosts have failed 44071 1727204725.91555: getting the remaining hosts for this loop 44071 1727204725.91556: done getting the remaining hosts for this loop 44071 1727204725.91562: getting the next task for host managed-node2 44071 1727204725.91575: done getting next task for host managed-node2 44071 1727204725.91579: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204725.91587: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204725.91614: getting variables 44071 1727204725.91616: in VariableManager get_vars() 44071 1727204725.92012: Calling all_inventory to load vars for managed-node2 44071 1727204725.92017: Calling groups_inventory to load vars for managed-node2 44071 1727204725.92020: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204725.92032: Calling all_plugins_play to load vars for managed-node2 44071 1727204725.92038: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204725.92042: Calling groups_plugins_play to load vars for managed-node2 44071 1727204725.96338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204726.01264: done with get_vars() 44071 1727204726.01725: done getting variables 44071 1727204726.01803: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:05:26 -0400 (0:00:00.143) 0:02:18.334 ***** 44071 1727204726.01854: entering _queue_task() for managed-node2/set_fact 44071 1727204726.02418: worker is 1 (out of 1 available) 44071 1727204726.02435: exiting _queue_task() for managed-node2/set_fact 44071 1727204726.02450: done queuing things up, now waiting for results queue to drain 44071 1727204726.02452: waiting for pending results... 
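The "Check if system is ostree" stat task just skipped and the "Set flag to indicate system is ostree" set_fact task queued here form a cache-once pattern: both are guarded by "not __network_is_ostree is defined", so once the flag exists (as it already does in this run) neither does any work. A sketch under that reading; the stat path and the intermediate register name are assumptions, not shown in the log:

- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted        # assumed marker file
  register: __ostree_booted_stat    # hypothetical register name
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
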
44071 1727204726.03026: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204726.03032: in run() - task 127b8e07-fff9-c964-7471-000000002382 44071 1727204726.03036: variable 'ansible_search_path' from source: unknown 44071 1727204726.03039: variable 'ansible_search_path' from source: unknown 44071 1727204726.03086: calling self._execute() 44071 1727204726.03229: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204726.03234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204726.03237: variable 'omit' from source: magic vars 44071 1727204726.03675: variable 'ansible_distribution_major_version' from source: facts 44071 1727204726.03689: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204726.03894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204726.04403: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204726.04408: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204726.04411: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204726.04444: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204726.04673: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204726.04702: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204726.04729: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204726.04758: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204726.04872: variable '__network_is_ostree' from source: set_fact 44071 1727204726.04880: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204726.04883: when evaluation is False, skipping this task 44071 1727204726.04886: _execute() done 44071 1727204726.04896: dumping result to json 44071 1727204726.04899: done dumping result, returning 44071 1727204726.04911: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-c964-7471-000000002382] 44071 1727204726.04914: sending task result for task 127b8e07-fff9-c964-7471-000000002382 skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204726.05082: no more pending results, returning what we have 44071 1727204726.05087: results queue empty 44071 1727204726.05088: checking for any_errors_fatal 44071 1727204726.05099: done checking for any_errors_fatal 44071 1727204726.05100: checking for max_fail_percentage 44071 1727204726.05102: done checking for max_fail_percentage 44071 1727204726.05103: checking to see 
if all hosts have failed and the running result is not ok 44071 1727204726.05104: done checking to see if all hosts have failed 44071 1727204726.05104: getting the remaining hosts for this loop 44071 1727204726.05107: done getting the remaining hosts for this loop 44071 1727204726.05113: getting the next task for host managed-node2 44071 1727204726.05126: done getting next task for host managed-node2 44071 1727204726.05131: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204726.05141: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204726.05173: getting variables 44071 1727204726.05175: in VariableManager get_vars() 44071 1727204726.05241: Calling all_inventory to load vars for managed-node2 44071 1727204726.05244: Calling groups_inventory to load vars for managed-node2 44071 1727204726.05247: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204726.05260: Calling all_plugins_play to load vars for managed-node2 44071 1727204726.05263: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204726.05378: done sending task result for task 127b8e07-fff9-c964-7471-000000002382 44071 1727204726.05382: WORKER PROCESS EXITING 44071 1727204726.05572: Calling groups_plugins_play to load vars for managed-node2 44071 1727204726.07676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204726.10923: done with get_vars() 44071 1727204726.11175: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:05:26 -0400 (0:00:00.094) 0:02:18.429 ***** 44071 1727204726.11298: entering _queue_task() for managed-node2/service_facts 44071 1727204726.12153: worker is 1 (out of 1 available) 44071 1727204726.12237: exiting _queue_task() for managed-node2/service_facts 44071 1727204726.12253: done queuing things up, now waiting for results queue to drain 44071 1727204726.12255: waiting for pending results... 
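The "Check which services are running" task queued next maps to the service_facts module (managed-node2/service_facts above), which populates ansible_facts.services, presumably so the role can later see which provider services (NetworkManager, initscripts and so on) are actually active. A minimal sketch; any no_log or check-mode handling in the real role task is not visible in the log:

- name: Check which services are running
  ansible.builtin.service_facts:

The entries that follow show the usual remote execution round trip for this module: create a temp dir on the target, transfer the AnsiballZ payload over SFTP, run it, then clean up.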
44071 1727204726.12543: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204726.12951: in run() - task 127b8e07-fff9-c964-7471-000000002384 44071 1727204726.13262: variable 'ansible_search_path' from source: unknown 44071 1727204726.13271: variable 'ansible_search_path' from source: unknown 44071 1727204726.13275: calling self._execute() 44071 1727204726.13279: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204726.13282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204726.13487: variable 'omit' from source: magic vars 44071 1727204726.14443: variable 'ansible_distribution_major_version' from source: facts 44071 1727204726.14602: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204726.14785: variable 'omit' from source: magic vars 44071 1727204726.14790: variable 'omit' from source: magic vars 44071 1727204726.14793: variable 'omit' from source: magic vars 44071 1727204726.14919: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204726.14964: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204726.15004: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204726.15030: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204726.15052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204726.15093: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204726.15107: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204726.15117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204726.15257: Set connection var ansible_connection to ssh 44071 1727204726.15274: Set connection var ansible_timeout to 10 44071 1727204726.15286: Set connection var ansible_pipelining to False 44071 1727204726.15295: Set connection var ansible_shell_type to sh 44071 1727204726.15306: Set connection var ansible_shell_executable to /bin/sh 44071 1727204726.15318: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204726.15355: variable 'ansible_shell_executable' from source: unknown 44071 1727204726.15363: variable 'ansible_connection' from source: unknown 44071 1727204726.15374: variable 'ansible_module_compression' from source: unknown 44071 1727204726.15436: variable 'ansible_shell_type' from source: unknown 44071 1727204726.15439: variable 'ansible_shell_executable' from source: unknown 44071 1727204726.15442: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204726.15444: variable 'ansible_pipelining' from source: unknown 44071 1727204726.15446: variable 'ansible_timeout' from source: unknown 44071 1727204726.15449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204726.15655: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204726.15677: variable 'omit' from source: magic vars 44071 
1727204726.15687: starting attempt loop 44071 1727204726.15694: running the handler 44071 1727204726.15713: _low_level_execute_command(): starting 44071 1727204726.15726: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204726.16525: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204726.16636: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204726.16655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204726.16684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204726.16702: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204726.16727: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204726.16868: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204726.18631: stdout chunk (state=3): >>>/root <<< 44071 1727204726.18788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204726.18849: stderr chunk (state=3): >>><<< 44071 1727204726.18864: stdout chunk (state=3): >>><<< 44071 1727204726.18896: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204726.19016: _low_level_execute_command(): starting 44071 1727204726.19021: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204726.1890757-52037-250602423252393 
`" && echo ansible-tmp-1727204726.1890757-52037-250602423252393="` echo /root/.ansible/tmp/ansible-tmp-1727204726.1890757-52037-250602423252393 `" ) && sleep 0' 44071 1727204726.19617: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204726.19632: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204726.19646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204726.19663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204726.19681: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204726.19705: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204726.19783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204726.19824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204726.19840: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204726.19867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204726.19980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204726.21988: stdout chunk (state=3): >>>ansible-tmp-1727204726.1890757-52037-250602423252393=/root/.ansible/tmp/ansible-tmp-1727204726.1890757-52037-250602423252393 <<< 44071 1727204726.22215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204726.22219: stdout chunk (state=3): >>><<< 44071 1727204726.22222: stderr chunk (state=3): >>><<< 44071 1727204726.22242: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204726.1890757-52037-250602423252393=/root/.ansible/tmp/ansible-tmp-1727204726.1890757-52037-250602423252393 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204726.22322: variable 'ansible_module_compression' from source: unknown 44071 1727204726.22375: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 44071 1727204726.22536: variable 'ansible_facts' from source: unknown 44071 1727204726.22539: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204726.1890757-52037-250602423252393/AnsiballZ_service_facts.py 44071 1727204726.22800: Sending initial data 44071 1727204726.22814: Sent initial data (162 bytes) 44071 1727204726.23492: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204726.23543: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204726.23547: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204726.23572: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204726.23674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204726.25331: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 44071 1727204726.25377: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204726.25432: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204726.25521: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpxoil07mh /root/.ansible/tmp/ansible-tmp-1727204726.1890757-52037-250602423252393/AnsiballZ_service_facts.py <<< 44071 1727204726.25524: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204726.1890757-52037-250602423252393/AnsiballZ_service_facts.py" <<< 44071 1727204726.25610: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpxoil07mh" to remote "/root/.ansible/tmp/ansible-tmp-1727204726.1890757-52037-250602423252393/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204726.1890757-52037-250602423252393/AnsiballZ_service_facts.py" <<< 44071 1727204726.26473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204726.26714: stderr chunk (state=3): >>><<< 44071 1727204726.26718: stdout chunk (state=3): >>><<< 44071 1727204726.26720: done transferring module to remote 44071 1727204726.26723: _low_level_execute_command(): starting 44071 1727204726.26726: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204726.1890757-52037-250602423252393/ /root/.ansible/tmp/ansible-tmp-1727204726.1890757-52037-250602423252393/AnsiballZ_service_facts.py && sleep 0' 44071 1727204726.27343: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204726.27361: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204726.27379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204726.27400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204726.27509: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204726.27547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204726.27656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204726.29602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204726.29632: stdout chunk (state=3): >>><<< 44071 1727204726.29636: stderr chunk (state=3): >>><<< 44071 1727204726.29672: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 
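The records above and immediately below show the standard three-step module dispatch: the AnsiballZ_service_facts.py payload (rebuilt from the cached ansible.modules.service_facts-ZIP_DEFLATED entry) is uploaded with sftp over the existing ControlMaster socket, the scratch directory and payload are marked executable with chmod u+x, and the script is then executed with the remote /usr/bin/python3.12 in the next records. A rough standalone reproduction of that put / chmod / execute sequence is sketched below using the plain OpenSSH client tools; the ControlPath and interpreter path are taken from the log, while the host string, file names, and helper name are illustrative.

    # Sketch only: reproduce the upload / chmod / execute pattern from the log.
    # Host alias, local file name, and function name are illustrative.
    import subprocess

    HOST = "root@10.31.47.73"
    MUX = "-oControlPath=/root/.ansible/cp/7ef5e35320"   # reuse the master shown above

    def put_and_run(local_py, remote_dir):
        remote_py = "%s/AnsiballZ_service_facts.py" % remote_dir
        # 1. Upload the module payload (batch-mode sftp reads commands from stdin).
        subprocess.run(["sftp", MUX, "-b", "-", HOST],
                       input=("put %s %s\n" % (local_py, remote_py)).encode(),
                       check=True)
        # 2. Make the scratch directory and payload executable for the remote user.
        subprocess.run(["ssh", MUX, HOST,
                        "chmod u+x %s/ %s" % (remote_dir, remote_py)],
                       check=True)
        # 3. Run it with the remote interpreter; the module prints its JSON result on stdout.
        result = subprocess.run(["ssh", MUX, HOST,
                                 "/usr/bin/python3.12 %s && sleep 0" % remote_py],
                                check=True, capture_output=True)
        return result.stdout.decode()

The JSON that such a run returns is what appears in the stdout chunks that follow: an ansible_facts.services mapping keyed by unit name, each entry carrying name, state, status, and source.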
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204726.29676: _low_level_execute_command(): starting 44071 1727204726.29679: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204726.1890757-52037-250602423252393/AnsiballZ_service_facts.py && sleep 0' 44071 1727204726.30373: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204726.30393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204726.30409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204726.30484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204726.30546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204726.30568: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204726.30595: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204726.30710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204728.51256: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "s<<< 44071 1727204728.51299: stdout chunk (state=3): >>>topped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": 
"rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": 
"systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": 
"autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.fre<<< 44071 1727204728.51353: stdout chunk (state=3): >>>edesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": 
"polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 44071 1727204728.52955: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204728.52980: stderr chunk (state=3): >>>Shared connection to 10.31.47.73 closed. <<< 44071 1727204728.53173: stderr chunk (state=3): >>><<< 44071 1727204728.53177: stdout chunk (state=3): >>><<< 44071 1727204728.53184: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": 
"dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": 
{"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": 
"dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204728.54058: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204726.1890757-52037-250602423252393/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204728.54083: _low_level_execute_command(): starting 44071 1727204728.54095: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204726.1890757-52037-250602423252393/ > /dev/null 2>&1 && sleep 0' 44071 1727204728.54590: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204728.54605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204728.54617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204728.54658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204728.54675: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204728.54756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204728.56667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204728.56872: stderr chunk (state=3): >>><<< 44071 1727204728.56876: stdout chunk (state=3): >>><<< 44071 1727204728.56879: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204728.56882: handler run complete 44071 1727204728.57021: variable 'ansible_facts' from source: unknown 44071 1727204728.57241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204728.57629: variable 'ansible_facts' from source: unknown 44071 1727204728.57731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204728.57887: attempt loop complete, returning result 44071 1727204728.57892: _execute() done 44071 1727204728.57896: dumping result to json 44071 1727204728.57938: done dumping result, returning 44071 1727204728.57948: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-c964-7471-000000002384] 44071 1727204728.57952: sending task result for task 127b8e07-fff9-c964-7471-000000002384 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204728.58751: no more pending results, returning what we have 44071 1727204728.58754: results queue empty 44071 1727204728.58755: checking for any_errors_fatal 44071 1727204728.58760: done checking for any_errors_fatal 44071 1727204728.58760: checking for max_fail_percentage 44071 1727204728.58762: done checking for max_fail_percentage 44071 1727204728.58763: checking to see if all hosts have failed and the running result is not ok 44071 1727204728.58764: done checking to see if all hosts have failed 44071 1727204728.58765: getting the remaining hosts for this loop 44071 1727204728.58771: done getting the remaining hosts for this loop 44071 1727204728.58775: getting the next task for host managed-node2 44071 1727204728.58783: done getting next task for host managed-node2 44071 1727204728.58786: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204728.58792: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204728.58802: done sending task result for task 127b8e07-fff9-c964-7471-000000002384 44071 1727204728.58805: WORKER PROCESS EXITING 44071 1727204728.58812: getting variables 44071 1727204728.58813: in VariableManager get_vars() 44071 1727204728.58844: Calling all_inventory to load vars for managed-node2 44071 1727204728.58846: Calling groups_inventory to load vars for managed-node2 44071 1727204728.58848: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204728.58856: Calling all_plugins_play to load vars for managed-node2 44071 1727204728.58857: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204728.58859: Calling groups_plugins_play to load vars for managed-node2 44071 1727204728.59922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204728.61147: done with get_vars() 44071 1727204728.61183: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:05:28 -0400 (0:00:02.499) 0:02:20.929 ***** 44071 1727204728.61271: entering _queue_task() for managed-node2/package_facts 44071 1727204728.61579: worker is 1 (out of 1 available) 44071 1727204728.61594: exiting _queue_task() for managed-node2/package_facts 44071 1727204728.61610: done queuing things up, now waiting for results queue to drain 44071 1727204728.61612: waiting for pending results... 
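The two tasks traced in this part of the log, "Check which services are running" and "Check which packages are installed", come from the fedora.linux_system_roles.network role's set_facts.yml and wrap the ansible.builtin.service_facts and ansible.builtin.package_facts modules; the first result shows up as "censored" above because no_log was in effect for it. A minimal standalone playbook that gathers the same facts could look like the sketch below; the file name, the host pattern, and the hard-coded no_log: true are illustrative assumptions, not taken from this run.

# check_facts.yml - hypothetical sketch, not part of the traced role
- name: Gather service and package facts the same way the traced tasks do
  hosts: managed-node2          # assumption: reuses the inventory host name seen in this log
  gather_facts: false
  tasks:
    - name: Check which services are running
      ansible.builtin.service_facts:
      no_log: true              # assumption: mirrors the censored result shown above

    - name: Check which packages are installed
      ansible.builtin.package_facts:

    - name: Show how many services and packages were reported
      ansible.builtin.debug:
        msg: "{{ ansible_facts.services | length }} services, {{ ansible_facts.packages | length }} packages"

Run with something like ansible-playbook -i <inventory> check_facts.yml; after both modules return, ansible_facts.services and ansible_facts.packages are available to later tasks on the same host, which is roughly how the role makes the gathered facts usable further down its task list.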
44071 1727204728.61845: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204728.61977: in run() - task 127b8e07-fff9-c964-7471-000000002385 44071 1727204728.61992: variable 'ansible_search_path' from source: unknown 44071 1727204728.61996: variable 'ansible_search_path' from source: unknown 44071 1727204728.62028: calling self._execute() 44071 1727204728.62115: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204728.62121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204728.62129: variable 'omit' from source: magic vars 44071 1727204728.62456: variable 'ansible_distribution_major_version' from source: facts 44071 1727204728.62469: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204728.62475: variable 'omit' from source: magic vars 44071 1727204728.62535: variable 'omit' from source: magic vars 44071 1727204728.62567: variable 'omit' from source: magic vars 44071 1727204728.62603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204728.62638: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204728.62657: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204728.62673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204728.62686: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204728.62712: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204728.62715: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204728.62718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204728.62797: Set connection var ansible_connection to ssh 44071 1727204728.62803: Set connection var ansible_timeout to 10 44071 1727204728.62810: Set connection var ansible_pipelining to False 44071 1727204728.62816: Set connection var ansible_shell_type to sh 44071 1727204728.62822: Set connection var ansible_shell_executable to /bin/sh 44071 1727204728.62832: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204728.62855: variable 'ansible_shell_executable' from source: unknown 44071 1727204728.62858: variable 'ansible_connection' from source: unknown 44071 1727204728.62861: variable 'ansible_module_compression' from source: unknown 44071 1727204728.62864: variable 'ansible_shell_type' from source: unknown 44071 1727204728.62869: variable 'ansible_shell_executable' from source: unknown 44071 1727204728.62871: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204728.62874: variable 'ansible_pipelining' from source: unknown 44071 1727204728.62876: variable 'ansible_timeout' from source: unknown 44071 1727204728.62951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204728.63051: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204728.63061: variable 'omit' from source: magic vars 44071 
1727204728.63065: starting attempt loop 44071 1727204728.63073: running the handler 44071 1727204728.63082: _low_level_execute_command(): starting 44071 1727204728.63089: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204728.63837: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204728.63843: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204728.63908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204728.65564: stdout chunk (state=3): >>>/root <<< 44071 1727204728.65660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204728.65724: stderr chunk (state=3): >>><<< 44071 1727204728.65729: stdout chunk (state=3): >>><<< 44071 1727204728.65751: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204728.65764: _low_level_execute_command(): starting 44071 1727204728.65773: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204728.657521-52162-149685333991731 `" && echo ansible-tmp-1727204728.657521-52162-149685333991731="` echo /root/.ansible/tmp/ansible-tmp-1727204728.657521-52162-149685333991731 `" ) && sleep 0' 44071 1727204728.66265: stderr chunk (state=2): 
>>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204728.66272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204728.66275: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204728.66288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204728.66341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204728.66345: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204728.66348: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204728.66412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204728.68370: stdout chunk (state=3): >>>ansible-tmp-1727204728.657521-52162-149685333991731=/root/.ansible/tmp/ansible-tmp-1727204728.657521-52162-149685333991731 <<< 44071 1727204728.68480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204728.68547: stderr chunk (state=3): >>><<< 44071 1727204728.68551: stdout chunk (state=3): >>><<< 44071 1727204728.68566: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204728.657521-52162-149685333991731=/root/.ansible/tmp/ansible-tmp-1727204728.657521-52162-149685333991731 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204728.68613: variable 'ansible_module_compression' from source: unknown 44071 1727204728.68657: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 44071 1727204728.68714: variable 
'ansible_facts' from source: unknown 44071 1727204728.68840: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204728.657521-52162-149685333991731/AnsiballZ_package_facts.py 44071 1727204728.68964: Sending initial data 44071 1727204728.68970: Sent initial data (161 bytes) 44071 1727204728.69479: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204728.69484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204728.69487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204728.69489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204728.69491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204728.69546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204728.69553: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204728.69555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204728.69632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204728.71229: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204728.71296: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204728.71370: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp3kq4ftwl /root/.ansible/tmp/ansible-tmp-1727204728.657521-52162-149685333991731/AnsiballZ_package_facts.py <<< 44071 1727204728.71373: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204728.657521-52162-149685333991731/AnsiballZ_package_facts.py" <<< 44071 1727204728.71437: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp3kq4ftwl" to remote "/root/.ansible/tmp/ansible-tmp-1727204728.657521-52162-149685333991731/AnsiballZ_package_facts.py" <<< 44071 1727204728.71440: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204728.657521-52162-149685333991731/AnsiballZ_package_facts.py" <<< 44071 1727204728.72680: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204728.72753: stderr chunk (state=3): >>><<< 44071 1727204728.72758: stdout chunk (state=3): >>><<< 44071 1727204728.72784: done transferring module to remote 44071 1727204728.72792: _low_level_execute_command(): starting 44071 1727204728.72797: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204728.657521-52162-149685333991731/ /root/.ansible/tmp/ansible-tmp-1727204728.657521-52162-149685333991731/AnsiballZ_package_facts.py && sleep 0' 44071 1727204728.73273: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204728.73297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204728.73302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204728.73354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204728.73358: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204728.73360: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204728.73438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204728.75256: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204728.75319: stderr chunk (state=3): >>><<< 44071 1727204728.75323: stdout chunk (state=3): >>><<< 44071 1727204728.75339: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204728.75342: _low_level_execute_command(): starting 44071 1727204728.75347: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204728.657521-52162-149685333991731/AnsiballZ_package_facts.py && sleep 0' 44071 1727204728.75841: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204728.75846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204728.75848: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204728.75851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204728.75908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204728.75911: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204728.75914: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204728.75994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204729.38196: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": 
"hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": 
[{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": 
"6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"na<<< 44071 1727204729.38315: stdout chunk (state=3): >>>me": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": 
[{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", 
"version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": 
"libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": 
"pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": 
"libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lib<<< 44071 1727204729.38523: stdout chunk (state=3): >>>xmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", 
"version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": 
"40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": 
"chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarc<<< 44071 1727204729.38531: stdout chunk (state=3): >>>h", "source": "rpm"}], 
"perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": 
[{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": 
"zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", 
"release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": 
"3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 44071 1727204729.40226: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204729.40245: stderr chunk (state=3): >>>Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204729.40409: stderr chunk (state=3): >>><<< 44071 1727204729.40419: stdout chunk (state=3): >>><<< 44071 1727204729.40681: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": 
"libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", 
"release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": 
[{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": 
"3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", 
"version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", 
"version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": 
"1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": 
"wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": 
"python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204729.46376: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204728.657521-52162-149685333991731/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204729.46418: _low_level_execute_command(): starting 44071 1727204729.46522: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204728.657521-52162-149685333991731/ > /dev/null 2>&1 && sleep 0' 44071 1727204729.47716: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204729.47779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204729.47813: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204729.47883: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204729.48046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204729.50379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204729.50384: stdout chunk (state=3): >>><<< 44071 1727204729.50387: stderr chunk (state=3): >>><<< 44071 1727204729.50389: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204729.50392: handler run complete 44071 1727204729.52789: variable 'ansible_facts' from source: unknown 44071 1727204729.54275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204729.60437: variable 'ansible_facts' from source: unknown 44071 1727204729.82123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204729.84273: attempt loop complete, returning result 44071 1727204729.84278: _execute() done 44071 1727204729.84280: dumping result to json 44071 1727204729.84629: done dumping result, returning 44071 1727204729.85073: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-c964-7471-000000002385] 44071 1727204729.85078: sending task result for task 127b8e07-fff9-c964-7471-000000002385 44071 1727204729.90733: done sending task result for task 127b8e07-fff9-c964-7471-000000002385 44071 1727204729.90737: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204729.90927: no more pending results, returning what we have 44071 1727204729.90930: results queue empty 44071 1727204729.90931: checking for any_errors_fatal 44071 1727204729.90939: done checking for any_errors_fatal 44071 1727204729.90940: checking for max_fail_percentage 44071 1727204729.90942: done checking for max_fail_percentage 44071 1727204729.90942: checking to see if all hosts have failed and the running result is not ok 44071 1727204729.90943: done checking to see if all hosts have failed 44071 1727204729.90944: getting the remaining hosts for this loop 44071 
1727204729.90946: done getting the remaining hosts for this loop 44071 1727204729.90950: getting the next task for host managed-node2 44071 1727204729.90959: done getting next task for host managed-node2 44071 1727204729.90962: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204729.91074: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204729.91090: getting variables 44071 1727204729.91091: in VariableManager get_vars() 44071 1727204729.91131: Calling all_inventory to load vars for managed-node2 44071 1727204729.91134: Calling groups_inventory to load vars for managed-node2 44071 1727204729.91136: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204729.91148: Calling all_plugins_play to load vars for managed-node2 44071 1727204729.91151: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204729.91154: Calling groups_plugins_play to load vars for managed-node2 44071 1727204730.12088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204730.16555: done with get_vars() 44071 1727204730.16601: done getting variables 44071 1727204730.16655: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:05:30 -0400 (0:00:01.556) 0:02:22.485 ***** 44071 1727204730.16901: entering _queue_task() for managed-node2/debug 44071 1727204730.17719: worker is 1 (out of 1 available) 44071 1727204730.17735: exiting _queue_task() for managed-node2/debug 44071 1727204730.17750: done queuing things up, now waiting for results queue to drain 44071 1727204730.17754: waiting for pending results... 
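For anyone reconstructing the play from this trace: the "Check which packages are installed" step that just completed above is a package_facts call whose result was censored because no_log was set. A minimal task sketch consistent with the logged invocation (module_args manager: ["auto"], strategy: "first") would look roughly like the following; the task name matches the trace, but the exact contents of the role's task file are an assumption, not the actual source.

    - name: Check which packages are installed   # name as printed in the trace
      ansible.builtin.package_facts:
        manager: auto        # logged module_args show manager: ["auto"]
        strategy: first      # logged module_args show strategy: "first"
      no_log: true           # the result above was censored because no_log was in effect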
44071 1727204730.18391: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204730.18699: in run() - task 127b8e07-fff9-c964-7471-000000002329 44071 1727204730.18772: variable 'ansible_search_path' from source: unknown 44071 1727204730.18859: variable 'ansible_search_path' from source: unknown 44071 1727204730.18911: calling self._execute() 44071 1727204730.19148: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204730.19164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204730.19392: variable 'omit' from source: magic vars 44071 1727204730.20219: variable 'ansible_distribution_major_version' from source: facts 44071 1727204730.20279: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204730.20293: variable 'omit' from source: magic vars 44071 1727204730.20451: variable 'omit' from source: magic vars 44071 1727204730.20716: variable 'network_provider' from source: set_fact 44071 1727204730.20745: variable 'omit' from source: magic vars 44071 1727204730.20808: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204730.20854: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204730.20959: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204730.20963: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204730.20984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204730.21027: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204730.21039: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204730.21106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204730.21186: Set connection var ansible_connection to ssh 44071 1727204730.21202: Set connection var ansible_timeout to 10 44071 1727204730.21218: Set connection var ansible_pipelining to False 44071 1727204730.21235: Set connection var ansible_shell_type to sh 44071 1727204730.21246: Set connection var ansible_shell_executable to /bin/sh 44071 1727204730.21258: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204730.21291: variable 'ansible_shell_executable' from source: unknown 44071 1727204730.21299: variable 'ansible_connection' from source: unknown 44071 1727204730.21306: variable 'ansible_module_compression' from source: unknown 44071 1727204730.21313: variable 'ansible_shell_type' from source: unknown 44071 1727204730.21322: variable 'ansible_shell_executable' from source: unknown 44071 1727204730.21338: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204730.21430: variable 'ansible_pipelining' from source: unknown 44071 1727204730.21436: variable 'ansible_timeout' from source: unknown 44071 1727204730.21440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204730.21529: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 44071 1727204730.21649: variable 'omit' from source: magic vars 44071 1727204730.21652: starting attempt loop 44071 1727204730.21654: running the handler 44071 1727204730.21657: handler run complete 44071 1727204730.21660: attempt loop complete, returning result 44071 1727204730.21662: _execute() done 44071 1727204730.21665: dumping result to json 44071 1727204730.21667: done dumping result, returning 44071 1727204730.21757: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-c964-7471-000000002329] 44071 1727204730.21761: sending task result for task 127b8e07-fff9-c964-7471-000000002329 44071 1727204730.22070: done sending task result for task 127b8e07-fff9-c964-7471-000000002329 44071 1727204730.22074: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 44071 1727204730.22149: no more pending results, returning what we have 44071 1727204730.22152: results queue empty 44071 1727204730.22153: checking for any_errors_fatal 44071 1727204730.22161: done checking for any_errors_fatal 44071 1727204730.22162: checking for max_fail_percentage 44071 1727204730.22164: done checking for max_fail_percentage 44071 1727204730.22165: checking to see if all hosts have failed and the running result is not ok 44071 1727204730.22167: done checking to see if all hosts have failed 44071 1727204730.22168: getting the remaining hosts for this loop 44071 1727204730.22170: done getting the remaining hosts for this loop 44071 1727204730.22174: getting the next task for host managed-node2 44071 1727204730.22184: done getting next task for host managed-node2 44071 1727204730.22189: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204730.22194: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204730.22209: getting variables 44071 1727204730.22210: in VariableManager get_vars() 44071 1727204730.22261: Calling all_inventory to load vars for managed-node2 44071 1727204730.22264: Calling groups_inventory to load vars for managed-node2 44071 1727204730.22271: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204730.22283: Calling all_plugins_play to load vars for managed-node2 44071 1727204730.22287: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204730.22290: Calling groups_plugins_play to load vars for managed-node2 44071 1727204730.24369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204730.26920: done with get_vars() 44071 1727204730.26963: done getting variables 44071 1727204730.27036: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:05:30 -0400 (0:00:00.101) 0:02:22.587 ***** 44071 1727204730.27092: entering _queue_task() for managed-node2/fail 44071 1727204730.27543: worker is 1 (out of 1 available) 44071 1727204730.27561: exiting _queue_task() for managed-node2/fail 44071 1727204730.27577: done queuing things up, now waiting for results queue to drain 44071 1727204730.27579: waiting for pending results... 
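The "Print network provider" result shown above (MSG: Using network provider: nm) comes from a plain debug task. A sketch consistent with the variables and the conditional evaluated in the trace (network_provider from set_fact, ansible_distribution_major_version != '6') is shown below; the msg wording is inferred from the printed output rather than copied from the role source, and the placement of the when clause on the task itself is an assumption.

    - name: Print network provider   # task name as printed in the trace
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"   # network_provider was set earlier via set_fact
      when: ansible_distribution_major_version != '6'           # conditional evaluated True in the trace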
44071 1727204730.28280: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204730.28880: in run() - task 127b8e07-fff9-c964-7471-00000000232a 44071 1727204730.28886: variable 'ansible_search_path' from source: unknown 44071 1727204730.28889: variable 'ansible_search_path' from source: unknown 44071 1727204730.28892: calling self._execute() 44071 1727204730.29126: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204730.29219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204730.29237: variable 'omit' from source: magic vars 44071 1727204730.30160: variable 'ansible_distribution_major_version' from source: facts 44071 1727204730.30216: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204730.30563: variable 'network_state' from source: role '' defaults 44071 1727204730.30652: Evaluated conditional (network_state != {}): False 44071 1727204730.30662: when evaluation is False, skipping this task 44071 1727204730.30672: _execute() done 44071 1727204730.30854: dumping result to json 44071 1727204730.30858: done dumping result, returning 44071 1727204730.30862: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-c964-7471-00000000232a] 44071 1727204730.30958: sending task result for task 127b8e07-fff9-c964-7471-00000000232a 44071 1727204730.31057: done sending task result for task 127b8e07-fff9-c964-7471-00000000232a 44071 1727204730.31061: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204730.31425: no more pending results, returning what we have 44071 1727204730.31429: results queue empty 44071 1727204730.31431: checking for any_errors_fatal 44071 1727204730.31440: done checking for any_errors_fatal 44071 1727204730.31441: checking for max_fail_percentage 44071 1727204730.31443: done checking for max_fail_percentage 44071 1727204730.31443: checking to see if all hosts have failed and the running result is not ok 44071 1727204730.31444: done checking to see if all hosts have failed 44071 1727204730.31445: getting the remaining hosts for this loop 44071 1727204730.31447: done getting the remaining hosts for this loop 44071 1727204730.31452: getting the next task for host managed-node2 44071 1727204730.31460: done getting next task for host managed-node2 44071 1727204730.31464: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204730.31472: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204730.31499: getting variables 44071 1727204730.31501: in VariableManager get_vars() 44071 1727204730.31553: Calling all_inventory to load vars for managed-node2 44071 1727204730.31556: Calling groups_inventory to load vars for managed-node2 44071 1727204730.31559: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204730.31578: Calling all_plugins_play to load vars for managed-node2 44071 1727204730.31581: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204730.31585: Calling groups_plugins_play to load vars for managed-node2 44071 1727204730.35619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204730.40351: done with get_vars() 44071 1727204730.40400: done getting variables 44071 1727204730.40654: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:05:30 -0400 (0:00:00.136) 0:02:22.724 ***** 44071 1727204730.40771: entering _queue_task() for managed-node2/fail 44071 1727204730.41718: worker is 1 (out of 1 available) 44071 1727204730.41735: exiting _queue_task() for managed-node2/fail 44071 1727204730.41750: done queuing things up, now waiting for results queue to drain 44071 1727204730.41752: waiting for pending results... 
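The skip recorded above gives false_condition: "network_state != {}", i.e. the abort only fires when a declarative network_state is requested. A hedged sketch of such a guard task follows; the failure message and the initscripts check are illustrative assumptions, since this run short-circuits on the first condition and never reaches them.

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: The `network_state` variable is not supported with the initscripts provider.   # wording assumed
      when:
        - network_state != {}                   # evaluated False in the trace, so the task was skipped
        - network_provider == "initscripts"     # assumed second guard; never evaluated in this run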
44071 1727204730.42440: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204730.42821: in run() - task 127b8e07-fff9-c964-7471-00000000232b 44071 1727204730.42878: variable 'ansible_search_path' from source: unknown 44071 1727204730.42882: variable 'ansible_search_path' from source: unknown 44071 1727204730.43074: calling self._execute() 44071 1727204730.43268: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204730.43510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204730.43514: variable 'omit' from source: magic vars 44071 1727204730.44474: variable 'ansible_distribution_major_version' from source: facts 44071 1727204730.44479: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204730.44761: variable 'network_state' from source: role '' defaults 44071 1727204730.44787: Evaluated conditional (network_state != {}): False 44071 1727204730.44871: when evaluation is False, skipping this task 44071 1727204730.44875: _execute() done 44071 1727204730.44877: dumping result to json 44071 1727204730.44881: done dumping result, returning 44071 1727204730.44893: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-c964-7471-00000000232b] 44071 1727204730.44904: sending task result for task 127b8e07-fff9-c964-7471-00000000232b 44071 1727204730.45267: done sending task result for task 127b8e07-fff9-c964-7471-00000000232b 44071 1727204730.45273: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204730.45333: no more pending results, returning what we have 44071 1727204730.45337: results queue empty 44071 1727204730.45339: checking for any_errors_fatal 44071 1727204730.45350: done checking for any_errors_fatal 44071 1727204730.45351: checking for max_fail_percentage 44071 1727204730.45353: done checking for max_fail_percentage 44071 1727204730.45354: checking to see if all hosts have failed and the running result is not ok 44071 1727204730.45355: done checking to see if all hosts have failed 44071 1727204730.45356: getting the remaining hosts for this loop 44071 1727204730.45357: done getting the remaining hosts for this loop 44071 1727204730.45363: getting the next task for host managed-node2 44071 1727204730.45376: done getting next task for host managed-node2 44071 1727204730.45381: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204730.45390: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204730.45420: getting variables 44071 1727204730.45421: in VariableManager get_vars() 44071 1727204730.45755: Calling all_inventory to load vars for managed-node2 44071 1727204730.45759: Calling groups_inventory to load vars for managed-node2 44071 1727204730.45762: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204730.45775: Calling all_plugins_play to load vars for managed-node2 44071 1727204730.45779: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204730.45782: Calling groups_plugins_play to load vars for managed-node2 44071 1727204730.50182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204730.54974: done with get_vars() 44071 1727204730.55020: done getting variables 44071 1727204730.55203: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:05:30 -0400 (0:00:00.144) 0:02:22.868 ***** 44071 1727204730.55244: entering _queue_task() for managed-node2/fail 44071 1727204730.56061: worker is 1 (out of 1 available) 44071 1727204730.56079: exiting _queue_task() for managed-node2/fail 44071 1727204730.56093: done queuing things up, now waiting for results queue to drain 44071 1727204730.56094: waiting for pending results... 
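The task queued above is executed immediately below, where the trace evaluates ansible_distribution_major_version | int > 9 (True on this node) and ansible_distribution in __network_rh_distros (False), so the abort is skipped. A sketch of a guard shaped like those conditionals follows; the failure message and any additional check that a team interface is actually requested are assumptions, as neither is visible in this run.

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Teaming is not supported on EL10 or later.    # wording assumed
      when:
        - ansible_distribution_major_version | int > 9     # evaluated True in the trace below
        - ansible_distribution in __network_rh_distros     # evaluated False in the trace below, so the task is skipped
        # a further guard that a team interface is actually requested is likely present in the
        # role, but it is not evaluated in this run and is omitted here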
44071 1727204730.56785: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204730.56973: in run() - task 127b8e07-fff9-c964-7471-00000000232c 44071 1727204730.57040: variable 'ansible_search_path' from source: unknown 44071 1727204730.57172: variable 'ansible_search_path' from source: unknown 44071 1727204730.57188: calling self._execute() 44071 1727204730.57422: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204730.57436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204730.57469: variable 'omit' from source: magic vars 44071 1727204730.58574: variable 'ansible_distribution_major_version' from source: facts 44071 1727204730.58579: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204730.58962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204730.64603: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204730.64801: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204730.64903: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204730.65020: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204730.65100: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204730.65305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204730.65417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204730.65455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204730.65507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204730.65584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204730.65864: variable 'ansible_distribution_major_version' from source: facts 44071 1727204730.65896: Evaluated conditional (ansible_distribution_major_version | int > 9): True 44071 1727204730.66153: variable 'ansible_distribution' from source: facts 44071 1727204730.66220: variable '__network_rh_distros' from source: role '' defaults 44071 1727204730.66237: Evaluated conditional (ansible_distribution in __network_rh_distros): False 44071 1727204730.66318: when evaluation is False, skipping this task 44071 1727204730.66323: _execute() done 44071 1727204730.66326: dumping result to json 44071 1727204730.66329: done dumping result, returning 44071 1727204730.66331: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-c964-7471-00000000232c] 44071 1727204730.66337: sending task result for task 127b8e07-fff9-c964-7471-00000000232c 44071 1727204730.66619: done sending task result for task 127b8e07-fff9-c964-7471-00000000232c 44071 1727204730.66622: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 44071 1727204730.66702: no more pending results, returning what we have 44071 1727204730.66706: results queue empty 44071 1727204730.66707: checking for any_errors_fatal 44071 1727204730.66714: done checking for any_errors_fatal 44071 1727204730.66715: checking for max_fail_percentage 44071 1727204730.66716: done checking for max_fail_percentage 44071 1727204730.66717: checking to see if all hosts have failed and the running result is not ok 44071 1727204730.66718: done checking to see if all hosts have failed 44071 1727204730.66719: getting the remaining hosts for this loop 44071 1727204730.66721: done getting the remaining hosts for this loop 44071 1727204730.66726: getting the next task for host managed-node2 44071 1727204730.66735: done getting next task for host managed-node2 44071 1727204730.66740: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204730.66747: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204730.66778: getting variables 44071 1727204730.66780: in VariableManager get_vars() 44071 1727204730.66833: Calling all_inventory to load vars for managed-node2 44071 1727204730.66836: Calling groups_inventory to load vars for managed-node2 44071 1727204730.66839: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204730.66851: Calling all_plugins_play to load vars for managed-node2 44071 1727204730.66855: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204730.66858: Calling groups_plugins_play to load vars for managed-node2 44071 1727204730.71247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204730.77303: done with get_vars() 44071 1727204730.77462: done getting variables 44071 1727204730.77538: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:05:30 -0400 (0:00:00.224) 0:02:23.093 ***** 44071 1727204730.77687: entering _queue_task() for managed-node2/dnf 44071 1727204730.78920: worker is 1 (out of 1 available) 44071 1727204730.78937: exiting _queue_task() for managed-node2/dnf 44071 1727204730.78955: done queuing things up, now waiting for results queue to drain 44071 1727204730.78957: waiting for pending results... 
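The teaming-abort skip above shows how a task with several when: conditions behaves: the distribution-version checks rendered True, evaluation then reached ansible_distribution in __network_rh_distros, and that single False is what is reported as false_condition. A small sketch of that walk over the condition list, again with plain Jinja2; the distro values and the abbreviated __network_rh_distros list are assumptions consistent with this log (a Fedora host whose major version is greater than 9), not values taken from the role.

import jinja2

def first_false_condition(conditions, variables):
    # Conditions on a task are ANDed; stop at the first one that does not
    # render True and report it, mirroring the 'false_condition' field above.
    env = jinja2.Environment()
    for expression in conditions:
        rendered = env.from_string("{{ " + expression + " }}").render(**variables)
        if rendered.strip() != "True":
            return expression
    return None

variables = {
    "ansible_distribution": "Fedora",                              # assumed
    "ansible_distribution_major_version": "40",                    # assumed
    "__network_rh_distros": ["RedHat", "CentOS", "OracleLinux"],   # abbreviated assumption
}

conditions = [
    "ansible_distribution_major_version | int > 9",
    "ansible_distribution in __network_rh_distros",
]
print(first_false_condition(conditions, variables))
# -> 'ansible_distribution in __network_rh_distros', matching the skip above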
44071 1727204730.79815: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204730.80175: in run() - task 127b8e07-fff9-c964-7471-00000000232d 44071 1727204730.80179: variable 'ansible_search_path' from source: unknown 44071 1727204730.80182: variable 'ansible_search_path' from source: unknown 44071 1727204730.80185: calling self._execute() 44071 1727204730.80380: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204730.80387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204730.80396: variable 'omit' from source: magic vars 44071 1727204730.81579: variable 'ansible_distribution_major_version' from source: facts 44071 1727204730.81584: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204730.81974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204730.87356: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204730.87659: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204730.87905: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204730.87951: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204730.87987: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204730.88276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204730.88309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204730.88335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204730.88384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204730.88398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204730.88536: variable 'ansible_distribution' from source: facts 44071 1727204730.88544: variable 'ansible_distribution_major_version' from source: facts 44071 1727204730.88552: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 44071 1727204730.88890: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204730.89249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204730.89276: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204730.89301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204730.89343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204730.89357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204730.89610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204730.89634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204730.89833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204730.89836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204730.89839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204730.89841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204730.89988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204730.90013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204730.90058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204730.90076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204730.90468: variable 'network_connections' from source: include params 44071 1727204730.90481: variable 'interface' from source: play vars 44071 1727204730.90559: variable 'interface' from source: play vars 44071 1727204730.90851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204730.91244: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204730.91304: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204730.91339: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204730.91370: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204730.91676: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204730.91681: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204730.91692: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204730.91695: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204730.91787: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204730.92446: variable 'network_connections' from source: include params 44071 1727204730.92451: variable 'interface' from source: play vars 44071 1727204730.92526: variable 'interface' from source: play vars 44071 1727204730.92555: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204730.92558: when evaluation is False, skipping this task 44071 1727204730.92561: _execute() done 44071 1727204730.92564: dumping result to json 44071 1727204730.92772: done dumping result, returning 44071 1727204730.92782: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-00000000232d] 44071 1727204730.92789: sending task result for task 127b8e07-fff9-c964-7471-00000000232d skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204730.92958: no more pending results, returning what we have 44071 1727204730.92961: results queue empty 44071 1727204730.92962: checking for any_errors_fatal 44071 1727204730.92973: done checking for any_errors_fatal 44071 1727204730.92974: checking for max_fail_percentage 44071 1727204730.92975: done checking for max_fail_percentage 44071 1727204730.92976: checking to see if all hosts have failed and the running result is not ok 44071 1727204730.92977: done checking to see if all hosts have failed 44071 1727204730.92978: getting the remaining hosts for this loop 44071 1727204730.92980: done getting the remaining hosts for this loop 44071 1727204730.92993: getting the next task for host managed-node2 44071 1727204730.93003: done getting next task for host managed-node2 44071 1727204730.93008: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204730.93014: ^ state is: 
HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204730.93044: getting variables 44071 1727204730.93046: in VariableManager get_vars() 44071 1727204730.93241: Calling all_inventory to load vars for managed-node2 44071 1727204730.93244: Calling groups_inventory to load vars for managed-node2 44071 1727204730.93246: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204730.93253: done sending task result for task 127b8e07-fff9-c964-7471-00000000232d 44071 1727204730.93255: WORKER PROCESS EXITING 44071 1727204730.93267: Calling all_plugins_play to load vars for managed-node2 44071 1727204730.93271: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204730.93274: Calling groups_plugins_play to load vars for managed-node2 44071 1727204730.97268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204731.03357: done with get_vars() 44071 1727204731.03516: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204731.03667: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:05:31 -0400 (0:00:00.260) 0:02:23.353 ***** 44071 1727204731.03703: entering _queue_task() for managed-node2/yum 44071 1727204731.04323: worker is 1 (out of 1 available) 44071 1727204731.04337: exiting _queue_task() for managed-node2/yum 44071 1727204731.04350: done queuing things up, now waiting for results queue to drain 44071 1727204731.04352: waiting for pending results... 
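The DNF check above was skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined held: the log shows both flags being resolved against network_connections and the interface play var before the combined conditional came out False. As a purely illustrative sketch (not the role's actual filter expressions), flags like these can be derived by scanning the connection list for the relevant connection types; the single connection entry below is an assumed example.

# Illustrative only; the entry's fields and values are assumptions, not data
# from this run.
network_connections = [
    {"name": "testnic0", "interface_name": "testnic0", "type": "ethernet", "state": "up"},
]

wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
team_defined = any(c.get("type") == "team" for c in network_connections)

print(wireless_defined or team_defined)  # False -> the DNF check above is skipped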
44071 1727204731.04674: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204731.05173: in run() - task 127b8e07-fff9-c964-7471-00000000232e 44071 1727204731.05224: variable 'ansible_search_path' from source: unknown 44071 1727204731.05282: variable 'ansible_search_path' from source: unknown 44071 1727204731.05338: calling self._execute() 44071 1727204731.05658: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204731.05672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204731.05687: variable 'omit' from source: magic vars 44071 1727204731.06402: variable 'ansible_distribution_major_version' from source: facts 44071 1727204731.06430: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204731.06657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204731.09464: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204731.10073: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204731.10333: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204731.10337: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204731.10340: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204731.10451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204731.10492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204731.10525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204731.10584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204731.10604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204731.10725: variable 'ansible_distribution_major_version' from source: facts 44071 1727204731.10748: Evaluated conditional (ansible_distribution_major_version | int < 8): False 44071 1727204731.10756: when evaluation is False, skipping this task 44071 1727204731.10778: _execute() done 44071 1727204731.10780: dumping result to json 44071 1727204731.10857: done dumping result, returning 44071 1727204731.10861: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-00000000232e] 44071 
1727204731.10863: sending task result for task 127b8e07-fff9-c964-7471-00000000232e 44071 1727204731.11072: done sending task result for task 127b8e07-fff9-c964-7471-00000000232e 44071 1727204731.11076: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 44071 1727204731.11134: no more pending results, returning what we have 44071 1727204731.11138: results queue empty 44071 1727204731.11139: checking for any_errors_fatal 44071 1727204731.11146: done checking for any_errors_fatal 44071 1727204731.11147: checking for max_fail_percentage 44071 1727204731.11149: done checking for max_fail_percentage 44071 1727204731.11150: checking to see if all hosts have failed and the running result is not ok 44071 1727204731.11151: done checking to see if all hosts have failed 44071 1727204731.11151: getting the remaining hosts for this loop 44071 1727204731.11153: done getting the remaining hosts for this loop 44071 1727204731.11158: getting the next task for host managed-node2 44071 1727204731.11171: done getting next task for host managed-node2 44071 1727204731.11176: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204731.11183: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204731.11217: getting variables 44071 1727204731.11219: in VariableManager get_vars() 44071 1727204731.11482: Calling all_inventory to load vars for managed-node2 44071 1727204731.11485: Calling groups_inventory to load vars for managed-node2 44071 1727204731.11488: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204731.11500: Calling all_plugins_play to load vars for managed-node2 44071 1727204731.11503: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204731.11507: Calling groups_plugins_play to load vars for managed-node2 44071 1727204731.13770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204731.16828: done with get_vars() 44071 1727204731.16875: done getting variables 44071 1727204731.16953: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:05:31 -0400 (0:00:00.132) 0:02:23.486 ***** 44071 1727204731.16993: entering _queue_task() for managed-node2/fail 44071 1727204731.17445: worker is 1 (out of 1 available) 44071 1727204731.17574: exiting _queue_task() for managed-node2/fail 44071 1727204731.17591: done queuing things up, now waiting for results queue to drain 44071 1727204731.17592: waiting for pending results... 
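The YUM variant of the package check was skipped on its version guard alone (ansible_distribution_major_version | int < 8 is False here), while the DNF counterpart at tasks/main.yml:36 was guarded by ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7, so at most one of the two checks can apply to a given host. A small sketch of that branch follows; the host values passed in are assumptions for illustration.

def package_check_path(distribution: str, major_version: str) -> str:
    # Mirror the two version guards seen on the DNF and YUM check tasks above.
    if distribution == "Fedora" or int(major_version) > 7:
        return "dnf"   # guard on the task at tasks/main.yml:36
    if int(major_version) < 8:
        return "yum"   # guard on the task at tasks/main.yml:48
    return "none"

print(package_check_path("Fedora", "40"))  # 'dnf'; the YUM variant is skipped, as logged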
44071 1727204731.18163: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204731.18171: in run() - task 127b8e07-fff9-c964-7471-00000000232f 44071 1727204731.18175: variable 'ansible_search_path' from source: unknown 44071 1727204731.18178: variable 'ansible_search_path' from source: unknown 44071 1727204731.18182: calling self._execute() 44071 1727204731.18241: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204731.18248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204731.18259: variable 'omit' from source: magic vars 44071 1727204731.18764: variable 'ansible_distribution_major_version' from source: facts 44071 1727204731.18840: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204731.19022: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204731.19296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204731.26670: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204731.26725: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204731.26841: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204731.26909: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204731.26938: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204731.27172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204731.27196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204731.27237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204731.27297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204731.27311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204731.27376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204731.27408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204731.27441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204731.27574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204731.27577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204731.27580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204731.27583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204731.27614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204731.27670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204731.27690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204731.27944: variable 'network_connections' from source: include params 44071 1727204731.27963: variable 'interface' from source: play vars 44071 1727204731.28160: variable 'interface' from source: play vars 44071 1727204731.28245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204731.28692: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204731.29081: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204731.29085: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204731.29087: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204731.29089: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204731.29203: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204731.29242: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204731.29304: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204731.29378: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204731.30391: variable 'network_connections' 
from source: include params 44071 1727204731.30412: variable 'interface' from source: play vars 44071 1727204731.30505: variable 'interface' from source: play vars 44071 1727204731.30551: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204731.30560: when evaluation is False, skipping this task 44071 1727204731.30576: _execute() done 44071 1727204731.30628: dumping result to json 44071 1727204731.30638: done dumping result, returning 44071 1727204731.30651: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-00000000232f] 44071 1727204731.30661: sending task result for task 127b8e07-fff9-c964-7471-00000000232f skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204731.30986: no more pending results, returning what we have 44071 1727204731.30991: results queue empty 44071 1727204731.30992: checking for any_errors_fatal 44071 1727204731.31003: done checking for any_errors_fatal 44071 1727204731.31004: checking for max_fail_percentage 44071 1727204731.31006: done checking for max_fail_percentage 44071 1727204731.31007: checking to see if all hosts have failed and the running result is not ok 44071 1727204731.31008: done checking to see if all hosts have failed 44071 1727204731.31009: getting the remaining hosts for this loop 44071 1727204731.31011: done getting the remaining hosts for this loop 44071 1727204731.31017: getting the next task for host managed-node2 44071 1727204731.31027: done getting next task for host managed-node2 44071 1727204731.31033: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 44071 1727204731.31039: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204731.31087: done sending task result for task 127b8e07-fff9-c964-7471-00000000232f 44071 1727204731.31091: WORKER PROCESS EXITING 44071 1727204731.31137: getting variables 44071 1727204731.31140: in VariableManager get_vars() 44071 1727204731.31333: Calling all_inventory to load vars for managed-node2 44071 1727204731.31336: Calling groups_inventory to load vars for managed-node2 44071 1727204731.31339: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204731.31353: Calling all_plugins_play to load vars for managed-node2 44071 1727204731.31357: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204731.31361: Calling groups_plugins_play to load vars for managed-node2 44071 1727204731.35491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204731.38026: done with get_vars() 44071 1727204731.38064: done getting variables 44071 1727204731.38142: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:05:31 -0400 (0:00:00.211) 0:02:23.698 ***** 44071 1727204731.38187: entering _queue_task() for managed-node2/package 44071 1727204731.38901: worker is 1 (out of 1 available) 44071 1727204731.38918: exiting _queue_task() for managed-node2/package 44071 1727204731.38933: done queuing things up, now waiting for results queue to drain 44071 1727204731.38935: waiting for pending results... 
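Each task in this log follows the same queue choreography: _queue_task hands the task to the single available worker, the worker runs TaskExecutor and sends its result back, and the strategy drains the results queue before moving on ("no more pending results, returning what we have"). The snippet below is only an illustration of that producer/worker/consumer shape using the standard library; it is not Ansible's WorkerProcess or strategy implementation, and the task name and result fields are copied from the log purely as sample payload.

import multiprocessing as mp

def worker(task_queue: mp.Queue, results_queue: mp.Queue) -> None:
    # Take one queued task and report a skip-shaped result, standing in for
    # the real TaskExecutor() run.
    task = task_queue.get()
    results_queue.put({"task": task, "changed": False,
                       "skip_reason": "Conditional result was False"})

if __name__ == "__main__":
    tasks, results = mp.Queue(), mp.Queue()
    proc = mp.Process(target=worker, args=(tasks, results))
    proc.start()
    tasks.put("fedora.linux_system_roles.network : Install packages")
    print(results.get())   # drain the results queue, as the strategy loop does
    proc.join()            # worker process exiting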
44071 1727204731.39429: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 44071 1727204731.39811: in run() - task 127b8e07-fff9-c964-7471-000000002330 44071 1727204731.39851: variable 'ansible_search_path' from source: unknown 44071 1727204731.39855: variable 'ansible_search_path' from source: unknown 44071 1727204731.39888: calling self._execute() 44071 1727204731.40068: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204731.40075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204731.40081: variable 'omit' from source: magic vars 44071 1727204731.40598: variable 'ansible_distribution_major_version' from source: facts 44071 1727204731.40630: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204731.40926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204731.41382: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204731.41536: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204731.41541: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204731.41615: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204731.41760: variable 'network_packages' from source: role '' defaults 44071 1727204731.41930: variable '__network_provider_setup' from source: role '' defaults 44071 1727204731.41949: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204731.42028: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204731.42042: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204731.42115: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204731.42326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204731.45702: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204731.45848: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204731.45868: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204731.45951: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204731.45990: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204731.46119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204731.46203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204731.46229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204731.46376: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204731.46379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204731.46382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204731.46435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204731.46470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204731.46525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204731.46546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204731.46852: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204731.46996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204731.47028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204731.47071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204731.47157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204731.47160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204731.47247: variable 'ansible_python' from source: facts 44071 1727204731.47285: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204731.47440: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204731.47563: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204731.47783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204731.47910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 44071 1727204731.47918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204731.48050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204731.48053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204731.48114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204731.48157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204731.48218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204731.48247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204731.48276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204731.48651: variable 'network_connections' from source: include params 44071 1727204731.48657: variable 'interface' from source: play vars 44071 1727204731.48743: variable 'interface' from source: play vars 44071 1727204731.48895: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204731.49124: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204731.49128: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204731.49212: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204731.49445: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204731.50676: variable 'network_connections' from source: include params 44071 1727204731.50680: variable 'interface' from source: play vars 44071 1727204731.50909: variable 'interface' from source: play vars 44071 1727204731.50991: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204731.51328: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204731.52277: variable 'network_connections' from source: include params 44071 
1727204731.52418: variable 'interface' from source: play vars 44071 1727204731.52523: variable 'interface' from source: play vars 44071 1727204731.52614: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204731.52824: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204731.53793: variable 'network_connections' from source: include params 44071 1727204731.53873: variable 'interface' from source: play vars 44071 1727204731.53889: variable 'interface' from source: play vars 44071 1727204731.54042: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204731.54164: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204731.54240: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204731.54445: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204731.55073: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204731.56324: variable 'network_connections' from source: include params 44071 1727204731.56347: variable 'interface' from source: play vars 44071 1727204731.56496: variable 'interface' from source: play vars 44071 1727204731.56570: variable 'ansible_distribution' from source: facts 44071 1727204731.56580: variable '__network_rh_distros' from source: role '' defaults 44071 1727204731.56639: variable 'ansible_distribution_major_version' from source: facts 44071 1727204731.56666: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204731.57178: variable 'ansible_distribution' from source: facts 44071 1727204731.57195: variable '__network_rh_distros' from source: role '' defaults 44071 1727204731.57212: variable 'ansible_distribution_major_version' from source: facts 44071 1727204731.57288: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204731.57754: variable 'ansible_distribution' from source: facts 44071 1727204731.57772: variable '__network_rh_distros' from source: role '' defaults 44071 1727204731.57786: variable 'ansible_distribution_major_version' from source: facts 44071 1727204731.58078: variable 'network_provider' from source: set_fact 44071 1727204731.58081: variable 'ansible_facts' from source: unknown 44071 1727204731.60199: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 44071 1727204731.60280: when evaluation is False, skipping this task 44071 1727204731.60287: _execute() done 44071 1727204731.60295: dumping result to json 44071 1727204731.60475: done dumping result, returning 44071 1727204731.60481: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-c964-7471-000000002330] 44071 1727204731.60484: sending task result for task 127b8e07-fff9-c964-7471-000000002330 44071 1727204731.60571: done sending task result for task 127b8e07-fff9-c964-7471-000000002330 44071 1727204731.60575: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 44071 1727204731.60709: no more pending results, returning what we have 44071 1727204731.60713: results queue empty 44071 1727204731.60714: checking for any_errors_fatal 44071 1727204731.60722: done checking for 
any_errors_fatal 44071 1727204731.60723: checking for max_fail_percentage 44071 1727204731.60725: done checking for max_fail_percentage 44071 1727204731.60726: checking to see if all hosts have failed and the running result is not ok 44071 1727204731.60727: done checking to see if all hosts have failed 44071 1727204731.60728: getting the remaining hosts for this loop 44071 1727204731.60729: done getting the remaining hosts for this loop 44071 1727204731.60735: getting the next task for host managed-node2 44071 1727204731.60745: done getting next task for host managed-node2 44071 1727204731.60750: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204731.60757: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204731.60787: getting variables 44071 1727204731.60789: in VariableManager get_vars() 44071 1727204731.61117: Calling all_inventory to load vars for managed-node2 44071 1727204731.61120: Calling groups_inventory to load vars for managed-node2 44071 1727204731.61122: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204731.61134: Calling all_plugins_play to load vars for managed-node2 44071 1727204731.61138: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204731.61142: Calling groups_plugins_play to load vars for managed-node2 44071 1727204731.66001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204731.71955: done with get_vars() 44071 1727204731.72008: done getting variables 44071 1727204731.72398: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:05:31 -0400 (0:00:00.342) 0:02:24.040 ***** 44071 1727204731.72441: entering _queue_task() for managed-node2/package 44071 1727204731.73393: worker is 1 (out of 1 available) 44071 1727204731.73407: exiting _queue_task() for managed-node2/package 44071 1727204731.73421: done queuing things up, now waiting for results queue to drain 44071 1727204731.73423: waiting for pending results... 
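
The "Install packages" task above was skipped because its conditional, not network_packages is subset(ansible_facts.packages.keys()), evaluated to False: every package the role wants is already present in the gathered package facts. Ansible evaluates this with its Jinja2 "subset" test; the sketch below only mirrors the same set logic in plain Python, with hypothetical package names standing in for the role defaults and facts resolved above.

    # Hypothetical stand-ins; in the run above these come from role defaults and ansible_facts.packages.
    network_packages = ["NetworkManager"]
    installed_packages = {"NetworkManager": [], "openssh-server": []}  # fact maps package name -> versions
    needs_install = not set(network_packages).issubset(installed_packages.keys())
    print(needs_install)  # False -> matches "Conditional result was False" and the skip above
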
44071 1727204731.74090: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204731.74434: in run() - task 127b8e07-fff9-c964-7471-000000002331 44071 1727204731.74459: variable 'ansible_search_path' from source: unknown 44071 1727204731.74471: variable 'ansible_search_path' from source: unknown 44071 1727204731.74651: calling self._execute() 44071 1727204731.74878: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204731.74893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204731.74909: variable 'omit' from source: magic vars 44071 1727204731.75847: variable 'ansible_distribution_major_version' from source: facts 44071 1727204731.75852: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204731.76126: variable 'network_state' from source: role '' defaults 44071 1727204731.76293: Evaluated conditional (network_state != {}): False 44071 1727204731.76408: when evaluation is False, skipping this task 44071 1727204731.76413: _execute() done 44071 1727204731.76416: dumping result to json 44071 1727204731.76419: done dumping result, returning 44071 1727204731.76422: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-c964-7471-000000002331] 44071 1727204731.76425: sending task result for task 127b8e07-fff9-c964-7471-000000002331 44071 1727204731.76518: done sending task result for task 127b8e07-fff9-c964-7471-000000002331 44071 1727204731.76522: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204731.76583: no more pending results, returning what we have 44071 1727204731.76588: results queue empty 44071 1727204731.76589: checking for any_errors_fatal 44071 1727204731.76600: done checking for any_errors_fatal 44071 1727204731.76601: checking for max_fail_percentage 44071 1727204731.76603: done checking for max_fail_percentage 44071 1727204731.76604: checking to see if all hosts have failed and the running result is not ok 44071 1727204731.76605: done checking to see if all hosts have failed 44071 1727204731.76606: getting the remaining hosts for this loop 44071 1727204731.76607: done getting the remaining hosts for this loop 44071 1727204731.76613: getting the next task for host managed-node2 44071 1727204731.76623: done getting next task for host managed-node2 44071 1727204731.76627: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204731.76635: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204731.76670: getting variables 44071 1727204731.76673: in VariableManager get_vars() 44071 1727204731.76732: Calling all_inventory to load vars for managed-node2 44071 1727204731.76735: Calling groups_inventory to load vars for managed-node2 44071 1727204731.76738: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204731.76753: Calling all_plugins_play to load vars for managed-node2 44071 1727204731.76756: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204731.76760: Calling groups_plugins_play to load vars for managed-node2 44071 1727204731.82150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204731.87350: done with get_vars() 44071 1727204731.87618: done getting variables 44071 1727204731.87804: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:05:31 -0400 (0:00:00.154) 0:02:24.194 ***** 44071 1727204731.87847: entering _queue_task() for managed-node2/package 44071 1727204731.89073: worker is 1 (out of 1 available) 44071 1727204731.89088: exiting _queue_task() for managed-node2/package 44071 1727204731.89102: done queuing things up, now waiting for results queue to drain 44071 1727204731.89109: waiting for pending results... 
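
The "Install NetworkManager and nmstate when using network_state variable" task above was skipped for a different reason: network_state is still the role default, an empty dict, so network_state != {} is False even though the distribution check passed. A task's when: entries combine with AND semantics, which this small Python sketch mirrors (the major-version value is an assumed example, not taken from the log):

    # AND semantics of the two conditionals logged above; values are illustrative.
    ansible_distribution_major_version = "40"  # assumed fact value for this host
    network_state = {}                         # role default, as resolved in the log
    conditions = [
        ansible_distribution_major_version != "6",  # True in the log
        network_state != {},                        # False in the log
    ]
    print(all(conditions))  # False -> the task is skipped with "Conditional result was False"
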
44071 1727204731.89733: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204731.89944: in run() - task 127b8e07-fff9-c964-7471-000000002332 44071 1727204731.89962: variable 'ansible_search_path' from source: unknown 44071 1727204731.89968: variable 'ansible_search_path' from source: unknown 44071 1727204731.90170: calling self._execute() 44071 1727204731.90734: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204731.90739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204731.90742: variable 'omit' from source: magic vars 44071 1727204731.91583: variable 'ansible_distribution_major_version' from source: facts 44071 1727204731.91608: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204731.91770: variable 'network_state' from source: role '' defaults 44071 1727204731.91849: Evaluated conditional (network_state != {}): False 44071 1727204731.91853: when evaluation is False, skipping this task 44071 1727204731.91857: _execute() done 44071 1727204731.91862: dumping result to json 44071 1727204731.91865: done dumping result, returning 44071 1727204731.91868: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-c964-7471-000000002332] 44071 1727204731.91874: sending task result for task 127b8e07-fff9-c964-7471-000000002332 44071 1727204731.92104: done sending task result for task 127b8e07-fff9-c964-7471-000000002332 44071 1727204731.92108: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204731.92200: no more pending results, returning what we have 44071 1727204731.92204: results queue empty 44071 1727204731.92206: checking for any_errors_fatal 44071 1727204731.92215: done checking for any_errors_fatal 44071 1727204731.92216: checking for max_fail_percentage 44071 1727204731.92218: done checking for max_fail_percentage 44071 1727204731.92219: checking to see if all hosts have failed and the running result is not ok 44071 1727204731.92220: done checking to see if all hosts have failed 44071 1727204731.92221: getting the remaining hosts for this loop 44071 1727204731.92223: done getting the remaining hosts for this loop 44071 1727204731.92229: getting the next task for host managed-node2 44071 1727204731.92240: done getting next task for host managed-node2 44071 1727204731.92245: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204731.92253: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204731.92379: getting variables 44071 1727204731.92382: in VariableManager get_vars() 44071 1727204731.92541: Calling all_inventory to load vars for managed-node2 44071 1727204731.92545: Calling groups_inventory to load vars for managed-node2 44071 1727204731.92548: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204731.92560: Calling all_plugins_play to load vars for managed-node2 44071 1727204731.92564: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204731.92570: Calling groups_plugins_play to load vars for managed-node2 44071 1727204731.95647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204731.98363: done with get_vars() 44071 1727204731.98406: done getting variables 44071 1727204731.98479: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:05:31 -0400 (0:00:00.106) 0:02:24.301 ***** 44071 1727204731.98525: entering _queue_task() for managed-node2/service 44071 1727204731.99080: worker is 1 (out of 1 available) 44071 1727204731.99094: exiting _queue_task() for managed-node2/service 44071 1727204731.99109: done queuing things up, now waiting for results queue to drain 44071 1727204731.99110: waiting for pending results... 
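
Each TASK banner above carries two timings: the parenthesised value appears to be the duration of the task that just finished, and the trailing value the running playbook total, so consecutive banners should differ by exactly the parenthesised amount, modulo millisecond rounding. A quick Python check against the banners printed so far:

    # Durations taken from the banners above; 0:02:24.300 vs the logged 0:02:24.301 is
    # only rounding of the displayed milliseconds.
    total = 2 * 60 + 23.698  # running total implied before the 0.342 s "Install packages" task
    for task_duration in (0.342, 0.154, 0.106):
        total += task_duration
        print(f"0:{int(total // 60):02d}:{total % 60:06.3f}")
    # prints 0:02:24.040, 0:02:24.194, 0:02:24.300
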
44071 1727204731.99461: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204731.99673: in run() - task 127b8e07-fff9-c964-7471-000000002333 44071 1727204731.99677: variable 'ansible_search_path' from source: unknown 44071 1727204731.99679: variable 'ansible_search_path' from source: unknown 44071 1727204731.99682: calling self._execute() 44071 1727204731.99776: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204731.99790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204731.99813: variable 'omit' from source: magic vars 44071 1727204732.00422: variable 'ansible_distribution_major_version' from source: facts 44071 1727204732.00445: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204732.00613: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204732.00858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204732.04826: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204732.04918: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204732.05090: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204732.05137: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204732.05228: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204732.05438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204732.05569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204732.05575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204732.05674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204732.05710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204732.05908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204732.05937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204732.06028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 44071 1727204732.06106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204732.06201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204732.06448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204732.06451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204732.06474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204732.06585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204732.06724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204732.07109: variable 'network_connections' from source: include params 44071 1727204732.07133: variable 'interface' from source: play vars 44071 1727204732.07249: variable 'interface' from source: play vars 44071 1727204732.07359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204732.07589: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204732.07677: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204732.07730: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204732.07785: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204732.07876: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204732.07924: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204732.08034: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204732.08039: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204732.08105: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204732.08499: variable 'network_connections' from source: include params 44071 1727204732.08517: variable 'interface' 
from source: play vars 44071 1727204732.08628: variable 'interface' from source: play vars 44071 1727204732.08659: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204732.08671: when evaluation is False, skipping this task 44071 1727204732.08701: _execute() done 44071 1727204732.08704: dumping result to json 44071 1727204732.08707: done dumping result, returning 44071 1727204732.08794: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000002333] 44071 1727204732.08797: sending task result for task 127b8e07-fff9-c964-7471-000000002333 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204732.09124: no more pending results, returning what we have 44071 1727204732.09129: results queue empty 44071 1727204732.09130: checking for any_errors_fatal 44071 1727204732.09142: done checking for any_errors_fatal 44071 1727204732.09143: checking for max_fail_percentage 44071 1727204732.09146: done checking for max_fail_percentage 44071 1727204732.09147: checking to see if all hosts have failed and the running result is not ok 44071 1727204732.09148: done checking to see if all hosts have failed 44071 1727204732.09149: getting the remaining hosts for this loop 44071 1727204732.09151: done getting the remaining hosts for this loop 44071 1727204732.09156: getting the next task for host managed-node2 44071 1727204732.09169: done getting next task for host managed-node2 44071 1727204732.09175: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204732.09181: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204732.09213: getting variables 44071 1727204732.09217: in VariableManager get_vars() 44071 1727204732.09318: Calling all_inventory to load vars for managed-node2 44071 1727204732.09321: Calling groups_inventory to load vars for managed-node2 44071 1727204732.09323: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204732.09336: Calling all_plugins_play to load vars for managed-node2 44071 1727204732.09454: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204732.09461: done sending task result for task 127b8e07-fff9-c964-7471-000000002333 44071 1727204732.09464: WORKER PROCESS EXITING 44071 1727204732.09472: Calling groups_plugins_play to load vars for managed-node2 44071 1727204732.12941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204732.15489: done with get_vars() 44071 1727204732.15533: done getting variables 44071 1727204732.15607: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:05:32 -0400 (0:00:00.171) 0:02:24.472 ***** 44071 1727204732.15646: entering _queue_task() for managed-node2/service 44071 1727204732.16075: worker is 1 (out of 1 available) 44071 1727204732.16090: exiting _queue_task() for managed-node2/service 44071 1727204732.16216: done queuing things up, now waiting for results queue to drain 44071 1727204732.16218: waiting for pending results... 
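
The "Restart NetworkManager due to wireless or team interfaces" task above was skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined held for the connection profile resolved from network_connections. The role computes those flags with Jinja2 expressions over the connection list; the Python sketch below only approximates that idea, with an assumed ethernet connection entry rather than the real include-params value:

    # Assumed connection data; the real list is "network_connections" from include params.
    network_connections = [{"name": "test-conn", "interface_name": "testnic", "type": "ethernet"}]
    wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
    team_defined = any(c.get("type") == "team" for c in network_connections)
    print(wireless_defined or team_defined)  # False -> no NetworkManager restart is needed
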
44071 1727204732.16568: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204732.16762: in run() - task 127b8e07-fff9-c964-7471-000000002334 44071 1727204732.16792: variable 'ansible_search_path' from source: unknown 44071 1727204732.16802: variable 'ansible_search_path' from source: unknown 44071 1727204732.16852: calling self._execute() 44071 1727204732.16998: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204732.17047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204732.17064: variable 'omit' from source: magic vars 44071 1727204732.17558: variable 'ansible_distribution_major_version' from source: facts 44071 1727204732.17583: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204732.17799: variable 'network_provider' from source: set_fact 44071 1727204732.17834: variable 'network_state' from source: role '' defaults 44071 1727204732.17837: Evaluated conditional (network_provider == "nm" or network_state != {}): True 44071 1727204732.17841: variable 'omit' from source: magic vars 44071 1727204732.17926: variable 'omit' from source: magic vars 44071 1727204732.17972: variable 'network_service_name' from source: role '' defaults 44071 1727204732.18072: variable 'network_service_name' from source: role '' defaults 44071 1727204732.18199: variable '__network_provider_setup' from source: role '' defaults 44071 1727204732.18269: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204732.18292: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204732.18310: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204732.18388: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204732.18679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204732.21426: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204732.22016: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204732.22093: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204732.22123: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204732.22156: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204732.22310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204732.22316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204732.22332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204732.22391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 44071 1727204732.22429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204732.22527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204732.22531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204732.22540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204732.22711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204732.22715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204732.23014: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204732.23382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204732.23416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204732.23594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204732.23597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204732.23627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204732.23879: variable 'ansible_python' from source: facts 44071 1727204732.23905: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204732.24138: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204732.24301: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204732.24628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204732.24709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204732.24823: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204732.24905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204732.24927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204732.25171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204732.25194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204732.25230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204732.25355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204732.25381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204732.25716: variable 'network_connections' from source: include params 44071 1727204732.25786: variable 'interface' from source: play vars 44071 1727204732.25982: variable 'interface' from source: play vars 44071 1727204732.26465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204732.26690: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204732.26949: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204732.27005: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204732.27187: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204732.27395: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204732.27437: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204732.27678: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204732.27681: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204732.27798: variable '__network_wireless_connections_defined' from source: 
role '' defaults 44071 1727204732.28541: variable 'network_connections' from source: include params 44071 1727204732.28546: variable 'interface' from source: play vars 44071 1727204732.28641: variable 'interface' from source: play vars 44071 1727204732.28810: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204732.28894: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204732.29813: variable 'network_connections' from source: include params 44071 1727204732.29817: variable 'interface' from source: play vars 44071 1727204732.29881: variable 'interface' from source: play vars 44071 1727204732.29908: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204732.30209: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204732.30962: variable 'network_connections' from source: include params 44071 1727204732.30967: variable 'interface' from source: play vars 44071 1727204732.31172: variable 'interface' from source: play vars 44071 1727204732.31371: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204732.31393: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204732.31400: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204732.31468: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204732.32045: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204732.32952: variable 'network_connections' from source: include params 44071 1727204732.32956: variable 'interface' from source: play vars 44071 1727204732.33040: variable 'interface' from source: play vars 44071 1727204732.33043: variable 'ansible_distribution' from source: facts 44071 1727204732.33049: variable '__network_rh_distros' from source: role '' defaults 44071 1727204732.33056: variable 'ansible_distribution_major_version' from source: facts 44071 1727204732.33077: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204732.33355: variable 'ansible_distribution' from source: facts 44071 1727204732.33362: variable '__network_rh_distros' from source: role '' defaults 44071 1727204732.33371: variable 'ansible_distribution_major_version' from source: facts 44071 1727204732.33378: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204732.33794: variable 'ansible_distribution' from source: facts 44071 1727204732.33797: variable '__network_rh_distros' from source: role '' defaults 44071 1727204732.33962: variable 'ansible_distribution_major_version' from source: facts 44071 1727204732.33968: variable 'network_provider' from source: set_fact 44071 1727204732.33970: variable 'omit' from source: magic vars 44071 1727204732.34203: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204732.34213: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204732.34311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204732.34315: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204732.34317: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204732.34529: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204732.34536: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204732.34539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204732.34616: Set connection var ansible_connection to ssh 44071 1727204732.34623: Set connection var ansible_timeout to 10 44071 1727204732.34743: Set connection var ansible_pipelining to False 44071 1727204732.34747: Set connection var ansible_shell_type to sh 44071 1727204732.34749: Set connection var ansible_shell_executable to /bin/sh 44071 1727204732.34751: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204732.34981: variable 'ansible_shell_executable' from source: unknown 44071 1727204732.34985: variable 'ansible_connection' from source: unknown 44071 1727204732.34987: variable 'ansible_module_compression' from source: unknown 44071 1727204732.34990: variable 'ansible_shell_type' from source: unknown 44071 1727204732.34992: variable 'ansible_shell_executable' from source: unknown 44071 1727204732.34997: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204732.35002: variable 'ansible_pipelining' from source: unknown 44071 1727204732.35004: variable 'ansible_timeout' from source: unknown 44071 1727204732.35010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204732.35239: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204732.35249: variable 'omit' from source: magic vars 44071 1727204732.35285: starting attempt loop 44071 1727204732.35291: running the handler 44071 1727204732.35874: variable 'ansible_facts' from source: unknown 44071 1727204732.37619: _low_level_execute_command(): starting 44071 1727204732.37625: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204732.39250: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204732.39480: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204732.39485: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204732.39571: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 44071 1727204732.39667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204732.41655: stdout chunk (state=3): >>>/root <<< 44071 1727204732.41660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204732.41703: stderr chunk (state=3): >>><<< 44071 1727204732.41707: stdout chunk (state=3): >>><<< 44071 1727204732.41869: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204732.41887: _low_level_execute_command(): starting 44071 1727204732.41894: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204732.4187164-52496-14104153701738 `" && echo ansible-tmp-1727204732.4187164-52496-14104153701738="` echo /root/.ansible/tmp/ansible-tmp-1727204732.4187164-52496-14104153701738 `" ) && sleep 0' 44071 1727204732.43894: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204732.44048: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204732.44152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204732.44477: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204732.44497: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204732.44599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 
1727204732.46672: stdout chunk (state=3): >>>ansible-tmp-1727204732.4187164-52496-14104153701738=/root/.ansible/tmp/ansible-tmp-1727204732.4187164-52496-14104153701738 <<< 44071 1727204732.46884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204732.46890: stderr chunk (state=3): >>><<< 44071 1727204732.46895: stdout chunk (state=3): >>><<< 44071 1727204732.46920: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204732.4187164-52496-14104153701738=/root/.ansible/tmp/ansible-tmp-1727204732.4187164-52496-14104153701738 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204732.46959: variable 'ansible_module_compression' from source: unknown 44071 1727204732.47320: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 44071 1727204732.47505: variable 'ansible_facts' from source: unknown 44071 1727204732.48047: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204732.4187164-52496-14104153701738/AnsiballZ_systemd.py 44071 1727204732.48631: Sending initial data 44071 1727204732.48878: Sent initial data (155 bytes) 44071 1727204732.50068: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204732.50171: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204732.50176: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204732.50186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204732.50484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204732.52070: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 44071 1727204732.52075: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204732.52260: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204732.52264: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204732.4187164-52496-14104153701738/AnsiballZ_systemd.py" <<< 44071 1727204732.52268: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp3y8g6qma /root/.ansible/tmp/ansible-tmp-1727204732.4187164-52496-14104153701738/AnsiballZ_systemd.py <<< 44071 1727204732.52379: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp3y8g6qma" to remote "/root/.ansible/tmp/ansible-tmp-1727204732.4187164-52496-14104153701738/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204732.4187164-52496-14104153701738/AnsiballZ_systemd.py" <<< 44071 1727204732.55267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204732.55454: stderr chunk (state=3): >>><<< 44071 1727204732.55460: stdout chunk (state=3): >>><<< 44071 1727204732.55463: done transferring module to remote 44071 1727204732.55465: _low_level_execute_command(): starting 44071 1727204732.55469: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204732.4187164-52496-14104153701738/ /root/.ansible/tmp/ansible-tmp-1727204732.4187164-52496-14104153701738/AnsiballZ_systemd.py && sleep 0' 44071 1727204732.56761: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204732.56925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204732.56929: stderr chunk (state=3): >>>debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204732.56946: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204732.56958: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204732.57044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204732.57151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204732.59204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204732.59209: stderr chunk (state=3): >>><<< 44071 1727204732.59212: stdout chunk (state=3): >>><<< 44071 1727204732.59215: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204732.59218: _low_level_execute_command(): starting 44071 1727204732.59220: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204732.4187164-52496-14104153701738/AnsiballZ_systemd.py && sleep 0' 44071 1727204732.60630: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204732.60807: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204732.60859: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204732.61105: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 44071 1727204732.92843: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4603904", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3513372672", "CPUUsageNSec": "1712633000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", 
"StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitC<<< 44071 1727204732.92888: stdout chunk (state=3): >>>ORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 44071 1727204732.94922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204732.94942: stdout chunk (state=3): >>><<< 44071 1727204732.94963: stderr chunk (state=3): >>><<< 44071 1727204732.95022: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4603904", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3513372672", "CPUUsageNSec": "1712633000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": 
"infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204732.95520: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204732.4187164-52496-14104153701738/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204732.95524: _low_level_execute_command(): starting 44071 1727204732.95527: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204732.4187164-52496-14104153701738/ > /dev/null 2>&1 && sleep 0' 44071 1727204732.96565: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204732.96691: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204732.96715: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204732.96826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204732.98904: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 44071 1727204732.98908: stdout chunk (state=3): >>><<< 44071 1727204732.98912: stderr chunk (state=3): >>><<< 44071 1727204732.99074: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204732.99080: handler run complete 44071 1727204732.99143: attempt loop complete, returning result 44071 1727204732.99153: _execute() done 44071 1727204732.99170: dumping result to json 44071 1727204732.99209: done dumping result, returning 44071 1727204732.99218: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-c964-7471-000000002334] 44071 1727204732.99318: sending task result for task 127b8e07-fff9-c964-7471-000000002334 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204733.00231: no more pending results, returning what we have 44071 1727204733.00235: results queue empty 44071 1727204733.00236: checking for any_errors_fatal 44071 1727204733.00242: done checking for any_errors_fatal 44071 1727204733.00243: checking for max_fail_percentage 44071 1727204733.00245: done checking for max_fail_percentage 44071 1727204733.00246: checking to see if all hosts have failed and the running result is not ok 44071 1727204733.00247: done checking to see if all hosts have failed 44071 1727204733.00248: getting the remaining hosts for this loop 44071 1727204733.00249: done getting the remaining hosts for this loop 44071 1727204733.00262: getting the next task for host managed-node2 44071 1727204733.00273: done getting next task for host managed-node2 44071 1727204733.00277: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204733.00284: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204733.00325: done sending task result for task 127b8e07-fff9-c964-7471-000000002334 44071 1727204733.00328: WORKER PROCESS EXITING 44071 1727204733.00339: getting variables 44071 1727204733.00341: in VariableManager get_vars() 44071 1727204733.00561: Calling all_inventory to load vars for managed-node2 44071 1727204733.00564: Calling groups_inventory to load vars for managed-node2 44071 1727204733.00570: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204733.00622: Calling all_plugins_play to load vars for managed-node2 44071 1727204733.00648: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204733.00658: Calling groups_plugins_play to load vars for managed-node2 44071 1727204733.04348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204733.07186: done with get_vars() 44071 1727204733.07259: done getting variables 44071 1727204733.07336: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:05:33 -0400 (0:00:00.917) 0:02:25.390 ***** 44071 1727204733.07386: entering _queue_task() for managed-node2/service 44071 1727204733.08200: worker is 1 (out of 1 available) 44071 1727204733.08213: exiting _queue_task() for managed-node2/service 44071 1727204733.08227: done queuing things up, now waiting for results queue to drain 44071 1727204733.08229: waiting for pending results... 
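The "ok" just reported for "Enable and start NetworkManager" (payload censored because no_log was set) corresponds to the ansible.legacy.systemd invocation whose module_args are visible in the raw stdout earlier in this trace. A minimal stand-alone sketch of an equivalent task, reconstructed only from those module_args and not copied from roles/network/tasks/main.yml:

```yaml
# Sketch reconstructed from the module_args printed in the log above;
# the role's actual task may be worded differently or set extra options.
- name: Enable and start NetworkManager (sketch)
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
    scope: system
    daemon_reload: false
    daemon_reexec: false
  no_log: true   # matches the "output has been hidden" result above
```

Because the unit was already active and enabled on the target, the module returned changed=false, which is why the play records ok rather than changed.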
44071 1727204733.08548: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204733.08825: in run() - task 127b8e07-fff9-c964-7471-000000002335 44071 1727204733.08853: variable 'ansible_search_path' from source: unknown 44071 1727204733.08863: variable 'ansible_search_path' from source: unknown 44071 1727204733.08923: calling self._execute() 44071 1727204733.09049: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204733.09120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204733.09245: variable 'omit' from source: magic vars 44071 1727204733.09821: variable 'ansible_distribution_major_version' from source: facts 44071 1727204733.09846: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204733.10048: variable 'network_provider' from source: set_fact 44071 1727204733.10099: Evaluated conditional (network_provider == "nm"): True 44071 1727204733.10270: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204733.10519: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204733.10772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204733.13794: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204733.13901: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204733.14009: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204733.14012: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204733.14024: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204733.14142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204733.14186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204733.14220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204733.14279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204733.14338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204733.14366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204733.14404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 44071 1727204733.14437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204733.14495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204733.14517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204733.14706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204733.14710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204733.14713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204733.14764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204733.14852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204733.15012: variable 'network_connections' from source: include params 44071 1727204733.15032: variable 'interface' from source: play vars 44071 1727204733.15157: variable 'interface' from source: play vars 44071 1727204733.15250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204733.15511: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204733.15591: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204733.15698: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204733.15702: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204733.15729: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204733.15760: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204733.15795: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204733.15889: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
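The task being evaluated here, "Enable and start wpa_supplicant", is gated by a chain of conditionals the trace resolves one at a time: the distribution check and network_provider == "nm" both evaluate True above, while __network_wpa_supplicant_required evaluates False just below, so the task ends up skipped. A hypothetical sketch of that gating pattern; only the task name, the service action plugin, and the conditions come from the log, and the body is an assumption:

```yaml
# Hypothetical; the real task sits at roles/network/tasks/main.yml:133 and
# some of these conditions may be inherited from an enclosing block.
- name: Enable and start wpa_supplicant (sketch)
  ansible.builtin.service:
    name: wpa_supplicant     # assumed unit name
    state: started
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool   # false in this run, so the task is skipped
```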
44071 1727204733.15996: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204733.16401: variable 'network_connections' from source: include params 44071 1727204733.16414: variable 'interface' from source: play vars 44071 1727204733.16498: variable 'interface' from source: play vars 44071 1727204733.16537: Evaluated conditional (__network_wpa_supplicant_required): False 44071 1727204733.16568: when evaluation is False, skipping this task 44071 1727204733.16573: _execute() done 44071 1727204733.16575: dumping result to json 44071 1727204733.16577: done dumping result, returning 44071 1727204733.16676: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-c964-7471-000000002335] 44071 1727204733.16689: sending task result for task 127b8e07-fff9-c964-7471-000000002335 44071 1727204733.16771: done sending task result for task 127b8e07-fff9-c964-7471-000000002335 44071 1727204733.16774: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 44071 1727204733.16831: no more pending results, returning what we have 44071 1727204733.16835: results queue empty 44071 1727204733.16836: checking for any_errors_fatal 44071 1727204733.16873: done checking for any_errors_fatal 44071 1727204733.16874: checking for max_fail_percentage 44071 1727204733.16876: done checking for max_fail_percentage 44071 1727204733.16877: checking to see if all hosts have failed and the running result is not ok 44071 1727204733.16878: done checking to see if all hosts have failed 44071 1727204733.16878: getting the remaining hosts for this loop 44071 1727204733.16880: done getting the remaining hosts for this loop 44071 1727204733.16886: getting the next task for host managed-node2 44071 1727204733.16895: done getting next task for host managed-node2 44071 1727204733.16899: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204733.16904: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204733.16932: getting variables 44071 1727204733.16935: in VariableManager get_vars() 44071 1727204733.17142: Calling all_inventory to load vars for managed-node2 44071 1727204733.17145: Calling groups_inventory to load vars for managed-node2 44071 1727204733.17148: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204733.17162: Calling all_plugins_play to load vars for managed-node2 44071 1727204733.17168: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204733.17173: Calling groups_plugins_play to load vars for managed-node2 44071 1727204733.20832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204733.24350: done with get_vars() 44071 1727204733.24414: done getting variables 44071 1727204733.24495: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:05:33 -0400 (0:00:00.172) 0:02:25.562 ***** 44071 1727204733.24592: entering _queue_task() for managed-node2/service 44071 1727204733.25190: worker is 1 (out of 1 available) 44071 1727204733.25204: exiting _queue_task() for managed-node2/service 44071 1727204733.25225: done queuing things up, now waiting for results queue to drain 44071 1727204733.25227: waiting for pending results... 44071 1727204733.25968: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204733.26881: in run() - task 127b8e07-fff9-c964-7471-000000002336 44071 1727204733.26888: variable 'ansible_search_path' from source: unknown 44071 1727204733.26891: variable 'ansible_search_path' from source: unknown 44071 1727204733.26894: calling self._execute() 44071 1727204733.27218: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204733.27233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204733.27250: variable 'omit' from source: magic vars 44071 1727204733.28282: variable 'ansible_distribution_major_version' from source: facts 44071 1727204733.28306: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204733.28459: variable 'network_provider' from source: set_fact 44071 1727204733.28478: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204733.28493: when evaluation is False, skipping this task 44071 1727204733.28501: _execute() done 44071 1727204733.28509: dumping result to json 44071 1727204733.28517: done dumping result, returning 44071 1727204733.28529: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-c964-7471-000000002336] 44071 1727204733.28538: sending task result for task 127b8e07-fff9-c964-7471-000000002336 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204733.28933: no more pending results, returning what we have 44071 1727204733.28938: results queue empty 44071 1727204733.28939: checking for 
any_errors_fatal 44071 1727204733.28950: done checking for any_errors_fatal 44071 1727204733.28950: checking for max_fail_percentage 44071 1727204733.28952: done checking for max_fail_percentage 44071 1727204733.28953: checking to see if all hosts have failed and the running result is not ok 44071 1727204733.28954: done checking to see if all hosts have failed 44071 1727204733.28955: getting the remaining hosts for this loop 44071 1727204733.28957: done getting the remaining hosts for this loop 44071 1727204733.28962: getting the next task for host managed-node2 44071 1727204733.28974: done getting next task for host managed-node2 44071 1727204733.28978: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204733.28984: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204733.29018: getting variables 44071 1727204733.29020: in VariableManager get_vars() 44071 1727204733.29581: Calling all_inventory to load vars for managed-node2 44071 1727204733.29586: Calling groups_inventory to load vars for managed-node2 44071 1727204733.29589: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204733.29601: Calling all_plugins_play to load vars for managed-node2 44071 1727204733.29604: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204733.29607: Calling groups_plugins_play to load vars for managed-node2 44071 1727204733.30317: done sending task result for task 127b8e07-fff9-c964-7471-000000002336 44071 1727204733.30323: WORKER PROCESS EXITING 44071 1727204733.34202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204733.40637: done with get_vars() 44071 1727204733.40735: done getting variables 44071 1727204733.40922: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:05:33 -0400 (0:00:00.164) 0:02:25.726 ***** 44071 1727204733.41023: entering _queue_task() for managed-node2/copy 44071 1727204733.41494: worker is 1 (out of 1 available) 44071 1727204733.41510: exiting _queue_task() for managed-node2/copy 44071 1727204733.41525: done queuing things up, now waiting for results queue to drain 44071 1727204733.41527: waiting for pending results... 
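The "Enable network service" task above and the "Ensure initscripts network file dependency is present" task queued next are skipped for the same reason: their false_condition is network_provider == "initscripts", and this run set network_provider to nm. A hypothetical sketch of that pattern; the task names, the action plugins (service and copy), and the when: condition come from the log, while the service name and the copy payload are placeholders:

```yaml
# Placeholder bodies; only names, action plugins and the when: condition are from the log.
- name: Enable network service (sketch)
  ansible.builtin.service:
    name: network                  # assumed service name, not confirmed by the log
    state: started
    enabled: true
  when: network_provider == "initscripts"

- name: Ensure initscripts network file dependency is present (sketch)
  ansible.builtin.copy:
    dest: /etc/sysconfig/network   # assumed path, not confirmed by the log
    content: ""
    force: false
  when: network_provider == "initscripts"
```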
44071 1727204733.41913: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204733.42178: in run() - task 127b8e07-fff9-c964-7471-000000002337 44071 1727204733.42194: variable 'ansible_search_path' from source: unknown 44071 1727204733.42198: variable 'ansible_search_path' from source: unknown 44071 1727204733.42331: calling self._execute() 44071 1727204733.42409: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204733.42419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204733.42431: variable 'omit' from source: magic vars 44071 1727204733.43525: variable 'ansible_distribution_major_version' from source: facts 44071 1727204733.43542: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204733.43814: variable 'network_provider' from source: set_fact 44071 1727204733.43819: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204733.43827: when evaluation is False, skipping this task 44071 1727204733.43958: _execute() done 44071 1727204733.43963: dumping result to json 44071 1727204733.43967: done dumping result, returning 44071 1727204733.44071: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-c964-7471-000000002337] 44071 1727204733.44075: sending task result for task 127b8e07-fff9-c964-7471-000000002337 44071 1727204733.44173: done sending task result for task 127b8e07-fff9-c964-7471-000000002337 44071 1727204733.44177: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 44071 1727204733.44239: no more pending results, returning what we have 44071 1727204733.44245: results queue empty 44071 1727204733.44246: checking for any_errors_fatal 44071 1727204733.44303: done checking for any_errors_fatal 44071 1727204733.44304: checking for max_fail_percentage 44071 1727204733.44306: done checking for max_fail_percentage 44071 1727204733.44308: checking to see if all hosts have failed and the running result is not ok 44071 1727204733.44309: done checking to see if all hosts have failed 44071 1727204733.44310: getting the remaining hosts for this loop 44071 1727204733.44312: done getting the remaining hosts for this loop 44071 1727204733.44319: getting the next task for host managed-node2 44071 1727204733.44330: done getting next task for host managed-node2 44071 1727204733.44337: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204733.44345: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204733.44382: getting variables 44071 1727204733.44384: in VariableManager get_vars() 44071 1727204733.44578: Calling all_inventory to load vars for managed-node2 44071 1727204733.44582: Calling groups_inventory to load vars for managed-node2 44071 1727204733.44585: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204733.44599: Calling all_plugins_play to load vars for managed-node2 44071 1727204733.44603: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204733.44607: Calling groups_plugins_play to load vars for managed-node2 44071 1727204733.47967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204733.51996: done with get_vars() 44071 1727204733.52182: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:05:33 -0400 (0:00:00.113) 0:02:25.839 ***** 44071 1727204733.52349: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204733.53299: worker is 1 (out of 1 available) 44071 1727204733.53316: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204733.53449: done queuing things up, now waiting for results queue to drain 44071 1727204733.53452: waiting for pending results... 
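The "Configure networking connection profiles" task evaluated below is where the role applies the requested profiles: it pulls network_connections (here a single profile built around the interface play variable), uses the nm provider resolved earlier, and renders the ansible_managed header through the get_ansible_managed.j2 template lookup before handing everything to the collection's network_connections action. A hypothetical play-level sketch of input that would drive this step; only the variable names network_provider, network_connections and interface are confirmed by the trace, and the profile body is an assumption:

```yaml
# Hypothetical inputs; the interface name and the profile body are placeholders.
- hosts: managed-node2
  vars:
    interface: testnic0            # placeholder device name
    network_provider: nm
    network_connections:
      - name: "{{ interface }}"
        state: up                  # assumed; the real test profile may differ
  roles:
    - fedora.linux_system_roles.network
```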
44071 1727204733.54318: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204733.54473: in run() - task 127b8e07-fff9-c964-7471-000000002338 44071 1727204733.54545: variable 'ansible_search_path' from source: unknown 44071 1727204733.54554: variable 'ansible_search_path' from source: unknown 44071 1727204733.54606: calling self._execute() 44071 1727204733.55063: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204733.55070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204733.55073: variable 'omit' from source: magic vars 44071 1727204733.55822: variable 'ansible_distribution_major_version' from source: facts 44071 1727204733.55963: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204733.55982: variable 'omit' from source: magic vars 44071 1727204733.56375: variable 'omit' from source: magic vars 44071 1727204733.56629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204733.62246: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204733.62485: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204733.62644: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204733.62882: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204733.62886: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204733.63146: variable 'network_provider' from source: set_fact 44071 1727204733.63678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204733.63683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204733.63904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204733.63908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204733.63911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204733.64179: variable 'omit' from source: magic vars 44071 1727204733.64662: variable 'omit' from source: magic vars 44071 1727204733.64760: variable 'network_connections' from source: include params 44071 1727204733.64889: variable 'interface' from source: play vars 44071 1727204733.65072: variable 'interface' from source: play vars 44071 1727204733.65339: variable 'omit' from source: magic vars 44071 1727204733.65471: variable '__lsr_ansible_managed' from source: task vars 44071 1727204733.65656: variable '__lsr_ansible_managed' from source: 
task vars 44071 1727204733.66443: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 44071 1727204733.66832: Loaded config def from plugin (lookup/template) 44071 1727204733.66852: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 44071 1727204733.66978: File lookup term: get_ansible_managed.j2 44071 1727204733.67060: variable 'ansible_search_path' from source: unknown 44071 1727204733.67067: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 44071 1727204733.67072: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 44071 1727204733.67075: variable 'ansible_search_path' from source: unknown 44071 1727204733.93407: variable 'ansible_managed' from source: unknown 44071 1727204733.93674: variable 'omit' from source: magic vars 44071 1727204733.93709: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204733.93751: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204733.93793: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204733.93840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204733.93854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204733.93963: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204733.93969: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204733.93972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204733.94156: Set connection var ansible_connection to ssh 44071 1727204733.94162: Set connection var ansible_timeout to 10 44071 1727204733.94173: Set connection var ansible_pipelining to False 44071 1727204733.94179: Set connection var ansible_shell_type to sh 44071 1727204733.94185: Set connection var ansible_shell_executable to /bin/sh 44071 1727204733.94204: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204733.94245: variable 'ansible_shell_executable' from source: unknown 44071 1727204733.94248: variable 'ansible_connection' from source: unknown 44071 1727204733.94250: variable 'ansible_module_compression' 
from source: unknown 44071 1727204733.94253: variable 'ansible_shell_type' from source: unknown 44071 1727204733.94255: variable 'ansible_shell_executable' from source: unknown 44071 1727204733.94372: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204733.94376: variable 'ansible_pipelining' from source: unknown 44071 1727204733.94381: variable 'ansible_timeout' from source: unknown 44071 1727204733.94384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204733.94580: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204733.94779: variable 'omit' from source: magic vars 44071 1727204733.94783: starting attempt loop 44071 1727204733.94785: running the handler 44071 1727204733.94787: _low_level_execute_command(): starting 44071 1727204733.94790: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204733.95552: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204733.95558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204733.95561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204733.95563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204733.95622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204733.95630: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204733.95631: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204733.95704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204733.97469: stdout chunk (state=3): >>>/root <<< 44071 1727204733.97577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204733.97645: stderr chunk (state=3): >>><<< 44071 1727204733.97649: stdout chunk (state=3): >>><<< 44071 1727204733.97669: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204733.97680: _low_level_execute_command(): starting 44071 1727204733.97687: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204733.9767044-52624-15359219767932 `" && echo ansible-tmp-1727204733.9767044-52624-15359219767932="` echo /root/.ansible/tmp/ansible-tmp-1727204733.9767044-52624-15359219767932 `" ) && sleep 0' 44071 1727204733.98183: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204733.98307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204733.98314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204733.98401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204734.00395: stdout chunk (state=3): >>>ansible-tmp-1727204733.9767044-52624-15359219767932=/root/.ansible/tmp/ansible-tmp-1727204733.9767044-52624-15359219767932 <<< 44071 1727204734.00581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204734.00585: stderr chunk (state=3): >>><<< 44071 1727204734.00587: stdout chunk (state=3): >>><<< 44071 1727204734.00623: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204733.9767044-52624-15359219767932=/root/.ansible/tmp/ansible-tmp-1727204733.9767044-52624-15359219767932 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204734.00677: variable 'ansible_module_compression' from source: unknown 44071 1727204734.00733: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 44071 1727204734.00789: variable 'ansible_facts' from source: unknown 44071 1727204734.00910: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204733.9767044-52624-15359219767932/AnsiballZ_network_connections.py 44071 1727204734.01083: Sending initial data 44071 1727204734.01089: Sent initial data (167 bytes) 44071 1727204734.01769: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204734.01774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204734.01783: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204734.01786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204734.01840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204734.01843: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204734.01939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204734.03556: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204734.03625: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204734.03705: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpjgtfrrw9 /root/.ansible/tmp/ansible-tmp-1727204733.9767044-52624-15359219767932/AnsiballZ_network_connections.py <<< 44071 1727204734.03709: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204733.9767044-52624-15359219767932/AnsiballZ_network_connections.py" <<< 44071 1727204734.03759: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpjgtfrrw9" to remote "/root/.ansible/tmp/ansible-tmp-1727204733.9767044-52624-15359219767932/AnsiballZ_network_connections.py" <<< 44071 1727204734.03762: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204733.9767044-52624-15359219767932/AnsiballZ_network_connections.py" <<< 44071 1727204734.04786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204734.04913: stderr chunk (state=3): >>><<< 44071 1727204734.04918: stdout chunk (state=3): >>><<< 44071 1727204734.04948: done transferring module to remote 44071 1727204734.04970: _low_level_execute_command(): starting 44071 1727204734.05042: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204733.9767044-52624-15359219767932/ /root/.ansible/tmp/ansible-tmp-1727204733.9767044-52624-15359219767932/AnsiballZ_network_connections.py && sleep 0' 44071 1727204734.05650: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204734.05654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204734.05657: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204734.05664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204734.05715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204734.05730: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204734.05814: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204734.07639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204734.07704: stderr chunk (state=3): >>><<< 44071 1727204734.07707: stdout chunk (state=3): >>><<< 44071 1727204734.07722: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204734.07725: _low_level_execute_command(): starting 44071 1727204734.07730: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204733.9767044-52624-15359219767932/AnsiballZ_network_connections.py && sleep 0' 44071 1727204734.08240: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204734.08244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204734.08247: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204734.08251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204734.08315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204734.08319: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204734.08322: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204734.08391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204734.35957: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, bc2e78b9-9d7f-4720-aaef-6b1a6ee99c01 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible 
managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 44071 1727204734.37874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204734.37879: stdout chunk (state=3): >>><<< 44071 1727204734.37881: stderr chunk (state=3): >>><<< 44071 1727204734.37884: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, bc2e78b9-9d7f-4720-aaef-6b1a6ee99c01 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
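The module_args recorded in the result above correspond to a task invocation of roughly this shape. This is a sketch assembled from the logged arguments, not the role's task file verbatim; the task name comes from the TASK banner and the register name from the __network_connections_result variable that later tasks print (the role may set that variable via set_fact rather than register).

  # Reconstructed from the logged module_args; name and register are taken from the
  # TASK banner and the __network_connections_result variable, the rest verbatim.
  - name: Configure networking connection profiles
    fedora.linux_system_roles.network_connections:
      provider: nm
      connections:
        - name: statebr
          state: up
      __header: "#\n# Ansible managed\n#\n# system_role:network\n"
      ignore_errors: false
      force_state_change: false
      __debug_flags: ""
    register: __network_connections_result

The run comes back changed=false because, as the stderr line notes, the statebr connection is already active.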
44071 1727204734.37887: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204733.9767044-52624-15359219767932/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204734.37889: _low_level_execute_command(): starting 44071 1727204734.37891: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204733.9767044-52624-15359219767932/ > /dev/null 2>&1 && sleep 0' 44071 1727204734.38614: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204734.38622: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204734.38636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204734.38649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204734.38683: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204734.38698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204734.38780: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204734.38803: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204734.38900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204734.41205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204734.41209: stdout chunk (state=3): >>><<< 44071 1727204734.41211: stderr chunk (state=3): >>><<< 44071 1727204734.41677: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204734.41683: handler run complete 44071 1727204734.41685: attempt loop complete, returning result 44071 1727204734.41688: _execute() done 44071 1727204734.41690: dumping result to json 44071 1727204734.41692: done dumping result, returning 44071 1727204734.41695: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-c964-7471-000000002338] 44071 1727204734.41697: sending task result for task 127b8e07-fff9-c964-7471-000000002338 ok: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, bc2e78b9-9d7f-4720-aaef-6b1a6ee99c01 skipped because already active 44071 1727204734.41908: no more pending results, returning what we have 44071 1727204734.41911: results queue empty 44071 1727204734.41912: checking for any_errors_fatal 44071 1727204734.41919: done checking for any_errors_fatal 44071 1727204734.41919: checking for max_fail_percentage 44071 1727204734.41921: done checking for max_fail_percentage 44071 1727204734.41923: checking to see if all hosts have failed and the running result is not ok 44071 1727204734.41923: done checking to see if all hosts have failed 44071 1727204734.41924: getting the remaining hosts for this loop 44071 1727204734.41926: done getting the remaining hosts for this loop 44071 1727204734.41930: getting the next task for host managed-node2 44071 1727204734.41941: done getting next task for host managed-node2 44071 1727204734.41945: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204734.41950: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204734.41964: getting variables 44071 1727204734.42140: in VariableManager get_vars() 44071 1727204734.42191: Calling all_inventory to load vars for managed-node2 44071 1727204734.42195: Calling groups_inventory to load vars for managed-node2 44071 1727204734.42197: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204734.42386: Calling all_plugins_play to load vars for managed-node2 44071 1727204734.42390: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204734.42394: Calling groups_plugins_play to load vars for managed-node2 44071 1727204734.43475: done sending task result for task 127b8e07-fff9-c964-7471-000000002338 44071 1727204734.43479: WORKER PROCESS EXITING 44071 1727204734.47361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204734.66867: done with get_vars() 44071 1727204734.66916: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:05:34 -0400 (0:00:01.146) 0:02:26.986 ***** 44071 1727204734.67044: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204734.67562: worker is 1 (out of 1 available) 44071 1727204734.67583: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204734.67712: done queuing things up, now waiting for results queue to drain 44071 1727204734.67715: waiting for pending results... 
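The next task, Configure networking state (main.yml:171), is skipped because network_state comes from the role defaults as an empty dict, so the conditional network_state != {} evaluates False. A minimal sketch of that skip pattern follows; a debug placeholder stands in for the real fedora.linux_system_roles.network_state call, whose parameters this log does not show.

  # Illustrative only: the debug placeholder stands in for the real
  # fedora.linux_system_roles.network_state call, whose arguments are not in this log.
  - name: Configure networking state (placeholder)
    ansible.builtin.debug:
      msg: "would apply network_state: {{ network_state }}"
    when: network_state != {}

With the default network_state of {}, the task is reported as skipping with false_condition "network_state != {}", exactly as in the entries above and below.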
44071 1727204734.68007: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204734.68250: in run() - task 127b8e07-fff9-c964-7471-000000002339 44071 1727204734.68277: variable 'ansible_search_path' from source: unknown 44071 1727204734.68286: variable 'ansible_search_path' from source: unknown 44071 1727204734.68342: calling self._execute() 44071 1727204734.68483: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204734.68500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204734.68562: variable 'omit' from source: magic vars 44071 1727204734.69167: variable 'ansible_distribution_major_version' from source: facts 44071 1727204734.69184: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204734.69331: variable 'network_state' from source: role '' defaults 44071 1727204734.69457: Evaluated conditional (network_state != {}): False 44071 1727204734.69461: when evaluation is False, skipping this task 44071 1727204734.69469: _execute() done 44071 1727204734.69472: dumping result to json 44071 1727204734.69476: done dumping result, returning 44071 1727204734.69478: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-c964-7471-000000002339] 44071 1727204734.69482: sending task result for task 127b8e07-fff9-c964-7471-000000002339 44071 1727204734.69578: done sending task result for task 127b8e07-fff9-c964-7471-000000002339 44071 1727204734.69582: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204734.69656: no more pending results, returning what we have 44071 1727204734.69660: results queue empty 44071 1727204734.69661: checking for any_errors_fatal 44071 1727204734.69682: done checking for any_errors_fatal 44071 1727204734.69683: checking for max_fail_percentage 44071 1727204734.69685: done checking for max_fail_percentage 44071 1727204734.69686: checking to see if all hosts have failed and the running result is not ok 44071 1727204734.69687: done checking to see if all hosts have failed 44071 1727204734.69687: getting the remaining hosts for this loop 44071 1727204734.69690: done getting the remaining hosts for this loop 44071 1727204734.69695: getting the next task for host managed-node2 44071 1727204734.69705: done getting next task for host managed-node2 44071 1727204734.69710: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204734.69717: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204734.69752: getting variables 44071 1727204734.69754: in VariableManager get_vars() 44071 1727204734.69920: Calling all_inventory to load vars for managed-node2 44071 1727204734.69923: Calling groups_inventory to load vars for managed-node2 44071 1727204734.69926: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204734.69943: Calling all_plugins_play to load vars for managed-node2 44071 1727204734.69947: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204734.69951: Calling groups_plugins_play to load vars for managed-node2 44071 1727204734.72394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204734.76242: done with get_vars() 44071 1727204734.76403: done getting variables 44071 1727204734.76589: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:05:34 -0400 (0:00:00.095) 0:02:27.082 ***** 44071 1727204734.76631: entering _queue_task() for managed-node2/debug 44071 1727204734.77227: worker is 1 (out of 1 available) 44071 1727204734.77248: exiting _queue_task() for managed-node2/debug 44071 1727204734.77267: done queuing things up, now waiting for results queue to drain 44071 1727204734.77269: waiting for pending results... 
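The task queued above (main.yml:177) prints only the module's stderr lines. Judging from the output further down, which is keyed __network_connections_result.stderr_lines, it is a debug task of roughly this form (a sketch, not the role's file verbatim):

  # Sketch inferred from the printed key __network_connections_result.stderr_lines.
  - name: Show stderr messages for the network_connections
    ansible.builtin.debug:
      var: __network_connections_result.stderr_lines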
44071 1727204734.77895: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204734.77901: in run() - task 127b8e07-fff9-c964-7471-00000000233a 44071 1727204734.77906: variable 'ansible_search_path' from source: unknown 44071 1727204734.77909: variable 'ansible_search_path' from source: unknown 44071 1727204734.77912: calling self._execute() 44071 1727204734.77955: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204734.77963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204734.77975: variable 'omit' from source: magic vars 44071 1727204734.78426: variable 'ansible_distribution_major_version' from source: facts 44071 1727204734.78437: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204734.78441: variable 'omit' from source: magic vars 44071 1727204734.78535: variable 'omit' from source: magic vars 44071 1727204734.78576: variable 'omit' from source: magic vars 44071 1727204734.78621: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204734.78669: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204734.78751: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204734.78756: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204734.78760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204734.78763: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204734.78767: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204734.78772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204734.78891: Set connection var ansible_connection to ssh 44071 1727204734.78901: Set connection var ansible_timeout to 10 44071 1727204734.78904: Set connection var ansible_pipelining to False 44071 1727204734.78910: Set connection var ansible_shell_type to sh 44071 1727204734.78916: Set connection var ansible_shell_executable to /bin/sh 44071 1727204734.78969: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204734.78974: variable 'ansible_shell_executable' from source: unknown 44071 1727204734.78977: variable 'ansible_connection' from source: unknown 44071 1727204734.78980: variable 'ansible_module_compression' from source: unknown 44071 1727204734.78983: variable 'ansible_shell_type' from source: unknown 44071 1727204734.78985: variable 'ansible_shell_executable' from source: unknown 44071 1727204734.78987: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204734.78990: variable 'ansible_pipelining' from source: unknown 44071 1727204734.78993: variable 'ansible_timeout' from source: unknown 44071 1727204734.78995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204734.79140: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 
1727204734.79187: variable 'omit' from source: magic vars 44071 1727204734.79191: starting attempt loop 44071 1727204734.79194: running the handler 44071 1727204734.79336: variable '__network_connections_result' from source: set_fact 44071 1727204734.79405: handler run complete 44071 1727204734.79414: attempt loop complete, returning result 44071 1727204734.79418: _execute() done 44071 1727204734.79421: dumping result to json 44071 1727204734.79425: done dumping result, returning 44071 1727204734.79438: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-c964-7471-00000000233a] 44071 1727204734.79441: sending task result for task 127b8e07-fff9-c964-7471-00000000233a ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, bc2e78b9-9d7f-4720-aaef-6b1a6ee99c01 skipped because already active" ] } 44071 1727204734.79761: no more pending results, returning what we have 44071 1727204734.79764: results queue empty 44071 1727204734.79767: checking for any_errors_fatal 44071 1727204734.79772: done checking for any_errors_fatal 44071 1727204734.79773: checking for max_fail_percentage 44071 1727204734.79774: done checking for max_fail_percentage 44071 1727204734.79775: checking to see if all hosts have failed and the running result is not ok 44071 1727204734.79776: done checking to see if all hosts have failed 44071 1727204734.79776: getting the remaining hosts for this loop 44071 1727204734.79778: done getting the remaining hosts for this loop 44071 1727204734.79782: getting the next task for host managed-node2 44071 1727204734.79789: done getting next task for host managed-node2 44071 1727204734.79793: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204734.79799: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204734.79812: getting variables 44071 1727204734.79813: in VariableManager get_vars() 44071 1727204734.79986: Calling all_inventory to load vars for managed-node2 44071 1727204734.79989: Calling groups_inventory to load vars for managed-node2 44071 1727204734.79991: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204734.79999: done sending task result for task 127b8e07-fff9-c964-7471-00000000233a 44071 1727204734.80001: WORKER PROCESS EXITING 44071 1727204734.80012: Calling all_plugins_play to load vars for managed-node2 44071 1727204734.80014: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204734.80018: Calling groups_plugins_play to load vars for managed-node2 44071 1727204734.82791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204734.86191: done with get_vars() 44071 1727204734.86353: done getting variables 44071 1727204734.86424: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:05:34 -0400 (0:00:00.099) 0:02:27.182 ***** 44071 1727204734.86600: entering _queue_task() for managed-node2/debug 44071 1727204734.87484: worker is 1 (out of 1 available) 44071 1727204734.87500: exiting _queue_task() for managed-node2/debug 44071 1727204734.87516: done queuing things up, now waiting for results queue to drain 44071 1727204734.87518: waiting for pending results... 
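The companion task at main.yml:181 prints the whole result rather than just the stderr lines; the output below includes _invocation.module_args, changed, failed, stderr, and stderr_lines. A matching sketch, again inferred from the printed key and not taken from the role source:

  # Sketch inferred from the printed key __network_connections_result.
  - name: Show debug messages for the network_connections
    ansible.builtin.debug:
      var: __network_connections_result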
44071 1727204734.87944: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204734.88128: in run() - task 127b8e07-fff9-c964-7471-00000000233b 44071 1727204734.88149: variable 'ansible_search_path' from source: unknown 44071 1727204734.88154: variable 'ansible_search_path' from source: unknown 44071 1727204734.88199: calling self._execute() 44071 1727204734.88335: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204734.88351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204734.88430: variable 'omit' from source: magic vars 44071 1727204734.88812: variable 'ansible_distribution_major_version' from source: facts 44071 1727204734.88827: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204734.88835: variable 'omit' from source: magic vars 44071 1727204734.88915: variable 'omit' from source: magic vars 44071 1727204734.88956: variable 'omit' from source: magic vars 44071 1727204734.89009: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204734.89049: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204734.89105: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204734.89110: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204734.89114: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204734.89148: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204734.89152: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204734.89156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204734.89304: Set connection var ansible_connection to ssh 44071 1727204734.89308: Set connection var ansible_timeout to 10 44071 1727204734.89310: Set connection var ansible_pipelining to False 44071 1727204734.89313: Set connection var ansible_shell_type to sh 44071 1727204734.89315: Set connection var ansible_shell_executable to /bin/sh 44071 1727204734.89407: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204734.89413: variable 'ansible_shell_executable' from source: unknown 44071 1727204734.89417: variable 'ansible_connection' from source: unknown 44071 1727204734.89420: variable 'ansible_module_compression' from source: unknown 44071 1727204734.89423: variable 'ansible_shell_type' from source: unknown 44071 1727204734.89430: variable 'ansible_shell_executable' from source: unknown 44071 1727204734.89434: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204734.89436: variable 'ansible_pipelining' from source: unknown 44071 1727204734.89439: variable 'ansible_timeout' from source: unknown 44071 1727204734.89443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204734.89530: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 
1727204734.89560: variable 'omit' from source: magic vars 44071 1727204734.89563: starting attempt loop 44071 1727204734.89568: running the handler 44071 1727204734.89628: variable '__network_connections_result' from source: set_fact 44071 1727204734.89720: variable '__network_connections_result' from source: set_fact 44071 1727204734.89867: handler run complete 44071 1727204734.89881: attempt loop complete, returning result 44071 1727204734.89954: _execute() done 44071 1727204734.89957: dumping result to json 44071 1727204734.89960: done dumping result, returning 44071 1727204734.89963: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-c964-7471-00000000233b] 44071 1727204734.89970: sending task result for task 127b8e07-fff9-c964-7471-00000000233b 44071 1727204734.90053: done sending task result for task 127b8e07-fff9-c964-7471-00000000233b 44071 1727204734.90271: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, bc2e78b9-9d7f-4720-aaef-6b1a6ee99c01 skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, bc2e78b9-9d7f-4720-aaef-6b1a6ee99c01 skipped because already active" ] } } 44071 1727204734.90375: no more pending results, returning what we have 44071 1727204734.90379: results queue empty 44071 1727204734.90380: checking for any_errors_fatal 44071 1727204734.90386: done checking for any_errors_fatal 44071 1727204734.90387: checking for max_fail_percentage 44071 1727204734.90389: done checking for max_fail_percentage 44071 1727204734.90390: checking to see if all hosts have failed and the running result is not ok 44071 1727204734.90391: done checking to see if all hosts have failed 44071 1727204734.90392: getting the remaining hosts for this loop 44071 1727204734.90393: done getting the remaining hosts for this loop 44071 1727204734.90397: getting the next task for host managed-node2 44071 1727204734.90406: done getting next task for host managed-node2 44071 1727204734.90410: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204734.90416: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204734.90430: getting variables 44071 1727204734.90431: in VariableManager get_vars() 44071 1727204734.90642: Calling all_inventory to load vars for managed-node2 44071 1727204734.90646: Calling groups_inventory to load vars for managed-node2 44071 1727204734.90655: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204734.90672: Calling all_plugins_play to load vars for managed-node2 44071 1727204734.90676: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204734.90680: Calling groups_plugins_play to load vars for managed-node2 44071 1727204734.94397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204734.96340: done with get_vars() 44071 1727204734.96372: done getting variables 44071 1727204734.96427: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:05:34 -0400 (0:00:00.098) 0:02:27.281 ***** 44071 1727204734.96457: entering _queue_task() for managed-node2/debug 44071 1727204734.96831: worker is 1 (out of 1 available) 44071 1727204734.96848: exiting _queue_task() for managed-node2/debug 44071 1727204734.96895: done queuing things up, now waiting for results queue to drain 44071 1727204734.96898: waiting for pending results... 
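After the network_state debug task below is skipped (same network_state != {} conditional), the role re-tests connectivity at main.yml:192; the "_queue_task() for managed-node2/ping" entries show it uses the ping action. A minimal sketch of such a task; only the task name and the use of ping are taken from the log, the fully qualified module name is an assumption.

  # Only the task name and the use of the ping action come from the log.
  - name: Re-test connectivity
    ansible.builtin.ping: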
44071 1727204734.97159: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204734.97336: in run() - task 127b8e07-fff9-c964-7471-00000000233c 44071 1727204734.97349: variable 'ansible_search_path' from source: unknown 44071 1727204734.97355: variable 'ansible_search_path' from source: unknown 44071 1727204734.97405: calling self._execute() 44071 1727204734.97522: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204734.97528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204734.97553: variable 'omit' from source: magic vars 44071 1727204734.97958: variable 'ansible_distribution_major_version' from source: facts 44071 1727204734.97971: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204734.98141: variable 'network_state' from source: role '' defaults 44071 1727204734.98144: Evaluated conditional (network_state != {}): False 44071 1727204734.98147: when evaluation is False, skipping this task 44071 1727204734.98150: _execute() done 44071 1727204734.98153: dumping result to json 44071 1727204734.98156: done dumping result, returning 44071 1727204734.98159: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-c964-7471-00000000233c] 44071 1727204734.98161: sending task result for task 127b8e07-fff9-c964-7471-00000000233c 44071 1727204734.98451: done sending task result for task 127b8e07-fff9-c964-7471-00000000233c 44071 1727204734.98455: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 44071 1727204734.98538: no more pending results, returning what we have 44071 1727204734.98543: results queue empty 44071 1727204734.98544: checking for any_errors_fatal 44071 1727204734.98557: done checking for any_errors_fatal 44071 1727204734.98557: checking for max_fail_percentage 44071 1727204734.98561: done checking for max_fail_percentage 44071 1727204734.98562: checking to see if all hosts have failed and the running result is not ok 44071 1727204734.98563: done checking to see if all hosts have failed 44071 1727204734.98564: getting the remaining hosts for this loop 44071 1727204734.98569: done getting the remaining hosts for this loop 44071 1727204734.98575: getting the next task for host managed-node2 44071 1727204734.98585: done getting next task for host managed-node2 44071 1727204734.98591: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204734.98599: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204734.98638: getting variables 44071 1727204734.98641: in VariableManager get_vars() 44071 1727204734.98907: Calling all_inventory to load vars for managed-node2 44071 1727204734.98910: Calling groups_inventory to load vars for managed-node2 44071 1727204734.98912: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204734.98924: Calling all_plugins_play to load vars for managed-node2 44071 1727204734.98927: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204734.98930: Calling groups_plugins_play to load vars for managed-node2 44071 1727204735.00657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204735.01937: done with get_vars() 44071 1727204735.01972: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:05:35 -0400 (0:00:00.056) 0:02:27.337 ***** 44071 1727204735.02109: entering _queue_task() for managed-node2/ping 44071 1727204735.02563: worker is 1 (out of 1 available) 44071 1727204735.02583: exiting _queue_task() for managed-node2/ping 44071 1727204735.02599: done queuing things up, now waiting for results queue to drain 44071 1727204735.02601: waiting for pending results... 44071 1727204735.02969: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204735.03191: in run() - task 127b8e07-fff9-c964-7471-00000000233d 44071 1727204735.03196: variable 'ansible_search_path' from source: unknown 44071 1727204735.03199: variable 'ansible_search_path' from source: unknown 44071 1727204735.03236: calling self._execute() 44071 1727204735.03324: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204735.03332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204735.03342: variable 'omit' from source: magic vars 44071 1727204735.03694: variable 'ansible_distribution_major_version' from source: facts 44071 1727204735.03706: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204735.03712: variable 'omit' from source: magic vars 44071 1727204735.03773: variable 'omit' from source: magic vars 44071 1727204735.03801: variable 'omit' from source: magic vars 44071 1727204735.03840: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204735.03877: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204735.03895: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204735.03910: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204735.03922: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204735.03949: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204735.03952: variable 'ansible_host' from source: host vars for 
'managed-node2' 44071 1727204735.03955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204735.04041: Set connection var ansible_connection to ssh 44071 1727204735.04047: Set connection var ansible_timeout to 10 44071 1727204735.04053: Set connection var ansible_pipelining to False 44071 1727204735.04058: Set connection var ansible_shell_type to sh 44071 1727204735.04064: Set connection var ansible_shell_executable to /bin/sh 44071 1727204735.04073: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204735.04096: variable 'ansible_shell_executable' from source: unknown 44071 1727204735.04100: variable 'ansible_connection' from source: unknown 44071 1727204735.04105: variable 'ansible_module_compression' from source: unknown 44071 1727204735.04107: variable 'ansible_shell_type' from source: unknown 44071 1727204735.04110: variable 'ansible_shell_executable' from source: unknown 44071 1727204735.04112: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204735.04114: variable 'ansible_pipelining' from source: unknown 44071 1727204735.04117: variable 'ansible_timeout' from source: unknown 44071 1727204735.04122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204735.04296: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204735.04310: variable 'omit' from source: magic vars 44071 1727204735.04313: starting attempt loop 44071 1727204735.04315: running the handler 44071 1727204735.04327: _low_level_execute_command(): starting 44071 1727204735.04336: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204735.04922: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204735.04928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204735.04935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204735.04984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204735.05002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204735.05080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204735.06833: stdout chunk (state=3): >>>/root <<< 44071 1727204735.06939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204735.07021: stderr chunk (state=3): 
>>><<< 44071 1727204735.07028: stdout chunk (state=3): >>><<< 44071 1727204735.07085: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204735.07090: _low_level_execute_command(): starting 44071 1727204735.07093: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204735.0706851-52680-232823542859787 `" && echo ansible-tmp-1727204735.0706851-52680-232823542859787="` echo /root/.ansible/tmp/ansible-tmp-1727204735.0706851-52680-232823542859787 `" ) && sleep 0' 44071 1727204735.07970: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204735.07994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204735.08132: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204735.10104: stdout chunk (state=3): >>>ansible-tmp-1727204735.0706851-52680-232823542859787=/root/.ansible/tmp/ansible-tmp-1727204735.0706851-52680-232823542859787 <<< 44071 1727204735.10214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204735.10310: stderr chunk (state=3): >>><<< 44071 1727204735.10313: stdout chunk (state=3): >>><<< 44071 1727204735.10359: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204735.0706851-52680-232823542859787=/root/.ansible/tmp/ansible-tmp-1727204735.0706851-52680-232823542859787 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204735.10408: variable 'ansible_module_compression' from source: unknown 44071 1727204735.10447: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 44071 1727204735.10497: variable 'ansible_facts' from source: unknown 44071 1727204735.10555: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204735.0706851-52680-232823542859787/AnsiballZ_ping.py 44071 1727204735.10759: Sending initial data 44071 1727204735.10763: Sent initial data (153 bytes) 44071 1727204735.11318: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204735.11323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204735.11328: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204735.11330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204735.11374: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204735.11378: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204735.11402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204735.11481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204735.13097: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204735.13162: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204735.13234: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpqze9_xwi /root/.ansible/tmp/ansible-tmp-1727204735.0706851-52680-232823542859787/AnsiballZ_ping.py <<< 44071 1727204735.13237: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204735.0706851-52680-232823542859787/AnsiballZ_ping.py" <<< 44071 1727204735.13303: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 44071 1727204735.13308: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpqze9_xwi" to remote "/root/.ansible/tmp/ansible-tmp-1727204735.0706851-52680-232823542859787/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204735.0706851-52680-232823542859787/AnsiballZ_ping.py" <<< 44071 1727204735.13968: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204735.14046: stderr chunk (state=3): >>><<< 44071 1727204735.14050: stdout chunk (state=3): >>><<< 44071 1727204735.14073: done transferring module to remote 44071 1727204735.14086: _low_level_execute_command(): starting 44071 1727204735.14091: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204735.0706851-52680-232823542859787/ /root/.ansible/tmp/ansible-tmp-1727204735.0706851-52680-232823542859787/AnsiballZ_ping.py && sleep 0' 44071 1727204735.14877: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204735.14882: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204735.14890: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204735.14928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 
1727204735.15010: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204735.16810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204735.16880: stderr chunk (state=3): >>><<< 44071 1727204735.16900: stdout chunk (state=3): >>><<< 44071 1727204735.16945: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204735.17045: _low_level_execute_command(): starting 44071 1727204735.17048: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204735.0706851-52680-232823542859787/AnsiballZ_ping.py && sleep 0' 44071 1727204735.17582: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204735.17586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204735.17589: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204735.17624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204735.17648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204735.17652: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204735.17663: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204735.17749: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204735.34009: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 44071 1727204735.35531: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204735.35540: stdout chunk (state=3): >>><<< 44071 1727204735.35543: stderr chunk (state=3): >>><<< 44071 1727204735.35545: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204735.35549: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204735.0706851-52680-232823542859787/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204735.35552: _low_level_execute_command(): starting 44071 1727204735.35554: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204735.0706851-52680-232823542859787/ > /dev/null 2>&1 && sleep 0' 44071 1727204735.36186: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204735.36190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204735.36212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204735.36318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 
10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204735.36345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204735.36454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204735.38578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204735.38582: stdout chunk (state=3): >>><<< 44071 1727204735.38584: stderr chunk (state=3): >>><<< 44071 1727204735.38587: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204735.38589: handler run complete 44071 1727204735.38590: attempt loop complete, returning result 44071 1727204735.38592: _execute() done 44071 1727204735.38594: dumping result to json 44071 1727204735.38596: done dumping result, returning 44071 1727204735.38597: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-c964-7471-00000000233d] 44071 1727204735.38599: sending task result for task 127b8e07-fff9-c964-7471-00000000233d 44071 1727204735.38673: done sending task result for task 127b8e07-fff9-c964-7471-00000000233d 44071 1727204735.38695: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 44071 1727204735.38776: no more pending results, returning what we have 44071 1727204735.38780: results queue empty 44071 1727204735.38780: checking for any_errors_fatal 44071 1727204735.38792: done checking for any_errors_fatal 44071 1727204735.38794: checking for max_fail_percentage 44071 1727204735.38795: done checking for max_fail_percentage 44071 1727204735.38796: checking to see if all hosts have failed and the running result is not ok 44071 1727204735.38797: done checking to see if all hosts have failed 44071 1727204735.38797: getting the remaining hosts for this loop 44071 1727204735.38802: done getting the remaining hosts for this loop 44071 1727204735.38807: getting the next task for host managed-node2 44071 1727204735.38819: done getting next task for host managed-node2 44071 1727204735.38821: ^ task is: TASK: meta (role_complete) 44071 1727204735.38826: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204735.38842: getting variables 44071 1727204735.38843: in VariableManager get_vars() 44071 1727204735.38986: Calling all_inventory to load vars for managed-node2 44071 1727204735.38989: Calling groups_inventory to load vars for managed-node2 44071 1727204735.38992: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204735.39003: Calling all_plugins_play to load vars for managed-node2 44071 1727204735.39006: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204735.39010: Calling groups_plugins_play to load vars for managed-node2 44071 1727204735.40918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204735.43119: done with get_vars() 44071 1727204735.43162: done getting variables 44071 1727204735.43267: done queuing things up, now waiting for results queue to drain 44071 1727204735.43270: results queue empty 44071 1727204735.43271: checking for any_errors_fatal 44071 1727204735.43275: done checking for any_errors_fatal 44071 1727204735.43276: checking for max_fail_percentage 44071 1727204735.43277: done checking for max_fail_percentage 44071 1727204735.43278: checking to see if all hosts have failed and the running result is not ok 44071 1727204735.43279: done checking to see if all hosts have failed 44071 1727204735.43280: getting the remaining hosts for this loop 44071 1727204735.43281: done getting the remaining hosts for this loop 44071 1727204735.43284: getting the next task for host managed-node2 44071 1727204735.43292: done getting next task for host managed-node2 44071 1727204735.43295: ^ task is: TASK: Include network role 44071 1727204735.43298: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204735.43301: getting variables 44071 1727204735.43303: in VariableManager get_vars() 44071 1727204735.43319: Calling all_inventory to load vars for managed-node2 44071 1727204735.43321: Calling groups_inventory to load vars for managed-node2 44071 1727204735.43324: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204735.43330: Calling all_plugins_play to load vars for managed-node2 44071 1727204735.43332: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204735.43335: Calling groups_plugins_play to load vars for managed-node2 44071 1727204735.45041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204735.47218: done with get_vars() 44071 1727204735.47255: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml:3 Tuesday 24 September 2024 15:05:35 -0400 (0:00:00.452) 0:02:27.789 ***** 44071 1727204735.47348: entering _queue_task() for managed-node2/include_role 44071 1727204735.47814: worker is 1 (out of 1 available) 44071 1727204735.47828: exiting _queue_task() for managed-node2/include_role 44071 1727204735.47843: done queuing things up, now waiting for results queue to drain 44071 1727204735.47844: waiting for pending results... 44071 1727204735.48252: running TaskExecutor() for managed-node2/TASK: Include network role 44071 1727204735.48281: in run() - task 127b8e07-fff9-c964-7471-000000002142 44071 1727204735.48299: variable 'ansible_search_path' from source: unknown 44071 1727204735.48303: variable 'ansible_search_path' from source: unknown 44071 1727204735.48350: calling self._execute() 44071 1727204735.48458: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204735.48467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204735.48478: variable 'omit' from source: magic vars 44071 1727204735.48915: variable 'ansible_distribution_major_version' from source: facts 44071 1727204735.48938: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204735.48942: _execute() done 44071 1727204735.48947: dumping result to json 44071 1727204735.48950: done dumping result, returning 44071 1727204735.48958: done running TaskExecutor() for managed-node2/TASK: Include network role [127b8e07-fff9-c964-7471-000000002142] 44071 1727204735.48963: sending task result for task 127b8e07-fff9-c964-7471-000000002142 44071 1727204735.49110: done sending task result for task 127b8e07-fff9-c964-7471-000000002142 44071 1727204735.49114: WORKER PROCESS EXITING 44071 1727204735.49149: no more pending results, returning what we have 44071 1727204735.49155: in VariableManager get_vars() 44071 1727204735.49223: Calling all_inventory to load vars for managed-node2 44071 1727204735.49227: Calling groups_inventory to load vars for managed-node2 44071 1727204735.49231: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204735.49249: Calling all_plugins_play to load vars for managed-node2 44071 1727204735.49253: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204735.49257: Calling groups_plugins_play to load vars for managed-node2 44071 1727204735.53344: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204735.57117: done with get_vars() 44071 1727204735.57157: variable 'ansible_search_path' from source: unknown 44071 1727204735.57159: variable 'ansible_search_path' from source: unknown 44071 1727204735.57346: variable 'omit' from source: magic vars 44071 1727204735.57395: variable 'omit' from source: magic vars 44071 1727204735.57412: variable 'omit' from source: magic vars 44071 1727204735.57416: we have included files to process 44071 1727204735.57417: generating all_blocks data 44071 1727204735.57419: done generating all_blocks data 44071 1727204735.57426: processing included file: fedora.linux_system_roles.network 44071 1727204735.57453: in VariableManager get_vars() 44071 1727204735.57481: done with get_vars() 44071 1727204735.57517: in VariableManager get_vars() 44071 1727204735.57540: done with get_vars() 44071 1727204735.57586: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 44071 1727204735.57735: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 44071 1727204735.57830: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 44071 1727204735.58450: in VariableManager get_vars() 44071 1727204735.58480: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204735.61098: iterating over new_blocks loaded from include file 44071 1727204735.61100: in VariableManager get_vars() 44071 1727204735.61122: done with get_vars() 44071 1727204735.61124: filtering new block on tags 44071 1727204735.61435: done filtering new block on tags 44071 1727204735.61440: in VariableManager get_vars() 44071 1727204735.61462: done with get_vars() 44071 1727204735.61463: filtering new block on tags 44071 1727204735.61491: done filtering new block on tags 44071 1727204735.61493: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 44071 1727204735.61500: extending task lists for all hosts with included blocks 44071 1727204735.61632: done extending task lists 44071 1727204735.61634: done processing included files 44071 1727204735.61635: results queue empty 44071 1727204735.61635: checking for any_errors_fatal 44071 1727204735.61637: done checking for any_errors_fatal 44071 1727204735.61638: checking for max_fail_percentage 44071 1727204735.61639: done checking for max_fail_percentage 44071 1727204735.61640: checking to see if all hosts have failed and the running result is not ok 44071 1727204735.61641: done checking to see if all hosts have failed 44071 1727204735.61642: getting the remaining hosts for this loop 44071 1727204735.61643: done getting the remaining hosts for this loop 44071 1727204735.61646: getting the next task for host managed-node2 44071 1727204735.61651: done getting next task for host managed-node2 44071 1727204735.61654: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204735.61658: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204735.61680: getting variables 44071 1727204735.61681: in VariableManager get_vars() 44071 1727204735.61703: Calling all_inventory to load vars for managed-node2 44071 1727204735.61705: Calling groups_inventory to load vars for managed-node2 44071 1727204735.61708: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204735.61715: Calling all_plugins_play to load vars for managed-node2 44071 1727204735.61717: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204735.61720: Calling groups_plugins_play to load vars for managed-node2 44071 1727204735.63394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204735.65667: done with get_vars() 44071 1727204735.65720: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:05:35 -0400 (0:00:00.184) 0:02:27.974 ***** 44071 1727204735.65824: entering _queue_task() for managed-node2/include_tasks 44071 1727204735.66490: worker is 1 (out of 1 available) 44071 1727204735.66503: exiting _queue_task() for managed-node2/include_tasks 44071 1727204735.66516: done queuing things up, now waiting for results queue to drain 44071 1727204735.66518: waiting for pending results... 
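Note on the Re-test connectivity records above: they show the complete remote round trip for a module. _low_level_execute_command() first resolves the remote home directory, creates a timestamped temp directory under ~/.ansible/tmp, sftp-transfers AnsiballZ_ping.py into it, marks it executable, runs it with /usr/bin/python3.12 (which prints {"ping": "pong", ...}), and finally removes the directory. The sketch below emulates that command sequence locally with subprocess instead of the ssh connection plugin; the stand-in module body and the use of the current user's home directory are illustrative only, not Ansible's real AnsiballZ payload.

import json
import random
import subprocess
import sys
import time
from pathlib import Path


def sh(cmd: str) -> str:
    """Run a command the way the log's /bin/sh -c '<cmd>' invocations do, but locally."""
    done = subprocess.run(["/bin/sh", "-c", cmd], capture_output=True, text=True, check=True)
    return done.stdout.strip()


if __name__ == "__main__":
    home = sh("echo ~ && sleep 0")  # resolve the home directory, as in the first command above
    tmp = f"{home}/.ansible/tmp/ansible-tmp-{time.time()}-{random.randint(0, 2**48)}"
    sh(f'( umask 77 && mkdir -p "{home}/.ansible/tmp" && mkdir "{tmp}" ) && sleep 0')

    # The real run sftp-transfers a zipped AnsiballZ payload; a trivial stand-in that
    # answers like ansible.builtin.ping is written here instead.
    module = Path(tmp) / "AnsiballZ_ping.py"
    module.write_text('import json; print(json.dumps({"ping": "pong"}))\n')

    sh(f'chmod u+x "{tmp}/" "{module}" && sleep 0')
    out = sh(f'{sys.executable} "{module}" && sleep 0')
    print(json.loads(out))                                # {'ping': 'pong'}
    sh(f'rm -f -r "{tmp}/" > /dev/null 2>&1 && sleep 0')  # cleanup, as in the final command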
44071 1727204735.66829: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204735.66846: in run() - task 127b8e07-fff9-c964-7471-0000000024a4 44071 1727204735.66920: variable 'ansible_search_path' from source: unknown 44071 1727204735.66924: variable 'ansible_search_path' from source: unknown 44071 1727204735.66927: calling self._execute() 44071 1727204735.67029: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204735.67035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204735.67042: variable 'omit' from source: magic vars 44071 1727204735.67504: variable 'ansible_distribution_major_version' from source: facts 44071 1727204735.67528: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204735.67537: _execute() done 44071 1727204735.67541: dumping result to json 44071 1727204735.67544: done dumping result, returning 44071 1727204735.67550: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-c964-7471-0000000024a4] 44071 1727204735.67556: sending task result for task 127b8e07-fff9-c964-7471-0000000024a4 44071 1727204735.67762: done sending task result for task 127b8e07-fff9-c964-7471-0000000024a4 44071 1727204735.67768: WORKER PROCESS EXITING 44071 1727204735.67839: no more pending results, returning what we have 44071 1727204735.67848: in VariableManager get_vars() 44071 1727204735.67916: Calling all_inventory to load vars for managed-node2 44071 1727204735.67920: Calling groups_inventory to load vars for managed-node2 44071 1727204735.67923: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204735.67939: Calling all_plugins_play to load vars for managed-node2 44071 1727204735.67943: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204735.67947: Calling groups_plugins_play to load vars for managed-node2 44071 1727204735.69864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204735.71127: done with get_vars() 44071 1727204735.71159: variable 'ansible_search_path' from source: unknown 44071 1727204735.71161: variable 'ansible_search_path' from source: unknown 44071 1727204735.71196: we have included files to process 44071 1727204735.71197: generating all_blocks data 44071 1727204735.71199: done generating all_blocks data 44071 1727204735.71202: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204735.71203: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204735.71205: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204735.71858: done processing included file 44071 1727204735.71861: iterating over new_blocks loaded from include file 44071 1727204735.71862: in VariableManager get_vars() 44071 1727204735.71897: done with get_vars() 44071 1727204735.71900: filtering new block on tags 44071 1727204735.71934: done filtering new block on tags 44071 1727204735.71937: in VariableManager get_vars() 44071 1727204735.71964: done with get_vars() 44071 1727204735.71968: filtering new block on tags 44071 1727204735.72019: done filtering new block on tags 44071 1727204735.72022: in 
VariableManager get_vars() 44071 1727204735.72049: done with get_vars() 44071 1727204735.72050: filtering new block on tags 44071 1727204735.72099: done filtering new block on tags 44071 1727204735.72101: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 44071 1727204735.72107: extending task lists for all hosts with included blocks 44071 1727204735.73510: done extending task lists 44071 1727204735.73512: done processing included files 44071 1727204735.73512: results queue empty 44071 1727204735.73513: checking for any_errors_fatal 44071 1727204735.73516: done checking for any_errors_fatal 44071 1727204735.73516: checking for max_fail_percentage 44071 1727204735.73518: done checking for max_fail_percentage 44071 1727204735.73518: checking to see if all hosts have failed and the running result is not ok 44071 1727204735.73519: done checking to see if all hosts have failed 44071 1727204735.73519: getting the remaining hosts for this loop 44071 1727204735.73520: done getting the remaining hosts for this loop 44071 1727204735.73522: getting the next task for host managed-node2 44071 1727204735.73528: done getting next task for host managed-node2 44071 1727204735.73530: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204735.73534: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204735.73547: getting variables 44071 1727204735.73548: in VariableManager get_vars() 44071 1727204735.73568: Calling all_inventory to load vars for managed-node2 44071 1727204735.73570: Calling groups_inventory to load vars for managed-node2 44071 1727204735.73571: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204735.73577: Calling all_plugins_play to load vars for managed-node2 44071 1727204735.73578: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204735.73580: Calling groups_plugins_play to load vars for managed-node2 44071 1727204735.74545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204735.76499: done with get_vars() 44071 1727204735.76539: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:05:35 -0400 (0:00:00.108) 0:02:28.082 ***** 44071 1727204735.76642: entering _queue_task() for managed-node2/setup 44071 1727204735.76980: worker is 1 (out of 1 available) 44071 1727204735.76996: exiting _queue_task() for managed-node2/setup 44071 1727204735.77012: done queuing things up, now waiting for results queue to drain 44071 1727204735.77014: waiting for pending results... 44071 1727204735.77271: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204735.77395: in run() - task 127b8e07-fff9-c964-7471-0000000024fb 44071 1727204735.77410: variable 'ansible_search_path' from source: unknown 44071 1727204735.77413: variable 'ansible_search_path' from source: unknown 44071 1727204735.77449: calling self._execute() 44071 1727204735.77536: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204735.77545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204735.77554: variable 'omit' from source: magic vars 44071 1727204735.77880: variable 'ansible_distribution_major_version' from source: facts 44071 1727204735.77892: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204735.78074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204735.80528: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204735.80533: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204735.80703: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204735.80707: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204735.80710: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204735.80713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204735.80748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 44071 1727204735.80971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204735.80974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204735.80976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204735.80978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204735.80980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204735.80982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204735.81019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204735.81033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204735.81213: variable '__network_required_facts' from source: role '' defaults 44071 1727204735.81217: variable 'ansible_facts' from source: unknown 44071 1727204735.82041: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 44071 1727204735.82046: when evaluation is False, skipping this task 44071 1727204735.82049: _execute() done 44071 1727204735.82051: dumping result to json 44071 1727204735.82054: done dumping result, returning 44071 1727204735.82060: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-c964-7471-0000000024fb] 44071 1727204735.82064: sending task result for task 127b8e07-fff9-c964-7471-0000000024fb 44071 1727204735.82174: done sending task result for task 127b8e07-fff9-c964-7471-0000000024fb 44071 1727204735.82177: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204735.82243: no more pending results, returning what we have 44071 1727204735.82247: results queue empty 44071 1727204735.82248: checking for any_errors_fatal 44071 1727204735.82250: done checking for any_errors_fatal 44071 1727204735.82251: checking for max_fail_percentage 44071 1727204735.82252: done checking for max_fail_percentage 44071 1727204735.82253: checking to see if all hosts have failed and the running result is not ok 44071 1727204735.82254: done checking to see if all hosts have failed 44071 1727204735.82255: getting the remaining hosts for this loop 44071 1727204735.82256: done getting the remaining hosts for 
this loop 44071 1727204735.82262: getting the next task for host managed-node2 44071 1727204735.82276: done getting next task for host managed-node2 44071 1727204735.82279: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204735.82288: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204735.82319: getting variables 44071 1727204735.82321: in VariableManager get_vars() 44071 1727204735.82376: Calling all_inventory to load vars for managed-node2 44071 1727204735.82379: Calling groups_inventory to load vars for managed-node2 44071 1727204735.82382: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204735.82393: Calling all_plugins_play to load vars for managed-node2 44071 1727204735.82396: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204735.82405: Calling groups_plugins_play to load vars for managed-node2 44071 1727204735.83497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204735.85302: done with get_vars() 44071 1727204735.85340: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:05:35 -0400 (0:00:00.087) 0:02:28.170 ***** 44071 1727204735.85428: entering _queue_task() for managed-node2/stat 44071 1727204735.85753: worker is 1 (out of 1 available) 44071 1727204735.85773: exiting _queue_task() for managed-node2/stat 44071 1727204735.85786: done queuing things up, now waiting for results queue to drain 44071 1727204735.85788: waiting for pending results... 
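Note on the skip recorded just above: it comes from the gate at the top of set_facts.yml, where the conditional __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 only runs the extra fact gathering when some required fact is missing from what has already been collected. The sketch below reproduces that set-difference logic in plain Python; the fact names used are placeholders, not the role's actual __network_required_facts list.

def missing_required_facts(required: list[str], gathered: dict) -> list[str]:
    """Return the required fact names that are absent from the gathered facts,
    mirroring what the difference filter computes in the conditional above."""
    return [name for name in required if name not in gathered]


if __name__ == "__main__":
    required = ["distribution", "os_family"]  # placeholder fact names
    gathered = {"distribution": "Fedora", "os_family": "RedHat", "hostname": "managed-node2"}
    missing = missing_required_facts(required, gathered)
    # Mirrors: Evaluated conditional (... | difference(...) | length > 0): False
    print(len(missing) > 0)  # False -> the setup task is skipped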
44071 1727204735.86003: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204735.86130: in run() - task 127b8e07-fff9-c964-7471-0000000024fd 44071 1727204735.86146: variable 'ansible_search_path' from source: unknown 44071 1727204735.86150: variable 'ansible_search_path' from source: unknown 44071 1727204735.86183: calling self._execute() 44071 1727204735.86272: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204735.86275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204735.86285: variable 'omit' from source: magic vars 44071 1727204735.86602: variable 'ansible_distribution_major_version' from source: facts 44071 1727204735.86613: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204735.86746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204735.86973: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204735.87013: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204735.87039: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204735.87064: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204735.87139: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204735.87158: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204735.87179: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204735.87198: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204735.87275: variable '__network_is_ostree' from source: set_fact 44071 1727204735.87282: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204735.87286: when evaluation is False, skipping this task 44071 1727204735.87288: _execute() done 44071 1727204735.87291: dumping result to json 44071 1727204735.87293: done dumping result, returning 44071 1727204735.87301: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-c964-7471-0000000024fd] 44071 1727204735.87305: sending task result for task 127b8e07-fff9-c964-7471-0000000024fd 44071 1727204735.87409: done sending task result for task 127b8e07-fff9-c964-7471-0000000024fd 44071 1727204735.87412: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204735.87487: no more pending results, returning what we have 44071 1727204735.87492: results queue empty 44071 1727204735.87493: checking for any_errors_fatal 44071 1727204735.87504: done checking for any_errors_fatal 44071 1727204735.87505: checking for 
max_fail_percentage 44071 1727204735.87507: done checking for max_fail_percentage 44071 1727204735.87508: checking to see if all hosts have failed and the running result is not ok 44071 1727204735.87509: done checking to see if all hosts have failed 44071 1727204735.87509: getting the remaining hosts for this loop 44071 1727204735.87511: done getting the remaining hosts for this loop 44071 1727204735.87516: getting the next task for host managed-node2 44071 1727204735.87526: done getting next task for host managed-node2 44071 1727204735.87530: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204735.87538: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204735.87564: getting variables 44071 1727204735.87568: in VariableManager get_vars() 44071 1727204735.87612: Calling all_inventory to load vars for managed-node2 44071 1727204735.87616: Calling groups_inventory to load vars for managed-node2 44071 1727204735.87618: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204735.87628: Calling all_plugins_play to load vars for managed-node2 44071 1727204735.87630: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204735.87635: Calling groups_plugins_play to load vars for managed-node2 44071 1727204735.88860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204735.90100: done with get_vars() 44071 1727204735.90129: done getting variables 44071 1727204735.90188: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:05:35 -0400 (0:00:00.047) 0:02:28.218 ***** 44071 1727204735.90219: entering _queue_task() for managed-node2/set_fact 44071 1727204735.90527: worker is 1 (out of 1 available) 44071 1727204735.90545: exiting _queue_task() for managed-node2/set_fact 44071 1727204735.90562: done queuing things up, now waiting for results queue to drain 44071 1727204735.90564: waiting for pending results... 44071 1727204735.90772: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204735.90892: in run() - task 127b8e07-fff9-c964-7471-0000000024fe 44071 1727204735.90909: variable 'ansible_search_path' from source: unknown 44071 1727204735.90913: variable 'ansible_search_path' from source: unknown 44071 1727204735.90948: calling self._execute() 44071 1727204735.91040: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204735.91044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204735.91053: variable 'omit' from source: magic vars 44071 1727204735.91370: variable 'ansible_distribution_major_version' from source: facts 44071 1727204735.91383: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204735.91513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204735.91730: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204735.91766: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204735.91796: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204735.91820: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204735.91893: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204735.91913: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204735.91935: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204735.91953: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204735.92025: variable '__network_is_ostree' from source: set_fact 44071 1727204735.92032: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204735.92038: when evaluation is False, skipping this task 44071 1727204735.92040: _execute() done 44071 1727204735.92043: dumping result to json 44071 1727204735.92045: done dumping result, returning 44071 1727204735.92051: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-c964-7471-0000000024fe] 44071 1727204735.92057: sending task result for task 127b8e07-fff9-c964-7471-0000000024fe 44071 1727204735.92160: done sending task result for task 127b8e07-fff9-c964-7471-0000000024fe 44071 1727204735.92164: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204735.92222: no more pending results, returning what we have 44071 1727204735.92227: results queue empty 44071 1727204735.92227: checking for any_errors_fatal 44071 1727204735.92237: done checking for any_errors_fatal 44071 1727204735.92238: checking for max_fail_percentage 44071 1727204735.92239: done checking for max_fail_percentage 44071 1727204735.92241: checking to see if all hosts have failed and the running result is not ok 44071 1727204735.92241: done checking to see if all hosts have failed 44071 1727204735.92242: getting the remaining hosts for this loop 44071 1727204735.92244: done getting the remaining hosts for this loop 44071 1727204735.92248: getting the next task for host managed-node2 44071 1727204735.92259: done getting next task for host managed-node2 44071 1727204735.92264: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204735.92273: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204735.92308: getting variables 44071 1727204735.92310: in VariableManager get_vars() 44071 1727204735.92358: Calling all_inventory to load vars for managed-node2 44071 1727204735.92361: Calling groups_inventory to load vars for managed-node2 44071 1727204735.92363: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204735.92375: Calling all_plugins_play to load vars for managed-node2 44071 1727204735.92377: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204735.92380: Calling groups_plugins_play to load vars for managed-node2 44071 1727204735.93574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204735.94819: done with get_vars() 44071 1727204735.94851: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:05:35 -0400 (0:00:00.047) 0:02:28.265 ***** 44071 1727204735.94939: entering _queue_task() for managed-node2/service_facts 44071 1727204735.95250: worker is 1 (out of 1 available) 44071 1727204735.95268: exiting _queue_task() for managed-node2/service_facts 44071 1727204735.95285: done queuing things up, now waiting for results queue to drain 44071 1727204735.95287: waiting for pending results... 
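Both ostree tasks above are skipped on the same guard, not __network_is_ostree is defined: once the flag exists (it was set by an earlier pass through set_facts.yml in this run), neither the stat check at set_facts.yml:12 nor the set_fact at set_facts.yml:17 fires again, and the play moves straight on to the service_facts task at set_facts.yml:21. A minimal sketch of that test follows; the value given to the fact here is illustrative, since the log does not show it at this point.

from jinja2 import Environment

env = Environment()
guard = "{{ not __network_is_ostree is defined }}"

# Before the fact exists (earlier in a run), the guard renders True and the tasks run.
print(env.from_string(guard).render())                           # True

# Here the fact was already set by a previous include of set_facts.yml,
# so the guard renders False and both tasks are skipped, as logged above.
print(env.from_string(guard).render(__network_is_ostree=False))  # False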
44071 1727204735.95572: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204735.95708: in run() - task 127b8e07-fff9-c964-7471-000000002500 44071 1727204735.95726: variable 'ansible_search_path' from source: unknown 44071 1727204735.95730: variable 'ansible_search_path' from source: unknown 44071 1727204735.95770: calling self._execute() 44071 1727204735.95863: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204735.95867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204735.95879: variable 'omit' from source: magic vars 44071 1727204735.96204: variable 'ansible_distribution_major_version' from source: facts 44071 1727204735.96216: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204735.96223: variable 'omit' from source: magic vars 44071 1727204735.96285: variable 'omit' from source: magic vars 44071 1727204735.96314: variable 'omit' from source: magic vars 44071 1727204735.96352: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204735.96385: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204735.96404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204735.96419: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204735.96432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204735.96459: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204735.96462: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204735.96467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204735.96547: Set connection var ansible_connection to ssh 44071 1727204735.96553: Set connection var ansible_timeout to 10 44071 1727204735.96559: Set connection var ansible_pipelining to False 44071 1727204735.96564: Set connection var ansible_shell_type to sh 44071 1727204735.96572: Set connection var ansible_shell_executable to /bin/sh 44071 1727204735.96579: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204735.96603: variable 'ansible_shell_executable' from source: unknown 44071 1727204735.96606: variable 'ansible_connection' from source: unknown 44071 1727204735.96609: variable 'ansible_module_compression' from source: unknown 44071 1727204735.96612: variable 'ansible_shell_type' from source: unknown 44071 1727204735.96614: variable 'ansible_shell_executable' from source: unknown 44071 1727204735.96616: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204735.96621: variable 'ansible_pipelining' from source: unknown 44071 1727204735.96624: variable 'ansible_timeout' from source: unknown 44071 1727204735.96630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204735.96805: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204735.96819: variable 'omit' from source: magic vars 44071 
1727204735.96822: starting attempt loop 44071 1727204735.96824: running the handler 44071 1727204735.96841: _low_level_execute_command(): starting 44071 1727204735.96849: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204735.97414: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204735.97419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204735.97424: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204735.97472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204735.97479: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204735.97497: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204735.97568: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204735.99341: stdout chunk (state=3): >>>/root <<< 44071 1727204735.99540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204735.99544: stdout chunk (state=3): >>><<< 44071 1727204735.99547: stderr chunk (state=3): >>><<< 44071 1727204735.99678: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204735.99682: _low_level_execute_command(): starting 44071 1727204735.99686: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204735.9957588-52718-121759909302463 
`" && echo ansible-tmp-1727204735.9957588-52718-121759909302463="` echo /root/.ansible/tmp/ansible-tmp-1727204735.9957588-52718-121759909302463 `" ) && sleep 0' 44071 1727204736.00248: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204736.00252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204736.00255: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204736.00275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204736.00281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204736.00322: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204736.00325: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204736.00405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204736.02393: stdout chunk (state=3): >>>ansible-tmp-1727204735.9957588-52718-121759909302463=/root/.ansible/tmp/ansible-tmp-1727204735.9957588-52718-121759909302463 <<< 44071 1727204736.02505: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204736.02570: stderr chunk (state=3): >>><<< 44071 1727204736.02574: stdout chunk (state=3): >>><<< 44071 1727204736.02594: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204735.9957588-52718-121759909302463=/root/.ansible/tmp/ansible-tmp-1727204735.9957588-52718-121759909302463 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204736.02636: variable 'ansible_module_compression' from 
source: unknown 44071 1727204736.02680: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 44071 1727204736.02715: variable 'ansible_facts' from source: unknown 44071 1727204736.02786: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204735.9957588-52718-121759909302463/AnsiballZ_service_facts.py 44071 1727204736.03083: Sending initial data 44071 1727204736.03087: Sent initial data (162 bytes) 44071 1727204736.03663: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204736.03698: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204736.03701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204736.03704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204736.03791: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204736.03795: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204736.03805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204736.03808: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204736.03810: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204736.03812: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44071 1727204736.03845: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204736.03863: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204736.03968: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204736.05592: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204736.05797: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204735.9957588-52718-121759909302463/AnsiballZ_service_facts.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp4ze6pv2i" to remote "/root/.ansible/tmp/ansible-tmp-1727204735.9957588-52718-121759909302463/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204735.9957588-52718-121759909302463/AnsiballZ_service_facts.py" <<< 44071 1727204736.05802: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp4ze6pv2i /root/.ansible/tmp/ansible-tmp-1727204735.9957588-52718-121759909302463/AnsiballZ_service_facts.py <<< 44071 1727204736.07238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204736.07321: stderr chunk (state=3): >>><<< 44071 1727204736.07324: stdout chunk (state=3): >>><<< 44071 1727204736.07356: done transferring module to remote 44071 1727204736.07453: _low_level_execute_command(): starting 44071 1727204736.07458: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204735.9957588-52718-121759909302463/ /root/.ansible/tmp/ansible-tmp-1727204735.9957588-52718-121759909302463/AnsiballZ_service_facts.py && sleep 0' 44071 1727204736.08137: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204736.08140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204736.08143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204736.08147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204736.08150: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204736.08152: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204736.08155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204736.08157: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204736.08159: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204736.08168: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44071 1727204736.08170: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204736.08172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204736.08174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204736.08176: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204736.08178: stderr chunk (state=3): >>>debug2: match found <<< 44071 1727204736.08180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204736.08245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204736.08257: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204736.08462: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204736.08569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 44071 1727204736.10535: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204736.10607: stderr chunk (state=3): >>><<< 44071 1727204736.10621: stdout chunk (state=3): >>><<< 44071 1727204736.10702: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204736.10705: _low_level_execute_command(): starting 44071 1727204736.10708: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204735.9957588-52718-121759909302463/AnsiballZ_service_facts.py && sleep 0' 44071 1727204736.11890: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204736.11986: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204736.12008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204736.12012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204736.12071: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204736.12075: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204736.12085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204736.12089: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204736.12092: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204736.12164: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204736.12385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204736.12629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204738.34396: 
stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "s<<< 44071 1727204738.34421: stdout chunk (state=3): >>>topped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "stati<<< 44071 1727204738.34451: stdout chunk (state=3): >>>c", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": 
{"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": 
{"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "d<<< 44071 1727204738.34469: stdout chunk (state=3): >>>bus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", <<< 44071 1727204738.34491: stdout chunk (state=3): >>>"status": "static", 
"source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": 
{"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-r<<< 44071 1727204738.34500: stdout chunk (state=3): >>>oot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 44071 1727204738.36118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204738.36140: stderr chunk (state=3): >>><<< 44071 1727204738.36151: stdout chunk (state=3): >>><<< 44071 1727204738.36197: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": 
"active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": 
"pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
44071 1727204738.36918: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204735.9957588-52718-121759909302463/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204738.36925: _low_level_execute_command(): starting 44071 1727204738.36930: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204735.9957588-52718-121759909302463/ > /dev/null 2>&1 && sleep 0' 44071 1727204738.37538: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204738.37596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204738.39613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204738.39619: stdout chunk (state=3): >>><<< 44071 1727204738.39622: stderr chunk (state=3): >>><<< 44071 1727204738.39772: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204738.39776: handler run complete 44071 1727204738.39937: variable 'ansible_facts' from source: unknown 44071 1727204738.40184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204738.40891: variable 'ansible_facts' from source: unknown 44071 1727204738.41075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204738.41382: attempt loop complete, returning result 44071 1727204738.41395: _execute() done 44071 1727204738.41403: dumping result to json 44071 1727204738.41494: done dumping result, returning 44071 1727204738.41511: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-c964-7471-000000002500] 44071 1727204738.41527: sending task result for task 127b8e07-fff9-c964-7471-000000002500 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204738.43139: no more pending results, returning what we have 44071 1727204738.43143: results queue empty 44071 1727204738.43144: checking for any_errors_fatal 44071 1727204738.43150: done checking for any_errors_fatal 44071 1727204738.43151: checking for max_fail_percentage 44071 1727204738.43153: done checking for max_fail_percentage 44071 1727204738.43154: checking to see if all hosts have failed and the running result is not ok 44071 1727204738.43154: done checking to see if all hosts have failed 44071 1727204738.43155: getting the remaining hosts for this loop 44071 1727204738.43156: done getting the remaining hosts for this loop 44071 1727204738.43160: getting the next task for host managed-node2 44071 1727204738.43172: done getting next task for host managed-node2 44071 1727204738.43175: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204738.43181: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 44071 1727204738.43253: getting variables 44071 1727204738.43255: in VariableManager get_vars() 44071 1727204738.43298: Calling all_inventory to load vars for managed-node2 44071 1727204738.43301: Calling groups_inventory to load vars for managed-node2 44071 1727204738.43304: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204738.43315: Calling all_plugins_play to load vars for managed-node2 44071 1727204738.43318: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204738.43321: Calling groups_plugins_play to load vars for managed-node2 44071 1727204738.43918: done sending task result for task 127b8e07-fff9-c964-7471-000000002500 44071 1727204738.43923: WORKER PROCESS EXITING 44071 1727204738.45359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204738.47807: done with get_vars() 44071 1727204738.47859: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:05:38 -0400 (0:00:02.530) 0:02:30.796 ***** 44071 1727204738.47990: entering _queue_task() for managed-node2/package_facts 44071 1727204738.48668: worker is 1 (out of 1 available) 44071 1727204738.48682: exiting _queue_task() for managed-node2/package_facts 44071 1727204738.48695: done queuing things up, now waiting for results queue to drain 44071 1727204738.48697: waiting for pending results... 44071 1727204738.48874: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204738.49090: in run() - task 127b8e07-fff9-c964-7471-000000002501 44071 1727204738.49116: variable 'ansible_search_path' from source: unknown 44071 1727204738.49125: variable 'ansible_search_path' from source: unknown 44071 1727204738.49184: calling self._execute() 44071 1727204738.49311: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204738.49324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204738.49367: variable 'omit' from source: magic vars 44071 1727204738.49800: variable 'ansible_distribution_major_version' from source: facts 44071 1727204738.49827: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204738.49911: variable 'omit' from source: magic vars 44071 1727204738.49944: variable 'omit' from source: magic vars 44071 1727204738.49990: variable 'omit' from source: magic vars 44071 1727204738.50052: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204738.50099: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204738.50132: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204738.50164: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204738.50185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204738.50221: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204738.50235: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 
1727204738.50251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204738.50383: Set connection var ansible_connection to ssh 44071 1727204738.50456: Set connection var ansible_timeout to 10 44071 1727204738.50464: Set connection var ansible_pipelining to False 44071 1727204738.50472: Set connection var ansible_shell_type to sh 44071 1727204738.50475: Set connection var ansible_shell_executable to /bin/sh 44071 1727204738.50477: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204738.50484: variable 'ansible_shell_executable' from source: unknown 44071 1727204738.50493: variable 'ansible_connection' from source: unknown 44071 1727204738.50502: variable 'ansible_module_compression' from source: unknown 44071 1727204738.50509: variable 'ansible_shell_type' from source: unknown 44071 1727204738.50517: variable 'ansible_shell_executable' from source: unknown 44071 1727204738.50524: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204738.50533: variable 'ansible_pipelining' from source: unknown 44071 1727204738.50541: variable 'ansible_timeout' from source: unknown 44071 1727204738.50550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204738.50871: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204738.50876: variable 'omit' from source: magic vars 44071 1727204738.50878: starting attempt loop 44071 1727204738.50880: running the handler 44071 1727204738.50883: _low_level_execute_command(): starting 44071 1727204738.50885: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204738.51724: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204738.51790: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204738.51885: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204738.51919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204738.52023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204738.53900: stdout chunk (state=3): >>>/root <<< 44071 1727204738.54049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204738.54054: stdout chunk (state=3): >>><<< 44071 1727204738.54057: stderr chunk (state=3): >>><<< 44071 
1727204738.54083: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204738.54232: _low_level_execute_command(): starting 44071 1727204738.54237: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204738.5409079-52936-222673019865003 `" && echo ansible-tmp-1727204738.5409079-52936-222673019865003="` echo /root/.ansible/tmp/ansible-tmp-1727204738.5409079-52936-222673019865003 `" ) && sleep 0' 44071 1727204738.55221: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204738.55278: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204738.55315: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204738.55357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204738.55538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204738.57428: stdout chunk (state=3): >>>ansible-tmp-1727204738.5409079-52936-222673019865003=/root/.ansible/tmp/ansible-tmp-1727204738.5409079-52936-222673019865003 <<< 44071 1727204738.58078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204738.58083: stdout chunk (state=3): >>><<< 44071 1727204738.58086: stderr chunk (state=3): >>><<< 44071 1727204738.58089: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204738.5409079-52936-222673019865003=/root/.ansible/tmp/ansible-tmp-1727204738.5409079-52936-222673019865003 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204738.58091: variable 'ansible_module_compression' from source: unknown 44071 1727204738.58094: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 44071 1727204738.58096: variable 'ansible_facts' from source: unknown 44071 1727204738.58524: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204738.5409079-52936-222673019865003/AnsiballZ_package_facts.py 44071 1727204738.58745: Sending initial data 44071 1727204738.58761: Sent initial data (162 bytes) 44071 1727204738.59564: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204738.59638: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204738.59707: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204738.59764: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204738.59780: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204738.59882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204738.61617: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: 
Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204738.61656: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204738.61748: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp5j6_yaup /root/.ansible/tmp/ansible-tmp-1727204738.5409079-52936-222673019865003/AnsiballZ_package_facts.py <<< 44071 1727204738.61752: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204738.5409079-52936-222673019865003/AnsiballZ_package_facts.py" <<< 44071 1727204738.61871: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp5j6_yaup" to remote "/root/.ansible/tmp/ansible-tmp-1727204738.5409079-52936-222673019865003/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204738.5409079-52936-222673019865003/AnsiballZ_package_facts.py" <<< 44071 1727204738.63632: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204738.63895: stderr chunk (state=3): >>><<< 44071 1727204738.63902: stdout chunk (state=3): >>><<< 44071 1727204738.63904: done transferring module to remote 44071 1727204738.63906: _low_level_execute_command(): starting 44071 1727204738.63908: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204738.5409079-52936-222673019865003/ /root/.ansible/tmp/ansible-tmp-1727204738.5409079-52936-222673019865003/AnsiballZ_package_facts.py && sleep 0' 44071 1727204738.64534: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204738.64553: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204738.64630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204738.64634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204738.64698: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204738.64740: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204738.64849: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204738.66802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204738.66806: stdout chunk (state=3): >>><<< 44071 1727204738.66809: stderr chunk (state=3): >>><<< 44071 1727204738.66921: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204738.66926: _low_level_execute_command(): starting 44071 1727204738.66929: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204738.5409079-52936-222673019865003/AnsiballZ_package_facts.py && sleep 0' 44071 1727204738.67575: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204738.67594: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204738.67622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204738.67886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204738.67994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204738.68123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204739.30397: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": 
"20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, <<< 44071 1727204739.30419: stdout chunk (state=3): >>>"arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": 
"libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40",<<< 44071 1727204739.30454: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": 
[{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "li<<< 44071 1727204739.30471: stdout chunk (state=3): >>>breport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source"<<< 44071 1727204739.30512: stdout chunk (state=3): >>>: "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_<<< 44071 1727204739.30520: stdout chunk (state=3): >>>64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": 
"1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": <<< 44071 1727204739.30564: stdout chunk (state=3): >>>"x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", 
"version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "<<< 44071 1727204739.30574: stdout chunk (state=3): >>>rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarc<<< 44071 1727204739.30578: stdout chunk (state=3): >>>h", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", 
"release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}]<<< 44071 1727204739.30611: stdout chunk (state=3): >>>, "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", 
"version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoc<<< 44071 1727204739.30623: stdout chunk (state=3): >>>h": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "s<<< 44071 1727204739.30643: stdout chunk (state=3): >>>ource": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-t<<< 44071 1727204739.30649: stdout chunk (state=3): >>>ools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 44071 1727204739.32485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204739.32550: stderr chunk (state=3): >>><<< 44071 1727204739.32554: stdout chunk (state=3): >>><<< 44071 1727204739.32603: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": 
"2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": 
"grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": 
[{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": 
"psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": 
[{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": 
"1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", 
"version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": 
[{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": 
"5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", 
"release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": 
[{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": 
[{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", 
"epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
44071 1727204739.41710: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204738.5409079-52936-222673019865003/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204739.41725: _low_level_execute_command(): starting 44071 1727204739.41730: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204738.5409079-52936-222673019865003/ > /dev/null 2>&1 && sleep 0' 44071 1727204739.42250: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204739.42254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204739.42257: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204739.42260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204739.42262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204739.42315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204739.42319: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204739.42406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204739.44395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204739.44461: stderr chunk (state=3): >>><<< 44071 1727204739.44465: stdout chunk (state=3): >>><<< 44071 1727204739.44484: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204739.44490: handler run complete 44071 1727204739.45150: variable 'ansible_facts' from source: unknown 44071 1727204739.45496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204739.47297: variable 'ansible_facts' from source: unknown 44071 1727204739.47633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204739.48203: attempt loop complete, returning result 44071 1727204739.48220: _execute() done 44071 1727204739.48223: dumping result to json 44071 1727204739.48384: done dumping result, returning 44071 1727204739.48394: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-c964-7471-000000002501] 44071 1727204739.48397: sending task result for task 127b8e07-fff9-c964-7471-000000002501 44071 1727204739.58844: done sending task result for task 127b8e07-fff9-c964-7471-000000002501 44071 1727204739.58848: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204739.58970: no more pending results, returning what we have 44071 1727204739.58972: results queue empty 44071 1727204739.58973: checking for any_errors_fatal 44071 1727204739.58984: done checking for any_errors_fatal 44071 1727204739.58986: checking for max_fail_percentage 44071 1727204739.58987: done checking for max_fail_percentage 44071 1727204739.58988: checking to see if all hosts have failed and the running result is not ok 44071 1727204739.58989: done checking to see if all hosts have failed 44071 1727204739.58989: getting the remaining hosts for this loop 44071 1727204739.58991: done getting the remaining hosts for this loop 44071 1727204739.58994: getting the next task for host managed-node2 44071 1727204739.59001: done getting next task for host managed-node2 44071 1727204739.59003: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204739.59012: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204739.59024: getting variables 44071 1727204739.59025: in VariableManager get_vars() 44071 1727204739.59048: Calling all_inventory to load vars for managed-node2 44071 1727204739.59051: Calling groups_inventory to load vars for managed-node2 44071 1727204739.59053: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204739.59060: Calling all_plugins_play to load vars for managed-node2 44071 1727204739.59063: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204739.59067: Calling groups_plugins_play to load vars for managed-node2 44071 1727204739.60827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204739.63262: done with get_vars() 44071 1727204739.63316: done getting variables 44071 1727204739.63378: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:05:39 -0400 (0:00:01.154) 0:02:31.950 ***** 44071 1727204739.63419: entering _queue_task() for managed-node2/debug 44071 1727204739.63863: worker is 1 (out of 1 available) 44071 1727204739.63879: exiting _queue_task() for managed-node2/debug 44071 1727204739.63894: done queuing things up, now waiting for results queue to drain 44071 1727204739.63896: waiting for pending results... 
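The censored "ok" result above comes from the role's package-facts gathering step: the module arguments show '_ansible_no_log': True, and the summary explicitly says the output is hidden because 'no_log: true' was specified. The following is a minimal, hypothetical playbook that produces the same censored result shape; it is not the role's actual task file, and the play header and manager setting are assumptions.

---
# Illustrative only: reproduces the censored "ok" result seen in the trace above.
# "managed-node2" matches the inventory host in this log; everything else is assumed.
- hosts: managed-node2
  gather_facts: false
  tasks:
    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto
      no_log: true   # hides the package list, exactly as in the censored output above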
44071 1727204739.64257: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204739.64518: in run() - task 127b8e07-fff9-c964-7471-0000000024a5 44071 1727204739.64522: variable 'ansible_search_path' from source: unknown 44071 1727204739.64525: variable 'ansible_search_path' from source: unknown 44071 1727204739.64625: calling self._execute() 44071 1727204739.64689: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204739.64703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204739.64731: variable 'omit' from source: magic vars 44071 1727204739.65479: variable 'ansible_distribution_major_version' from source: facts 44071 1727204739.65484: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204739.65487: variable 'omit' from source: magic vars 44071 1727204739.65490: variable 'omit' from source: magic vars 44071 1727204739.65493: variable 'network_provider' from source: set_fact 44071 1727204739.65496: variable 'omit' from source: magic vars 44071 1727204739.65510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204739.65553: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204739.65578: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204739.65597: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204739.65607: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204739.65856: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204739.65860: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204739.65863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204739.65868: Set connection var ansible_connection to ssh 44071 1727204739.65871: Set connection var ansible_timeout to 10 44071 1727204739.65873: Set connection var ansible_pipelining to False 44071 1727204739.65876: Set connection var ansible_shell_type to sh 44071 1727204739.65878: Set connection var ansible_shell_executable to /bin/sh 44071 1727204739.65881: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204739.65884: variable 'ansible_shell_executable' from source: unknown 44071 1727204739.65887: variable 'ansible_connection' from source: unknown 44071 1727204739.65889: variable 'ansible_module_compression' from source: unknown 44071 1727204739.65891: variable 'ansible_shell_type' from source: unknown 44071 1727204739.65894: variable 'ansible_shell_executable' from source: unknown 44071 1727204739.65896: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204739.65898: variable 'ansible_pipelining' from source: unknown 44071 1727204739.65901: variable 'ansible_timeout' from source: unknown 44071 1727204739.65904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204739.65983: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 44071 1727204739.65996: variable 'omit' from source: magic vars 44071 1727204739.66000: starting attempt loop 44071 1727204739.66003: running the handler 44071 1727204739.66054: handler run complete 44071 1727204739.66072: attempt loop complete, returning result 44071 1727204739.66081: _execute() done 44071 1727204739.66084: dumping result to json 44071 1727204739.66087: done dumping result, returning 44071 1727204739.66090: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-c964-7471-0000000024a5] 44071 1727204739.66095: sending task result for task 127b8e07-fff9-c964-7471-0000000024a5 44071 1727204739.66458: done sending task result for task 127b8e07-fff9-c964-7471-0000000024a5 44071 1727204739.66461: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 44071 1727204739.66531: no more pending results, returning what we have 44071 1727204739.66541: results queue empty 44071 1727204739.66542: checking for any_errors_fatal 44071 1727204739.66551: done checking for any_errors_fatal 44071 1727204739.66552: checking for max_fail_percentage 44071 1727204739.66553: done checking for max_fail_percentage 44071 1727204739.66554: checking to see if all hosts have failed and the running result is not ok 44071 1727204739.66555: done checking to see if all hosts have failed 44071 1727204739.66556: getting the remaining hosts for this loop 44071 1727204739.66557: done getting the remaining hosts for this loop 44071 1727204739.66563: getting the next task for host managed-node2 44071 1727204739.66574: done getting next task for host managed-node2 44071 1727204739.66578: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204739.66584: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204739.66598: getting variables 44071 1727204739.66600: in VariableManager get_vars() 44071 1727204739.66649: Calling all_inventory to load vars for managed-node2 44071 1727204739.66653: Calling groups_inventory to load vars for managed-node2 44071 1727204739.66655: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204739.66670: Calling all_plugins_play to load vars for managed-node2 44071 1727204739.66673: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204739.66676: Calling groups_plugins_play to load vars for managed-node2 44071 1727204739.68697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204739.71083: done with get_vars() 44071 1727204739.71129: done getting variables 44071 1727204739.71206: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:05:39 -0400 (0:00:00.078) 0:02:32.029 ***** 44071 1727204739.71255: entering _queue_task() for managed-node2/fail 44071 1727204739.71683: worker is 1 (out of 1 available) 44071 1727204739.71699: exiting _queue_task() for managed-node2/fail 44071 1727204739.71714: done queuing things up, now waiting for results queue to drain 44071 1727204739.71715: waiting for pending results... 
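The "Print network provider" task whose result appears above is a plain debug action: the trace shows network_provider coming from set_fact and the message "Using network provider: nm". A hedged reconstruction of a task that would emit that line, with the variable supplied inline here only so the sketch runs on its own:

---
# Sketch of a debug task that would emit "Using network provider: nm".
# In the role, network_provider is set earlier from facts; the value below is assumed.
- hosts: managed-node2
  gather_facts: false
  vars:
    network_provider: nm
  tasks:
    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"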
44071 1727204739.72070: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204739.72279: in run() - task 127b8e07-fff9-c964-7471-0000000024a6 44071 1727204739.72310: variable 'ansible_search_path' from source: unknown 44071 1727204739.72319: variable 'ansible_search_path' from source: unknown 44071 1727204739.72370: calling self._execute() 44071 1727204739.72496: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204739.72510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204739.72531: variable 'omit' from source: magic vars 44071 1727204739.72998: variable 'ansible_distribution_major_version' from source: facts 44071 1727204739.73070: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204739.73178: variable 'network_state' from source: role '' defaults 44071 1727204739.73196: Evaluated conditional (network_state != {}): False 44071 1727204739.73204: when evaluation is False, skipping this task 44071 1727204739.73212: _execute() done 44071 1727204739.73221: dumping result to json 44071 1727204739.73229: done dumping result, returning 44071 1727204739.73242: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-c964-7471-0000000024a6] 44071 1727204739.73285: sending task result for task 127b8e07-fff9-c964-7471-0000000024a6 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204739.73532: no more pending results, returning what we have 44071 1727204739.73537: results queue empty 44071 1727204739.73538: checking for any_errors_fatal 44071 1727204739.73549: done checking for any_errors_fatal 44071 1727204739.73550: checking for max_fail_percentage 44071 1727204739.73552: done checking for max_fail_percentage 44071 1727204739.73553: checking to see if all hosts have failed and the running result is not ok 44071 1727204739.73554: done checking to see if all hosts have failed 44071 1727204739.73555: getting the remaining hosts for this loop 44071 1727204739.73557: done getting the remaining hosts for this loop 44071 1727204739.73562: getting the next task for host managed-node2 44071 1727204739.73579: done getting next task for host managed-node2 44071 1727204739.73584: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204739.73592: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204739.73630: getting variables 44071 1727204739.73632: in VariableManager get_vars() 44071 1727204739.73812: Calling all_inventory to load vars for managed-node2 44071 1727204739.73816: Calling groups_inventory to load vars for managed-node2 44071 1727204739.73818: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204739.73880: Calling all_plugins_play to load vars for managed-node2 44071 1727204739.73884: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204739.73887: Calling groups_plugins_play to load vars for managed-node2 44071 1727204739.73905: done sending task result for task 127b8e07-fff9-c964-7471-0000000024a6 44071 1727204739.73908: WORKER PROCESS EXITING 44071 1727204739.77182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204739.81371: done with get_vars() 44071 1727204739.81424: done getting variables 44071 1727204739.81502: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:05:39 -0400 (0:00:00.102) 0:02:32.131 ***** 44071 1727204739.81546: entering _queue_task() for managed-node2/fail 44071 1727204739.82105: worker is 1 (out of 1 available) 44071 1727204739.82120: exiting _queue_task() for managed-node2/fail 44071 1727204739.82134: done queuing things up, now waiting for results queue to drain 44071 1727204739.82136: waiting for pending results... 
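The "Abort applying the network state configuration ..." task above skips because its guard, network_state != {}, evaluates to False; the trace shows network_state coming from the role defaults, so it is an empty dict unless the caller sets it. A minimal, hypothetical example of that guard pattern (the fail message is illustrative, not the role's wording):

---
# Illustrates the skip seen above: a fail task guarded by "network_state != {}".
# With the default empty dict the task skips and the result records false_condition.
- hosts: managed-node2
  gather_facts: false
  vars:
    network_state: {}   # role default per the trace; set a non-empty dict to trigger the fail
  tasks:
    - name: Abort when network_state is used with an unsupported provider
      ansible.builtin.fail:
        msg: "Applying network_state is not supported in this configuration"
      when: network_state != {}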
44071 1727204739.82409: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204739.82621: in run() - task 127b8e07-fff9-c964-7471-0000000024a7 44071 1727204739.82652: variable 'ansible_search_path' from source: unknown 44071 1727204739.82661: variable 'ansible_search_path' from source: unknown 44071 1727204739.82713: calling self._execute() 44071 1727204739.82859: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204739.82868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204739.82884: variable 'omit' from source: magic vars 44071 1727204739.83376: variable 'ansible_distribution_major_version' from source: facts 44071 1727204739.83404: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204739.83554: variable 'network_state' from source: role '' defaults 44071 1727204739.83574: Evaluated conditional (network_state != {}): False 44071 1727204739.83624: when evaluation is False, skipping this task 44071 1727204739.83629: _execute() done 44071 1727204739.83632: dumping result to json 44071 1727204739.83634: done dumping result, returning 44071 1727204739.83637: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-c964-7471-0000000024a7] 44071 1727204739.83639: sending task result for task 127b8e07-fff9-c964-7471-0000000024a7 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204739.83986: no more pending results, returning what we have 44071 1727204739.83990: results queue empty 44071 1727204739.83991: checking for any_errors_fatal 44071 1727204739.84002: done checking for any_errors_fatal 44071 1727204739.84003: checking for max_fail_percentage 44071 1727204739.84005: done checking for max_fail_percentage 44071 1727204739.84006: checking to see if all hosts have failed and the running result is not ok 44071 1727204739.84007: done checking to see if all hosts have failed 44071 1727204739.84008: getting the remaining hosts for this loop 44071 1727204739.84010: done getting the remaining hosts for this loop 44071 1727204739.84016: getting the next task for host managed-node2 44071 1727204739.84028: done getting next task for host managed-node2 44071 1727204739.84033: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204739.84041: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204739.84189: getting variables 44071 1727204739.84191: in VariableManager get_vars() 44071 1727204739.84246: Calling all_inventory to load vars for managed-node2 44071 1727204739.84249: Calling groups_inventory to load vars for managed-node2 44071 1727204739.84251: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204739.84302: Calling all_plugins_play to load vars for managed-node2 44071 1727204739.84306: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204739.84309: Calling groups_plugins_play to load vars for managed-node2 44071 1727204739.84986: done sending task result for task 127b8e07-fff9-c964-7471-0000000024a7 44071 1727204739.84992: WORKER PROCESS EXITING 44071 1727204739.86646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204739.89247: done with get_vars() 44071 1727204739.89299: done getting variables 44071 1727204739.89373: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:05:39 -0400 (0:00:00.078) 0:02:32.210 ***** 44071 1727204739.89414: entering _queue_task() for managed-node2/fail 44071 1727204739.89849: worker is 1 (out of 1 available) 44071 1727204739.89866: exiting _queue_task() for managed-node2/fail 44071 1727204739.89883: done queuing things up, now waiting for results queue to drain 44071 1727204739.89885: waiting for pending results... 
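Every role task in this excerpt first logs "Evaluated conditional (ansible_distribution_major_version != '6'): True" before evaluating its own condition; that is the ordered multi-clause when: pattern, where each clause is evaluated and logged in turn. A hedged illustration of that layering with placeholder task content:

---
# Shows the ordered-conditional pattern visible throughout the trace:
# the distribution gate is checked first, then the task-specific condition.
- hosts: managed-node2
  gather_facts: true   # needed for ansible_distribution_major_version
  vars:
    network_state: {}
  tasks:
    - name: Example task with a layered when (illustrative)
      ansible.builtin.debug:
        msg: "both conditions held, so the task would run"
      when:
        - ansible_distribution_major_version != '6'
        - network_state != {}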
44071 1727204739.90214: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204739.90388: in run() - task 127b8e07-fff9-c964-7471-0000000024a8 44071 1727204739.90467: variable 'ansible_search_path' from source: unknown 44071 1727204739.90472: variable 'ansible_search_path' from source: unknown 44071 1727204739.90476: calling self._execute() 44071 1727204739.90578: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204739.90584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204739.90593: variable 'omit' from source: magic vars 44071 1727204739.91046: variable 'ansible_distribution_major_version' from source: facts 44071 1727204739.91059: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204739.91338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204739.93885: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204739.93975: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204739.94062: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204739.94068: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204739.94071: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204739.94163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204739.94271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204739.94275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204739.94287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204739.94302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204739.94424: variable 'ansible_distribution_major_version' from source: facts 44071 1727204739.94445: Evaluated conditional (ansible_distribution_major_version | int > 9): True 44071 1727204739.94584: variable 'ansible_distribution' from source: facts 44071 1727204739.94588: variable '__network_rh_distros' from source: role '' defaults 44071 1727204739.94599: Evaluated conditional (ansible_distribution in __network_rh_distros): False 44071 1727204739.94602: when evaluation is False, skipping this task 44071 1727204739.94605: _execute() done 44071 1727204739.94607: dumping result to json 44071 1727204739.94624: done dumping result, returning 44071 1727204739.94627: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-c964-7471-0000000024a8] 44071 1727204739.94630: sending task result for task 127b8e07-fff9-c964-7471-0000000024a8 44071 1727204739.94995: done sending task result for task 127b8e07-fff9-c964-7471-0000000024a8 44071 1727204739.94998: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 44071 1727204739.95043: no more pending results, returning what we have 44071 1727204739.95046: results queue empty 44071 1727204739.95048: checking for any_errors_fatal 44071 1727204739.95053: done checking for any_errors_fatal 44071 1727204739.95053: checking for max_fail_percentage 44071 1727204739.95055: done checking for max_fail_percentage 44071 1727204739.95056: checking to see if all hosts have failed and the running result is not ok 44071 1727204739.95057: done checking to see if all hosts have failed 44071 1727204739.95057: getting the remaining hosts for this loop 44071 1727204739.95058: done getting the remaining hosts for this loop 44071 1727204739.95062: getting the next task for host managed-node2 44071 1727204739.95072: done getting next task for host managed-node2 44071 1727204739.95076: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204739.95082: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204739.95113: getting variables 44071 1727204739.95115: in VariableManager get_vars() 44071 1727204739.95160: Calling all_inventory to load vars for managed-node2 44071 1727204739.95163: Calling groups_inventory to load vars for managed-node2 44071 1727204739.95169: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204739.95180: Calling all_plugins_play to load vars for managed-node2 44071 1727204739.95183: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204739.95186: Calling groups_plugins_play to load vars for managed-node2 44071 1727204739.97197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204739.99569: done with get_vars() 44071 1727204739.99620: done getting variables 44071 1727204739.99689: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:05:39 -0400 (0:00:00.103) 0:02:32.313 ***** 44071 1727204739.99725: entering _queue_task() for managed-node2/dnf 44071 1727204740.00292: worker is 1 (out of 1 available) 44071 1727204740.00310: exiting _queue_task() for managed-node2/dnf 44071 1727204740.00325: done queuing things up, now waiting for results queue to drain 44071 1727204740.00327: waiting for pending results... 
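The teaming abort above combines a version check (ansible_distribution_major_version | int > 9, True on this host) with a distro check (ansible_distribution in __network_rh_distros, False here, so the task skips). A hedged reconstruction of that guard; the contents of the distro list are an assumed stand-in, not copied from the role defaults:

---
# Reconstruction of the EL10-or-later teaming guard seen above.
# __network_rh_distros is a role default in the real role; this list is assumed.
- hosts: managed-node2
  gather_facts: true
  vars:
    __network_rh_distros: [RedHat, CentOS, Rocky, AlmaLinux]   # assumed values
  tasks:
    - name: Abort applying teaming configuration on EL10 or later
      ansible.builtin.fail:
        msg: "Team interfaces are not supported on this distribution and version"   # illustrative
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros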
44071 1727204740.00678: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204740.00892: in run() - task 127b8e07-fff9-c964-7471-0000000024a9 44071 1727204740.00918: variable 'ansible_search_path' from source: unknown 44071 1727204740.00927: variable 'ansible_search_path' from source: unknown 44071 1727204740.00983: calling self._execute() 44071 1727204740.01287: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204740.01292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204740.01295: variable 'omit' from source: magic vars 44071 1727204740.01687: variable 'ansible_distribution_major_version' from source: facts 44071 1727204740.01712: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204740.01950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204740.04707: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204740.04798: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204740.04847: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204740.04895: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204740.04966: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204740.05024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204740.05080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204740.05115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204740.05164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204740.05193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204740.05372: variable 'ansible_distribution' from source: facts 44071 1727204740.05375: variable 'ansible_distribution_major_version' from source: facts 44071 1727204740.05378: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 44071 1727204740.05501: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204740.05661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204740.05728: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204740.05731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204740.05780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204740.05801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204740.05859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204740.05892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204740.05945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204740.05979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204740.05998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204740.06054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204740.06163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204740.06168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204740.06172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204740.06181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204740.06376: variable 'network_connections' from source: include params 44071 1727204740.06399: variable 'interface' from source: play vars 44071 1727204740.06478: variable 'interface' from source: play vars 44071 1727204740.06571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204740.06776: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204740.06828: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204740.06930: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204740.06933: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204740.06958: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204740.06989: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204740.07032: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204740.07270: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204740.07274: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204740.07430: variable 'network_connections' from source: include params 44071 1727204740.07442: variable 'interface' from source: play vars 44071 1727204740.07518: variable 'interface' from source: play vars 44071 1727204740.07547: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204740.07556: when evaluation is False, skipping this task 44071 1727204740.07562: _execute() done 44071 1727204740.07572: dumping result to json 44071 1727204740.07579: done dumping result, returning 44071 1727204740.07591: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-0000000024a9] 44071 1727204740.07603: sending task result for task 127b8e07-fff9-c964-7471-0000000024a9 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204740.07787: no more pending results, returning what we have 44071 1727204740.07790: results queue empty 44071 1727204740.07791: checking for any_errors_fatal 44071 1727204740.07798: done checking for any_errors_fatal 44071 1727204740.07799: checking for max_fail_percentage 44071 1727204740.07801: done checking for max_fail_percentage 44071 1727204740.07802: checking to see if all hosts have failed and the running result is not ok 44071 1727204740.07803: done checking to see if all hosts have failed 44071 1727204740.07803: getting the remaining hosts for this loop 44071 1727204740.07805: done getting the remaining hosts for this loop 44071 1727204740.07810: getting the next task for host managed-node2 44071 1727204740.07818: done getting next task for host managed-node2 44071 1727204740.07823: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204740.07828: ^ state is: 
HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204740.07870: getting variables 44071 1727204740.07872: in VariableManager get_vars() 44071 1727204740.07922: Calling all_inventory to load vars for managed-node2 44071 1727204740.07924: Calling groups_inventory to load vars for managed-node2 44071 1727204740.07927: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204740.07941: Calling all_plugins_play to load vars for managed-node2 44071 1727204740.07944: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204740.07948: Calling groups_plugins_play to load vars for managed-node2 44071 1727204740.08476: done sending task result for task 127b8e07-fff9-c964-7471-0000000024a9 44071 1727204740.08481: WORKER PROCESS EXITING 44071 1727204740.10005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204740.12474: done with get_vars() 44071 1727204740.12523: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204740.12617: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:05:40 -0400 (0:00:00.129) 0:02:32.443 ***** 44071 1727204740.12663: entering _queue_task() for managed-node2/yum 44071 1727204740.13119: worker is 1 (out of 1 available) 44071 1727204740.13136: exiting _queue_task() for managed-node2/yum 44071 1727204740.13150: done queuing things up, now waiting for results queue to drain 44071 1727204740.13152: waiting for pending results... 
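The DNF update check above skips because neither __network_wireless_connections_defined nor __network_team_connections_defined is true for the single network_connections entry built from the play's interface variable. The sketch below shows one plausible way such flags can be derived and used as a gate; the selectattr expressions and the interface name are illustrative assumptions, not the role's exact Jinja:

---
# Illustrative derivation of the wireless/team flags that gate the update-check tasks.
- hosts: managed-node2
  gather_facts: false
  vars:
    interface: ethtest0   # assumed; the real name comes from the play vars
    network_connections:
      - name: "{{ interface }}"
        type: ethernet
        state: up
  tasks:
    - name: Derive the wireless/team flags used by the gate (illustrative)
      ansible.builtin.set_fact:
        __network_wireless_connections_defined: "{{ network_connections | selectattr('type', 'defined') | selectattr('type', 'equalto', 'wireless') | list | length > 0 }}"
        __network_team_connections_defined: "{{ network_connections | selectattr('type', 'defined') | selectattr('type', 'equalto', 'team') | list | length > 0 }}"

    - name: Check if updates for network packages are needed (illustrative gate)
      ansible.builtin.debug:
        msg: "a dnf update check would run here"
      when: __network_wireless_connections_defined | bool or __network_team_connections_defined | bool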
44071 1727204740.13539: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204740.13729: in run() - task 127b8e07-fff9-c964-7471-0000000024aa 44071 1727204740.13749: variable 'ansible_search_path' from source: unknown 44071 1727204740.13754: variable 'ansible_search_path' from source: unknown 44071 1727204740.13798: calling self._execute() 44071 1727204740.13927: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204740.13935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204740.13949: variable 'omit' from source: magic vars 44071 1727204740.14416: variable 'ansible_distribution_major_version' from source: facts 44071 1727204740.14434: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204740.14656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204740.20461: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204740.20664: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204740.20708: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204740.20759: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204740.20892: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204740.21172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204740.21175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204740.21178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204740.21322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204740.21340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204740.21570: variable 'ansible_distribution_major_version' from source: facts 44071 1727204740.21651: Evaluated conditional (ansible_distribution_major_version | int < 8): False 44071 1727204740.21655: when evaluation is False, skipping this task 44071 1727204740.21658: _execute() done 44071 1727204740.21661: dumping result to json 44071 1727204740.21663: done dumping result, returning 44071 1727204740.21675: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-0000000024aa] 44071 
1727204740.21678: sending task result for task 127b8e07-fff9-c964-7471-0000000024aa 44071 1727204740.21805: done sending task result for task 127b8e07-fff9-c964-7471-0000000024aa 44071 1727204740.21809: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 44071 1727204740.21875: no more pending results, returning what we have 44071 1727204740.21879: results queue empty 44071 1727204740.21880: checking for any_errors_fatal 44071 1727204740.21889: done checking for any_errors_fatal 44071 1727204740.21890: checking for max_fail_percentage 44071 1727204740.21892: done checking for max_fail_percentage 44071 1727204740.21893: checking to see if all hosts have failed and the running result is not ok 44071 1727204740.21894: done checking to see if all hosts have failed 44071 1727204740.21894: getting the remaining hosts for this loop 44071 1727204740.21896: done getting the remaining hosts for this loop 44071 1727204740.21902: getting the next task for host managed-node2 44071 1727204740.21911: done getting next task for host managed-node2 44071 1727204740.21917: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204740.21922: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204740.21959: getting variables 44071 1727204740.21961: in VariableManager get_vars() 44071 1727204740.22223: Calling all_inventory to load vars for managed-node2 44071 1727204740.22226: Calling groups_inventory to load vars for managed-node2 44071 1727204740.22229: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204740.22244: Calling all_plugins_play to load vars for managed-node2 44071 1727204740.22248: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204740.22251: Calling groups_plugins_play to load vars for managed-node2 44071 1727204740.25803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204740.29123: done with get_vars() 44071 1727204740.29175: done getting variables 44071 1727204740.29259: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:05:40 -0400 (0:00:00.166) 0:02:32.609 ***** 44071 1727204740.29303: entering _queue_task() for managed-node2/fail 44071 1727204740.29744: worker is 1 (out of 1 available) 44071 1727204740.29763: exiting _queue_task() for managed-node2/fail 44071 1727204740.29782: done queuing things up, now waiting for results queue to drain 44071 1727204740.29784: waiting for pending results... 
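The YUM variant of the update check is the legacy path: on this host it skips because ansible_distribution_major_version | int < 8 is False, and the trace also shows Ansible redirecting the ansible.builtin.yum action to ansible.builtin.dnf. A hedged sketch of that version split, using check mode to test for available updates without changing the host; the package name is an assumption:

---
# Sketch of the DNF/YUM split seen above: the yum task only applies below EL8,
# and on newer hosts ansible.builtin.yum is redirected to ansible.builtin.dnf anyway.
- hosts: managed-node2
  gather_facts: true
  tasks:
    - name: Check for NetworkManager updates via dnf (EL8 and later, illustrative)
      ansible.builtin.dnf:
        name: NetworkManager
        state: latest
      check_mode: true
      when: ansible_distribution_major_version | int >= 8

    - name: Check for NetworkManager updates via yum (EL7 and older, illustrative)
      ansible.builtin.yum:
        name: NetworkManager
        state: latest
      check_mode: true
      when: ansible_distribution_major_version | int < 8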
44071 1727204740.30300: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204740.30307: in run() - task 127b8e07-fff9-c964-7471-0000000024ab 44071 1727204740.30310: variable 'ansible_search_path' from source: unknown 44071 1727204740.30312: variable 'ansible_search_path' from source: unknown 44071 1727204740.30373: calling self._execute() 44071 1727204740.30460: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204740.30464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204740.30501: variable 'omit' from source: magic vars 44071 1727204740.30927: variable 'ansible_distribution_major_version' from source: facts 44071 1727204740.31070: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204740.31074: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204740.31411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204740.34768: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204740.34870: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204740.34974: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204740.34979: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204740.34982: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204740.35048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204740.35100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204740.35124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204740.35171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204740.35191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204740.35242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204740.35268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204740.35315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204740.35334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204740.35414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204740.35448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204740.35637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204740.35735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204740.35739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204740.35744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204740.36275: variable 'network_connections' from source: include params 44071 1727204740.36280: variable 'interface' from source: play vars 44071 1727204740.36410: variable 'interface' from source: play vars 44071 1727204740.36503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204740.36993: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204740.37042: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204740.37277: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204740.37310: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204740.37369: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204740.37393: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204740.37419: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204740.37449: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204740.37711: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204740.38213: variable 'network_connections' 
from source: include params 44071 1727204740.38219: variable 'interface' from source: play vars 44071 1727204740.38506: variable 'interface' from source: play vars 44071 1727204740.38535: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204740.38541: when evaluation is False, skipping this task 44071 1727204740.38544: _execute() done 44071 1727204740.38549: dumping result to json 44071 1727204740.38551: done dumping result, returning 44071 1727204740.38564: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-0000000024ab] 44071 1727204740.38572: sending task result for task 127b8e07-fff9-c964-7471-0000000024ab skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204740.38869: no more pending results, returning what we have 44071 1727204740.38873: results queue empty 44071 1727204740.38874: checking for any_errors_fatal 44071 1727204740.38883: done checking for any_errors_fatal 44071 1727204740.38884: checking for max_fail_percentage 44071 1727204740.38885: done checking for max_fail_percentage 44071 1727204740.38886: checking to see if all hosts have failed and the running result is not ok 44071 1727204740.38887: done checking to see if all hosts have failed 44071 1727204740.38888: getting the remaining hosts for this loop 44071 1727204740.38889: done getting the remaining hosts for this loop 44071 1727204740.38895: getting the next task for host managed-node2 44071 1727204740.38903: done getting next task for host managed-node2 44071 1727204740.38908: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 44071 1727204740.38918: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204740.38952: getting variables 44071 1727204740.38954: in VariableManager get_vars() 44071 1727204740.39069: Calling all_inventory to load vars for managed-node2 44071 1727204740.39073: Calling groups_inventory to load vars for managed-node2 44071 1727204740.39076: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204740.39088: Calling all_plugins_play to load vars for managed-node2 44071 1727204740.39091: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204740.39095: Calling groups_plugins_play to load vars for managed-node2 44071 1727204740.39987: done sending task result for task 127b8e07-fff9-c964-7471-0000000024ab 44071 1727204740.39992: WORKER PROCESS EXITING 44071 1727204740.43523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204740.46135: done with get_vars() 44071 1727204740.46185: done getting variables 44071 1727204740.46264: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:05:40 -0400 (0:00:00.170) 0:02:32.779 ***** 44071 1727204740.46308: entering _queue_task() for managed-node2/package 44071 1727204740.46934: worker is 1 (out of 1 available) 44071 1727204740.46949: exiting _queue_task() for managed-node2/package 44071 1727204740.46963: done queuing things up, now waiting for results queue to drain 44071 1727204740.46965: waiting for pending results... 
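
The consent task above is skipped entirely by its conditional: both __network_wireless_connections_defined and __network_team_connections_defined resolve to false, so the expression reported as false_condition never lets the task body run and the host is not contacted. A minimal sketch of a task gated that way, assuming a pause-style prompt (only the task name and the when: expression are taken from the log; the module choice and prompt wording are illustrative guesses, not the role's source):

    # Hypothetical reconstruction -- only the when: expression is taken from the log above.
    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.pause:
        prompt: "NetworkManager will be restarted; press Enter to continue"   # assumed wording
      when: __network_wireless_connections_defined or __network_team_connections_defined

With the expression false, Ansible emits skip_reason "Conditional result was False" exactly as shown and moves straight on to the next task.
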
44071 1727204740.47257: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 44071 1727204740.47467: in run() - task 127b8e07-fff9-c964-7471-0000000024ac 44071 1727204740.47504: variable 'ansible_search_path' from source: unknown 44071 1727204740.47513: variable 'ansible_search_path' from source: unknown 44071 1727204740.47567: calling self._execute() 44071 1727204740.47711: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204740.47727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204740.47741: variable 'omit' from source: magic vars 44071 1727204740.48226: variable 'ansible_distribution_major_version' from source: facts 44071 1727204740.48256: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204740.48509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204740.48885: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204740.49030: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204740.49075: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204740.49209: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204740.49379: variable 'network_packages' from source: role '' defaults 44071 1727204740.49520: variable '__network_provider_setup' from source: role '' defaults 44071 1727204740.49542: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204740.49632: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204740.49648: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204740.49726: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204740.49962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204740.53527: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204740.53614: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204740.53736: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204740.53741: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204740.53749: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204740.53855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204740.53897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204740.53952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204740.53992: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204740.54012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204740.54168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204740.54172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204740.54176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204740.54195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204740.54215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204740.54512: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204740.54647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204740.54730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204740.54735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204740.54768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204740.54839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204740.54904: variable 'ansible_python' from source: facts 44071 1727204740.54946: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204740.55044: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204740.55162: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204740.55395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204740.55398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 44071 1727204740.55402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204740.55404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204740.55520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204740.55523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204740.55535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204740.55538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204740.55555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204740.55576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204740.55975: variable 'network_connections' from source: include params 44071 1727204740.55978: variable 'interface' from source: play vars 44071 1727204740.55981: variable 'interface' from source: play vars 44071 1727204740.55984: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204740.55986: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204740.56039: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204740.56084: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204740.56104: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204740.56433: variable 'network_connections' from source: include params 44071 1727204740.56439: variable 'interface' from source: play vars 44071 1727204740.56569: variable 'interface' from source: play vars 44071 1727204740.56590: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204740.56686: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204740.57033: variable 'network_connections' from source: include params 44071 
1727204740.57042: variable 'interface' from source: play vars 44071 1727204740.57111: variable 'interface' from source: play vars 44071 1727204740.57140: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204740.57222: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204740.57561: variable 'network_connections' from source: include params 44071 1727204740.57566: variable 'interface' from source: play vars 44071 1727204740.57634: variable 'interface' from source: play vars 44071 1727204740.57695: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204740.57758: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204740.57766: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204740.57826: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204740.58071: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204740.58800: variable 'network_connections' from source: include params 44071 1727204740.58808: variable 'interface' from source: play vars 44071 1727204740.58888: variable 'interface' from source: play vars 44071 1727204740.58901: variable 'ansible_distribution' from source: facts 44071 1727204740.58904: variable '__network_rh_distros' from source: role '' defaults 44071 1727204740.58951: variable 'ansible_distribution_major_version' from source: facts 44071 1727204740.58954: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204740.59166: variable 'ansible_distribution' from source: facts 44071 1727204740.59171: variable '__network_rh_distros' from source: role '' defaults 44071 1727204740.59174: variable 'ansible_distribution_major_version' from source: facts 44071 1727204740.59177: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204740.59316: variable 'ansible_distribution' from source: facts 44071 1727204740.59319: variable '__network_rh_distros' from source: role '' defaults 44071 1727204740.59325: variable 'ansible_distribution_major_version' from source: facts 44071 1727204740.59370: variable 'network_provider' from source: set_fact 44071 1727204740.59390: variable 'ansible_facts' from source: unknown 44071 1727204740.60423: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 44071 1727204740.60428: when evaluation is False, skipping this task 44071 1727204740.60431: _execute() done 44071 1727204740.60433: dumping result to json 44071 1727204740.60435: done dumping result, returning 44071 1727204740.60497: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-c964-7471-0000000024ac] 44071 1727204740.60502: sending task result for task 127b8e07-fff9-c964-7471-0000000024ac skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 44071 1727204740.60850: no more pending results, returning what we have 44071 1727204740.60854: results queue empty 44071 1727204740.60854: checking for any_errors_fatal 44071 1727204740.60860: done checking for any_errors_fatal 44071 1727204740.60861: checking for max_fail_percentage 44071 1727204740.60863: done checking for max_fail_percentage 44071 
1727204740.60864: checking to see if all hosts have failed and the running result is not ok 44071 1727204740.60864: done checking to see if all hosts have failed 44071 1727204740.60867: getting the remaining hosts for this loop 44071 1727204740.60868: done getting the remaining hosts for this loop 44071 1727204740.60872: getting the next task for host managed-node2 44071 1727204740.60880: done getting next task for host managed-node2 44071 1727204740.60885: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204740.60890: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204740.60942: getting variables 44071 1727204740.60945: in VariableManager get_vars() 44071 1727204740.61054: Calling all_inventory to load vars for managed-node2 44071 1727204740.61057: Calling groups_inventory to load vars for managed-node2 44071 1727204740.61091: done sending task result for task 127b8e07-fff9-c964-7471-0000000024ac 44071 1727204740.61105: WORKER PROCESS EXITING 44071 1727204740.61100: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204740.61117: Calling all_plugins_play to load vars for managed-node2 44071 1727204740.61120: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204740.61123: Calling groups_plugins_play to load vars for managed-node2 44071 1727204740.63308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204740.66006: done with get_vars() 44071 1727204740.66062: done getting variables 44071 1727204740.66136: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:05:40 -0400 (0:00:00.198) 0:02:32.978 ***** 44071 1727204740.66181: entering _queue_task() for managed-node2/package 44071 1727204740.66834: worker is 1 (out of 1 available) 44071 1727204740.66848: exiting _queue_task() for managed-node2/package 44071 
1727204740.66861: done queuing things up, now waiting for results queue to drain 44071 1727204740.66863: waiting for pending results... 44071 1727204740.67155: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204740.67427: in run() - task 127b8e07-fff9-c964-7471-0000000024ad 44071 1727204740.67431: variable 'ansible_search_path' from source: unknown 44071 1727204740.67434: variable 'ansible_search_path' from source: unknown 44071 1727204740.67471: calling self._execute() 44071 1727204740.67598: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204740.67612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204740.67642: variable 'omit' from source: magic vars 44071 1727204740.68132: variable 'ansible_distribution_major_version' from source: facts 44071 1727204740.68184: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204740.68307: variable 'network_state' from source: role '' defaults 44071 1727204740.68332: Evaluated conditional (network_state != {}): False 44071 1727204740.68342: when evaluation is False, skipping this task 44071 1727204740.68401: _execute() done 44071 1727204740.68405: dumping result to json 44071 1727204740.68408: done dumping result, returning 44071 1727204740.68410: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-c964-7471-0000000024ad] 44071 1727204740.68413: sending task result for task 127b8e07-fff9-c964-7471-0000000024ad skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204740.68588: no more pending results, returning what we have 44071 1727204740.68592: results queue empty 44071 1727204740.68594: checking for any_errors_fatal 44071 1727204740.68604: done checking for any_errors_fatal 44071 1727204740.68605: checking for max_fail_percentage 44071 1727204740.68606: done checking for max_fail_percentage 44071 1727204740.68607: checking to see if all hosts have failed and the running result is not ok 44071 1727204740.68608: done checking to see if all hosts have failed 44071 1727204740.68609: getting the remaining hosts for this loop 44071 1727204740.68611: done getting the remaining hosts for this loop 44071 1727204740.68617: getting the next task for host managed-node2 44071 1727204740.68627: done getting next task for host managed-node2 44071 1727204740.68633: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204740.68642: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204740.68684: getting variables 44071 1727204740.68686: in VariableManager get_vars() 44071 1727204740.68742: Calling all_inventory to load vars for managed-node2 44071 1727204740.68745: Calling groups_inventory to load vars for managed-node2 44071 1727204740.68748: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204740.68763: Calling all_plugins_play to load vars for managed-node2 44071 1727204740.68988: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204740.68996: done sending task result for task 127b8e07-fff9-c964-7471-0000000024ad 44071 1727204740.68999: WORKER PROCESS EXITING 44071 1727204740.69003: Calling groups_plugins_play to load vars for managed-node2 44071 1727204740.71351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204740.73907: done with get_vars() 44071 1727204740.73963: done getting variables 44071 1727204740.74044: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:05:40 -0400 (0:00:00.079) 0:02:33.057 ***** 44071 1727204740.74087: entering _queue_task() for managed-node2/package 44071 1727204740.74591: worker is 1 (out of 1 available) 44071 1727204740.74606: exiting _queue_task() for managed-node2/package 44071 1727204740.74620: done queuing things up, now waiting for results queue to drain 44071 1727204740.74622: waiting for pending results... 
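
Two package tasks have now been skipped in a row. "Install packages" is gated on the subset test: the resolved network_packages list is compared against the keys of ansible_facts.packages (gathered earlier in the run), and the task only runs when something on the list is missing. A sketch of that shape, assuming the usual name/state arguments (only the task name, the package action and the when: expression come from the log; the argument layout is not the role's verbatim source):

    # Sketch based on the logged package action and false_condition; arguments are assumptions.
    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"   # role default resolved in the log above
        state: present                   # assumed
      when: not network_packages is subset(ansible_facts.packages.keys())

Here the condition evaluated to False, i.e. every required package already appears in ansible_facts.packages, so no package transaction is attempted on managed-node2.
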
44071 1727204740.74934: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204740.75129: in run() - task 127b8e07-fff9-c964-7471-0000000024ae 44071 1727204740.75154: variable 'ansible_search_path' from source: unknown 44071 1727204740.75161: variable 'ansible_search_path' from source: unknown 44071 1727204740.75218: calling self._execute() 44071 1727204740.75347: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204740.75360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204740.75376: variable 'omit' from source: magic vars 44071 1727204740.75845: variable 'ansible_distribution_major_version' from source: facts 44071 1727204740.75879: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204740.76076: variable 'network_state' from source: role '' defaults 44071 1727204740.76083: Evaluated conditional (network_state != {}): False 44071 1727204740.76086: when evaluation is False, skipping this task 44071 1727204740.76089: _execute() done 44071 1727204740.76092: dumping result to json 44071 1727204740.76094: done dumping result, returning 44071 1727204740.76097: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-c964-7471-0000000024ae] 44071 1727204740.76099: sending task result for task 127b8e07-fff9-c964-7471-0000000024ae 44071 1727204740.76475: done sending task result for task 127b8e07-fff9-c964-7471-0000000024ae 44071 1727204740.76479: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204740.76538: no more pending results, returning what we have 44071 1727204740.76542: results queue empty 44071 1727204740.76543: checking for any_errors_fatal 44071 1727204740.76553: done checking for any_errors_fatal 44071 1727204740.76554: checking for max_fail_percentage 44071 1727204740.76556: done checking for max_fail_percentage 44071 1727204740.76557: checking to see if all hosts have failed and the running result is not ok 44071 1727204740.76558: done checking to see if all hosts have failed 44071 1727204740.76559: getting the remaining hosts for this loop 44071 1727204740.76561: done getting the remaining hosts for this loop 44071 1727204740.76568: getting the next task for host managed-node2 44071 1727204740.76578: done getting next task for host managed-node2 44071 1727204740.76583: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204740.76595: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204740.76630: getting variables 44071 1727204740.76632: in VariableManager get_vars() 44071 1727204740.76797: Calling all_inventory to load vars for managed-node2 44071 1727204740.76800: Calling groups_inventory to load vars for managed-node2 44071 1727204740.76803: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204740.76820: Calling all_plugins_play to load vars for managed-node2 44071 1727204740.76824: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204740.76827: Calling groups_plugins_play to load vars for managed-node2 44071 1727204740.79083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204740.81556: done with get_vars() 44071 1727204740.81611: done getting variables 44071 1727204740.81682: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:05:40 -0400 (0:00:00.076) 0:02:33.133 ***** 44071 1727204740.81724: entering _queue_task() for managed-node2/service 44071 1727204740.82151: worker is 1 (out of 1 available) 44071 1727204740.82169: exiting _queue_task() for managed-node2/service 44071 1727204740.82184: done queuing things up, now waiting for results queue to drain 44071 1727204740.82186: waiting for pending results... 
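
The two nmstate-related install tasks are both gated on network_state, which the log shows coming from the role defaults as an empty mapping, so they only fire when the caller supplies a declarative network_state instead of network_connections. A sketch under that assumption (package name inferred from the task title; arguments assumed, not the role's verbatim source):

    # Sketch -- only the task name, the package action and the when: expression come from the log.
    - name: Install python3-libnmstate when using network_state variable
      ansible.builtin.package:
        name: python3-libnmstate   # inferred from the task title
        state: present             # assumed
      when: network_state != {}

Since this run drives everything through network_connections, network_state stays {} and both tasks are skipped without contacting the host.
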
44071 1727204740.82460: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204740.82579: in run() - task 127b8e07-fff9-c964-7471-0000000024af 44071 1727204740.82593: variable 'ansible_search_path' from source: unknown 44071 1727204740.82598: variable 'ansible_search_path' from source: unknown 44071 1727204740.82631: calling self._execute() 44071 1727204740.82720: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204740.82725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204740.82738: variable 'omit' from source: magic vars 44071 1727204740.83061: variable 'ansible_distribution_major_version' from source: facts 44071 1727204740.83075: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204740.83169: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204740.83330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204740.86568: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204740.86860: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204740.86867: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204740.86870: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204740.86873: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204740.87188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204740.87251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204740.87295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204740.87339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204740.87358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204740.87417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204740.87447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204740.87480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 44071 1727204740.87519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204740.87536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204740.87591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204740.87619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204740.87645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204740.87691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204740.87707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204740.87946: variable 'network_connections' from source: include params 44071 1727204740.87951: variable 'interface' from source: play vars 44071 1727204740.88025: variable 'interface' from source: play vars 44071 1727204740.88136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204740.88314: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204740.88359: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204740.88443: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204740.88446: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204740.88486: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204740.88546: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204740.88584: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204740.88626: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204740.88712: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204740.89196: variable 'network_connections' from source: include params 44071 1727204740.89209: variable 'interface' 
from source: play vars 44071 1727204740.89320: variable 'interface' from source: play vars 44071 1727204740.89349: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204740.89357: when evaluation is False, skipping this task 44071 1727204740.89375: _execute() done 44071 1727204740.89429: dumping result to json 44071 1727204740.89433: done dumping result, returning 44071 1727204740.89436: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-0000000024af] 44071 1727204740.89438: sending task result for task 127b8e07-fff9-c964-7471-0000000024af skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204740.89729: no more pending results, returning what we have 44071 1727204740.89734: results queue empty 44071 1727204740.89735: checking for any_errors_fatal 44071 1727204740.89742: done checking for any_errors_fatal 44071 1727204740.89743: checking for max_fail_percentage 44071 1727204740.89745: done checking for max_fail_percentage 44071 1727204740.89746: checking to see if all hosts have failed and the running result is not ok 44071 1727204740.89746: done checking to see if all hosts have failed 44071 1727204740.89747: getting the remaining hosts for this loop 44071 1727204740.89749: done getting the remaining hosts for this loop 44071 1727204740.89755: getting the next task for host managed-node2 44071 1727204740.89768: done getting next task for host managed-node2 44071 1727204740.89774: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204740.89781: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204740.89817: getting variables 44071 1727204740.89820: in VariableManager get_vars() 44071 1727204740.90079: Calling all_inventory to load vars for managed-node2 44071 1727204740.90082: Calling groups_inventory to load vars for managed-node2 44071 1727204740.90085: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204740.90105: Calling all_plugins_play to load vars for managed-node2 44071 1727204740.90109: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204740.90112: Calling groups_plugins_play to load vars for managed-node2 44071 1727204740.90792: done sending task result for task 127b8e07-fff9-c964-7471-0000000024af 44071 1727204740.90796: WORKER PROCESS EXITING 44071 1727204740.91807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204740.93049: done with get_vars() 44071 1727204740.93086: done getting variables 44071 1727204740.93137: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:05:40 -0400 (0:00:00.114) 0:02:33.248 ***** 44071 1727204740.93170: entering _queue_task() for managed-node2/service 44071 1727204740.93479: worker is 1 (out of 1 available) 44071 1727204740.93494: exiting _queue_task() for managed-node2/service 44071 1727204740.93508: done queuing things up, now waiting for results queue to drain 44071 1727204740.93510: waiting for pending results... 
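
Unlike the tasks skipped above, this one is about to run: network_provider was set to "nm" via set_fact, so the conditional network_provider == "nm" or network_state != {} evaluates True in the lines that follow. A sketch of a service task gated that way, assuming the usual started/enabled arguments (task name, service action, the network_service_name variable and the when: expression are taken from the log; everything else is illustrative):

    # Sketch -- arguments other than name and when: are assumptions, not the role's source.
    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"   # role default resolved in the log below
        state: started                       # assumed
        enabled: true                        # assumed
      when: network_provider == "nm" or network_state != {}

Because the condition holds, the executor goes on to set the connection variables and open the SSH session; the reuse of an existing ControlMaster socket is visible in the "auto-mux: Trying existing master" stderr chunk further down.
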
44071 1727204740.93737: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204740.93864: in run() - task 127b8e07-fff9-c964-7471-0000000024b0 44071 1727204740.93875: variable 'ansible_search_path' from source: unknown 44071 1727204740.93880: variable 'ansible_search_path' from source: unknown 44071 1727204740.93913: calling self._execute() 44071 1727204740.94010: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204740.94014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204740.94024: variable 'omit' from source: magic vars 44071 1727204740.94367: variable 'ansible_distribution_major_version' from source: facts 44071 1727204740.94382: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204740.94514: variable 'network_provider' from source: set_fact 44071 1727204740.94518: variable 'network_state' from source: role '' defaults 44071 1727204740.94530: Evaluated conditional (network_provider == "nm" or network_state != {}): True 44071 1727204740.94538: variable 'omit' from source: magic vars 44071 1727204740.94587: variable 'omit' from source: magic vars 44071 1727204740.94611: variable 'network_service_name' from source: role '' defaults 44071 1727204740.94668: variable 'network_service_name' from source: role '' defaults 44071 1727204740.94751: variable '__network_provider_setup' from source: role '' defaults 44071 1727204740.94756: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204740.94805: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204740.94813: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204740.94871: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204740.95041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204740.97180: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204740.97232: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204740.97269: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204740.97295: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204740.97316: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204740.97388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204740.97410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204740.97430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204740.97466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 44071 1727204740.97479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204740.97516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204740.97533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204740.97554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204740.97587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204740.97598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204740.97785: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204740.97878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204740.97897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204740.97917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204740.97947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204740.97958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204740.98031: variable 'ansible_python' from source: facts 44071 1727204740.98047: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204740.98112: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204740.98174: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204740.98270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204740.98288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204740.98306: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204740.98339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204740.98351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204740.98390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204740.98410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204740.98432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204740.98463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204740.98476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204740.98580: variable 'network_connections' from source: include params 44071 1727204740.98587: variable 'interface' from source: play vars 44071 1727204740.98645: variable 'interface' from source: play vars 44071 1727204740.98732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204740.98877: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204740.98916: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204740.98952: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204740.98994: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204740.99042: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204740.99064: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204740.99094: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204740.99118: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204740.99161: variable '__network_wireless_connections_defined' from source: 
role '' defaults 44071 1727204740.99384: variable 'network_connections' from source: include params 44071 1727204740.99387: variable 'interface' from source: play vars 44071 1727204740.99451: variable 'interface' from source: play vars 44071 1727204740.99480: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204740.99542: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204740.99749: variable 'network_connections' from source: include params 44071 1727204740.99753: variable 'interface' from source: play vars 44071 1727204740.99807: variable 'interface' from source: play vars 44071 1727204740.99828: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204740.99890: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204741.00097: variable 'network_connections' from source: include params 44071 1727204741.00100: variable 'interface' from source: play vars 44071 1727204741.00155: variable 'interface' from source: play vars 44071 1727204741.00201: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204741.00246: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204741.00251: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204741.00299: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204741.00451: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204741.00798: variable 'network_connections' from source: include params 44071 1727204741.00801: variable 'interface' from source: play vars 44071 1727204741.00851: variable 'interface' from source: play vars 44071 1727204741.00858: variable 'ansible_distribution' from source: facts 44071 1727204741.00861: variable '__network_rh_distros' from source: role '' defaults 44071 1727204741.00868: variable 'ansible_distribution_major_version' from source: facts 44071 1727204741.00881: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204741.01006: variable 'ansible_distribution' from source: facts 44071 1727204741.01010: variable '__network_rh_distros' from source: role '' defaults 44071 1727204741.01015: variable 'ansible_distribution_major_version' from source: facts 44071 1727204741.01021: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204741.01147: variable 'ansible_distribution' from source: facts 44071 1727204741.01150: variable '__network_rh_distros' from source: role '' defaults 44071 1727204741.01153: variable 'ansible_distribution_major_version' from source: facts 44071 1727204741.01185: variable 'network_provider' from source: set_fact 44071 1727204741.01204: variable 'omit' from source: magic vars 44071 1727204741.01231: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204741.01257: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204741.01278: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204741.01294: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204741.01304: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204741.01329: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204741.01332: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204741.01339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204741.01417: Set connection var ansible_connection to ssh 44071 1727204741.01423: Set connection var ansible_timeout to 10 44071 1727204741.01429: Set connection var ansible_pipelining to False 44071 1727204741.01434: Set connection var ansible_shell_type to sh 44071 1727204741.01442: Set connection var ansible_shell_executable to /bin/sh 44071 1727204741.01448: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204741.01472: variable 'ansible_shell_executable' from source: unknown 44071 1727204741.01475: variable 'ansible_connection' from source: unknown 44071 1727204741.01478: variable 'ansible_module_compression' from source: unknown 44071 1727204741.01480: variable 'ansible_shell_type' from source: unknown 44071 1727204741.01483: variable 'ansible_shell_executable' from source: unknown 44071 1727204741.01486: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204741.01489: variable 'ansible_pipelining' from source: unknown 44071 1727204741.01491: variable 'ansible_timeout' from source: unknown 44071 1727204741.01497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204741.01583: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204741.01592: variable 'omit' from source: magic vars 44071 1727204741.01598: starting attempt loop 44071 1727204741.01605: running the handler 44071 1727204741.01670: variable 'ansible_facts' from source: unknown 44071 1727204741.02250: _low_level_execute_command(): starting 44071 1727204741.02254: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204741.02809: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204741.02813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44071 1727204741.02819: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204741.02822: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204741.02878: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 
1727204741.02881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204741.02884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204741.02989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204741.04746: stdout chunk (state=3): >>>/root <<< 44071 1727204741.04854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204741.04923: stderr chunk (state=3): >>><<< 44071 1727204741.04928: stdout chunk (state=3): >>><<< 44071 1727204741.04948: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204741.04960: _low_level_execute_command(): starting 44071 1727204741.04969: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204741.0494778-53080-102699806548135 `" && echo ansible-tmp-1727204741.0494778-53080-102699806548135="` echo /root/.ansible/tmp/ansible-tmp-1727204741.0494778-53080-102699806548135 `" ) && sleep 0' 44071 1727204741.05453: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204741.05457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204741.05492: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204741.05496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204741.05499: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204741.05501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204741.05503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 44071 1727204741.05561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204741.05567: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204741.05573: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204741.05646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204741.07633: stdout chunk (state=3): >>>ansible-tmp-1727204741.0494778-53080-102699806548135=/root/.ansible/tmp/ansible-tmp-1727204741.0494778-53080-102699806548135 <<< 44071 1727204741.07741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204741.07808: stderr chunk (state=3): >>><<< 44071 1727204741.07812: stdout chunk (state=3): >>><<< 44071 1727204741.07827: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204741.0494778-53080-102699806548135=/root/.ansible/tmp/ansible-tmp-1727204741.0494778-53080-102699806548135 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204741.07857: variable 'ansible_module_compression' from source: unknown 44071 1727204741.07907: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 44071 1727204741.07961: variable 'ansible_facts' from source: unknown 44071 1727204741.08106: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204741.0494778-53080-102699806548135/AnsiballZ_systemd.py 44071 1727204741.08233: Sending initial data 44071 1727204741.08237: Sent initial data (156 bytes) 44071 1727204741.08732: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204741.08744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204741.08747: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204741.08750: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204741.08752: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204741.08800: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204741.08803: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204741.08806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204741.08886: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204741.10501: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204741.10569: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204741.10642: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpbviazb0x /root/.ansible/tmp/ansible-tmp-1727204741.0494778-53080-102699806548135/AnsiballZ_systemd.py <<< 44071 1727204741.10646: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204741.0494778-53080-102699806548135/AnsiballZ_systemd.py" <<< 44071 1727204741.10717: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpbviazb0x" to remote "/root/.ansible/tmp/ansible-tmp-1727204741.0494778-53080-102699806548135/AnsiballZ_systemd.py" <<< 44071 1727204741.10720: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204741.0494778-53080-102699806548135/AnsiballZ_systemd.py" <<< 44071 1727204741.11983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204741.12064: stderr chunk (state=3): >>><<< 44071 1727204741.12070: stdout chunk (state=3): >>><<< 44071 1727204741.12091: done transferring module to remote 44071 1727204741.12103: _low_level_execute_command(): starting 44071 1727204741.12108: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204741.0494778-53080-102699806548135/ /root/.ansible/tmp/ansible-tmp-1727204741.0494778-53080-102699806548135/AnsiballZ_systemd.py && sleep 0' 44071 1727204741.12600: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204741.12605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204741.12608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204741.12662: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204741.12667: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204741.12741: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204741.14591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204741.14654: stderr chunk (state=3): >>><<< 44071 1727204741.14658: stdout chunk (state=3): >>><<< 44071 1727204741.14673: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204741.14676: _low_level_execute_command(): starting 44071 1727204741.14681: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204741.0494778-53080-102699806548135/AnsiballZ_systemd.py && sleep 0' 44071 1727204741.15163: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204741.15169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204741.15198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204741.15201: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204741.15203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204741.15272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204741.15275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204741.15281: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204741.15358: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204741.47251: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4603904", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3524198400", "CPUUsageNSec": "1720953000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": 
"[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitC<<< 44071 1727204741.47258: stdout chunk (state=3): >>>ORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", 
"RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext":<<< 44071 1727204741.47274: stdout chunk (state=3): >>> "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 
EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 44071 1727204741.49151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204741.49217: stderr chunk (state=3): >>><<< 44071 1727204741.49221: stdout chunk (state=3): >>><<< 44071 1727204741.49240: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4603904", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3524198400", "CPUUsageNSec": "1720953000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not 
set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": 
"no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 
14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204741.49380: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204741.0494778-53080-102699806548135/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204741.49395: _low_level_execute_command(): starting 44071 1727204741.49400: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204741.0494778-53080-102699806548135/ > /dev/null 2>&1 && sleep 0' 44071 1727204741.49912: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204741.49916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204741.49919: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204741.49921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204741.49924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204741.49974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204741.49978: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204741.50066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204741.51974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204741.52034: stderr chunk (state=3): >>><<< 44071 1727204741.52039: stdout chunk (state=3): >>><<< 44071 1727204741.52053: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204741.52061: handler run complete 44071 1727204741.52110: attempt loop complete, returning result 44071 1727204741.52114: _execute() done 44071 1727204741.52116: dumping result to json 44071 1727204741.52132: done dumping result, returning 44071 1727204741.52143: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-c964-7471-0000000024b0] 44071 1727204741.52148: sending task result for task 127b8e07-fff9-c964-7471-0000000024b0 44071 1727204741.52373: done sending task result for task 127b8e07-fff9-c964-7471-0000000024b0 44071 1727204741.52376: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204741.52452: no more pending results, returning what we have 44071 1727204741.52455: results queue empty 44071 1727204741.52456: checking for any_errors_fatal 44071 1727204741.52463: done checking for any_errors_fatal 44071 1727204741.52464: checking for max_fail_percentage 44071 1727204741.52467: done checking for max_fail_percentage 44071 1727204741.52468: checking to see if all hosts have failed 
and the running result is not ok 44071 1727204741.52469: done checking to see if all hosts have failed 44071 1727204741.52470: getting the remaining hosts for this loop 44071 1727204741.52471: done getting the remaining hosts for this loop 44071 1727204741.52475: getting the next task for host managed-node2 44071 1727204741.52483: done getting next task for host managed-node2 44071 1727204741.52488: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204741.52494: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204741.52509: getting variables 44071 1727204741.52511: in VariableManager get_vars() 44071 1727204741.52556: Calling all_inventory to load vars for managed-node2 44071 1727204741.52560: Calling groups_inventory to load vars for managed-node2 44071 1727204741.52608: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204741.52619: Calling all_plugins_play to load vars for managed-node2 44071 1727204741.52621: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204741.52624: Calling groups_plugins_play to load vars for managed-node2 44071 1727204741.53808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204741.55047: done with get_vars() 44071 1727204741.55085: done getting variables 44071 1727204741.55136: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:05:41 -0400 (0:00:00.619) 0:02:33.868 ***** 44071 1727204741.55172: entering _queue_task() for managed-node2/service 44071 1727204741.55480: worker is 1 (out of 1 available) 44071 1727204741.55497: exiting _queue_task() for managed-node2/service 44071 1727204741.55513: done queuing things up, now waiting for results queue to drain 44071 1727204741.55515: waiting for pending results... 
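The censored "ok" result for the NetworkManager task above comes from the generic service action plugin dispatching to the ansible.legacy.systemd module; the module_args echoed in the execute-module summary show name=NetworkManager, state=started, enabled=true, scope=system, and the output is hidden because the task runs with no_log: true. A minimal standalone sketch of an equivalent task, reconstructed from those module_args rather than from the role's actual task file (which templates these values from role variables), is:

    # Sketch only: parameter values copied from the module_args printed in the
    # log above; the role itself derives them from its own variables.
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true
        scope: system
      no_log: true

With no_log set, Ansible replaces the module result with the "censored" placeholder seen above, even at this debug verbosity.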
44071 1727204741.55733: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204741.55853: in run() - task 127b8e07-fff9-c964-7471-0000000024b1 44071 1727204741.55868: variable 'ansible_search_path' from source: unknown 44071 1727204741.55874: variable 'ansible_search_path' from source: unknown 44071 1727204741.55907: calling self._execute() 44071 1727204741.56003: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204741.56008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204741.56017: variable 'omit' from source: magic vars 44071 1727204741.56353: variable 'ansible_distribution_major_version' from source: facts 44071 1727204741.56365: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204741.56459: variable 'network_provider' from source: set_fact 44071 1727204741.56462: Evaluated conditional (network_provider == "nm"): True 44071 1727204741.56534: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204741.56601: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204741.56741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204741.58439: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204741.58499: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204741.58529: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204741.58560: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204741.58584: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204741.58664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204741.58688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204741.58710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204741.58742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204741.58753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204741.58793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204741.58817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 44071 1727204741.58833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204741.58862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204741.58875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204741.58908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204741.58929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204741.58949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204741.58977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204741.58988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204741.59104: variable 'network_connections' from source: include params 44071 1727204741.59116: variable 'interface' from source: play vars 44071 1727204741.59177: variable 'interface' from source: play vars 44071 1727204741.59236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204741.59364: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204741.59399: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204741.59423: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204741.59448: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204741.59488: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204741.59504: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204741.59522: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204741.59543: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
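The filter and test plugins are reloaded here because the task's when: conditions are Jinja2 expressions; the surrounding entries show them being evaluated for the wpa_supplicant task: ansible_distribution_major_version != '6' and network_provider == "nm" come out True, while __network_wpa_supplicant_required is False, so the task is skipped. A hedged sketch of that skip pattern, with the conditions copied from the log (the actual task at roles/network/tasks/main.yml:133 uses the role's own variables and may attach some of these conditions at the block or include level), is:

    # Sketch only: conditions taken from the "Evaluated conditional" entries in
    # the log; the real role task body may differ.
    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant
        state: started
        enabled: true
      when:
        - ansible_distribution_major_version != '6'
        - network_provider == "nm"
        - __network_wpa_supplicant_required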
44071 1727204741.59588: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204741.59771: variable 'network_connections' from source: include params 44071 1727204741.59775: variable 'interface' from source: play vars 44071 1727204741.59826: variable 'interface' from source: play vars 44071 1727204741.59851: Evaluated conditional (__network_wpa_supplicant_required): False 44071 1727204741.59855: when evaluation is False, skipping this task 44071 1727204741.59857: _execute() done 44071 1727204741.59860: dumping result to json 44071 1727204741.59863: done dumping result, returning 44071 1727204741.59873: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-c964-7471-0000000024b1] 44071 1727204741.59886: sending task result for task 127b8e07-fff9-c964-7471-0000000024b1 44071 1727204741.59984: done sending task result for task 127b8e07-fff9-c964-7471-0000000024b1 44071 1727204741.59988: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 44071 1727204741.60064: no more pending results, returning what we have 44071 1727204741.60070: results queue empty 44071 1727204741.60071: checking for any_errors_fatal 44071 1727204741.60093: done checking for any_errors_fatal 44071 1727204741.60094: checking for max_fail_percentage 44071 1727204741.60096: done checking for max_fail_percentage 44071 1727204741.60097: checking to see if all hosts have failed and the running result is not ok 44071 1727204741.60097: done checking to see if all hosts have failed 44071 1727204741.60098: getting the remaining hosts for this loop 44071 1727204741.60100: done getting the remaining hosts for this loop 44071 1727204741.60104: getting the next task for host managed-node2 44071 1727204741.60114: done getting next task for host managed-node2 44071 1727204741.60118: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204741.60124: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204741.60151: getting variables 44071 1727204741.60153: in VariableManager get_vars() 44071 1727204741.60206: Calling all_inventory to load vars for managed-node2 44071 1727204741.60209: Calling groups_inventory to load vars for managed-node2 44071 1727204741.60211: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204741.60221: Calling all_plugins_play to load vars for managed-node2 44071 1727204741.60223: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204741.60226: Calling groups_plugins_play to load vars for managed-node2 44071 1727204741.61427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204741.62650: done with get_vars() 44071 1727204741.62687: done getting variables 44071 1727204741.62737: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:05:41 -0400 (0:00:00.075) 0:02:33.944 ***** 44071 1727204741.62767: entering _queue_task() for managed-node2/service 44071 1727204741.63079: worker is 1 (out of 1 available) 44071 1727204741.63095: exiting _queue_task() for managed-node2/service 44071 1727204741.63110: done queuing things up, now waiting for results queue to drain 44071 1727204741.63112: waiting for pending results... 44071 1727204741.63335: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204741.63460: in run() - task 127b8e07-fff9-c964-7471-0000000024b2 44071 1727204741.63474: variable 'ansible_search_path' from source: unknown 44071 1727204741.63478: variable 'ansible_search_path' from source: unknown 44071 1727204741.63508: calling self._execute() 44071 1727204741.63602: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204741.63607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204741.63616: variable 'omit' from source: magic vars 44071 1727204741.63946: variable 'ansible_distribution_major_version' from source: facts 44071 1727204741.63957: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204741.64051: variable 'network_provider' from source: set_fact 44071 1727204741.64055: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204741.64059: when evaluation is False, skipping this task 44071 1727204741.64061: _execute() done 44071 1727204741.64067: dumping result to json 44071 1727204741.64070: done dumping result, returning 44071 1727204741.64077: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-c964-7471-0000000024b2] 44071 1727204741.64082: sending task result for task 127b8e07-fff9-c964-7471-0000000024b2 44071 1727204741.64188: done sending task result for task 127b8e07-fff9-c964-7471-0000000024b2 44071 1727204741.64191: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 
1727204741.64246: no more pending results, returning what we have 44071 1727204741.64250: results queue empty 44071 1727204741.64252: checking for any_errors_fatal 44071 1727204741.64261: done checking for any_errors_fatal 44071 1727204741.64261: checking for max_fail_percentage 44071 1727204741.64263: done checking for max_fail_percentage 44071 1727204741.64264: checking to see if all hosts have failed and the running result is not ok 44071 1727204741.64270: done checking to see if all hosts have failed 44071 1727204741.64271: getting the remaining hosts for this loop 44071 1727204741.64273: done getting the remaining hosts for this loop 44071 1727204741.64279: getting the next task for host managed-node2 44071 1727204741.64289: done getting next task for host managed-node2 44071 1727204741.64293: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204741.64299: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204741.64331: getting variables 44071 1727204741.64332: in VariableManager get_vars() 44071 1727204741.64391: Calling all_inventory to load vars for managed-node2 44071 1727204741.64394: Calling groups_inventory to load vars for managed-node2 44071 1727204741.64396: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204741.64407: Calling all_plugins_play to load vars for managed-node2 44071 1727204741.64409: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204741.64412: Calling groups_plugins_play to load vars for managed-node2 44071 1727204741.65476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204741.66683: done with get_vars() 44071 1727204741.66713: done getting variables 44071 1727204741.66770: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:05:41 -0400 (0:00:00.040) 0:02:33.984 ***** 44071 1727204741.66802: entering _queue_task() for managed-node2/copy 44071 1727204741.67112: worker is 1 (out of 1 available) 44071 1727204741.67127: exiting _queue_task() for managed-node2/copy 44071 1727204741.67140: done queuing things up, now waiting for results queue to drain 44071 1727204741.67142: waiting for pending results... 44071 1727204741.67361: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204741.67491: in run() - task 127b8e07-fff9-c964-7471-0000000024b3 44071 1727204741.67507: variable 'ansible_search_path' from source: unknown 44071 1727204741.67511: variable 'ansible_search_path' from source: unknown 44071 1727204741.67547: calling self._execute() 44071 1727204741.67643: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204741.67649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204741.67658: variable 'omit' from source: magic vars 44071 1727204741.67984: variable 'ansible_distribution_major_version' from source: facts 44071 1727204741.67996: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204741.68088: variable 'network_provider' from source: set_fact 44071 1727204741.68092: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204741.68097: when evaluation is False, skipping this task 44071 1727204741.68100: _execute() done 44071 1727204741.68102: dumping result to json 44071 1727204741.68105: done dumping result, returning 44071 1727204741.68113: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-c964-7471-0000000024b3] 44071 1727204741.68116: sending task result for task 127b8e07-fff9-c964-7471-0000000024b3 44071 1727204741.68224: done sending task result for task 127b8e07-fff9-c964-7471-0000000024b3 44071 1727204741.68227: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 44071 1727204741.68291: no more pending results, returning what we have 44071 1727204741.68295: results queue empty 44071 1727204741.68296: checking for any_errors_fatal 44071 1727204741.68303: done checking for any_errors_fatal 44071 1727204741.68304: checking for max_fail_percentage 44071 1727204741.68306: done checking for max_fail_percentage 44071 1727204741.68307: checking to see if all hosts have failed and the running result is not ok 44071 1727204741.68307: done checking to see if all hosts have failed 44071 1727204741.68308: getting the remaining hosts for this loop 44071 1727204741.68310: done getting the remaining hosts for this loop 44071 1727204741.68315: getting the next task for host managed-node2 44071 1727204741.68324: done getting next task for host managed-node2 44071 1727204741.68329: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204741.68335: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204741.68368: getting variables 44071 1727204741.68369: in VariableManager get_vars() 44071 1727204741.68417: Calling all_inventory to load vars for managed-node2 44071 1727204741.68420: Calling groups_inventory to load vars for managed-node2 44071 1727204741.68422: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204741.68433: Calling all_plugins_play to load vars for managed-node2 44071 1727204741.68436: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204741.68439: Calling groups_plugins_play to load vars for managed-node2 44071 1727204741.75605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204741.76805: done with get_vars() 44071 1727204741.76834: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:05:41 -0400 (0:00:00.100) 0:02:34.085 ***** 44071 1727204741.76899: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204741.77220: worker is 1 (out of 1 available) 44071 1727204741.77235: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204741.77250: done queuing things up, now waiting for results queue to drain 44071 1727204741.77253: waiting for pending results... 44071 1727204741.77467: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204741.77601: in run() - task 127b8e07-fff9-c964-7471-0000000024b4 44071 1727204741.77617: variable 'ansible_search_path' from source: unknown 44071 1727204741.77623: variable 'ansible_search_path' from source: unknown 44071 1727204741.77656: calling self._execute() 44071 1727204741.77752: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204741.77759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204741.77771: variable 'omit' from source: magic vars 44071 1727204741.78101: variable 'ansible_distribution_major_version' from source: facts 44071 1727204741.78113: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204741.78118: variable 'omit' from source: magic vars 44071 1727204741.78175: variable 'omit' from source: magic vars 44071 1727204741.78303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204741.79974: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204741.80036: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204741.80064: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204741.80097: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204741.80119: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204741.80189: variable 'network_provider' from source: set_fact 44071 1727204741.80298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 44071 1727204741.80321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204741.80342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204741.80372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204741.80384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204741.80446: variable 'omit' from source: magic vars 44071 1727204741.80528: variable 'omit' from source: magic vars 44071 1727204741.80607: variable 'network_connections' from source: include params 44071 1727204741.80616: variable 'interface' from source: play vars 44071 1727204741.80668: variable 'interface' from source: play vars 44071 1727204741.80785: variable 'omit' from source: magic vars 44071 1727204741.80792: variable '__lsr_ansible_managed' from source: task vars 44071 1727204741.80839: variable '__lsr_ansible_managed' from source: task vars 44071 1727204741.80979: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 44071 1727204741.81314: Loaded config def from plugin (lookup/template) 44071 1727204741.81318: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 44071 1727204741.81343: File lookup term: get_ansible_managed.j2 44071 1727204741.81346: variable 'ansible_search_path' from source: unknown 44071 1727204741.81352: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 44071 1727204741.81366: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 44071 1727204741.81380: variable 'ansible_search_path' from source: unknown 44071 1727204741.85920: variable 'ansible_managed' from source: unknown 44071 1727204741.86038: variable 'omit' from source: magic vars 44071 1727204741.86062: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204741.86089: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204741.86104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204741.86119: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204741.86128: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204741.86153: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204741.86156: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204741.86159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204741.86230: Set connection var ansible_connection to ssh 44071 1727204741.86237: Set connection var ansible_timeout to 10 44071 1727204741.86242: Set connection var ansible_pipelining to False 44071 1727204741.86248: Set connection var ansible_shell_type to sh 44071 1727204741.86253: Set connection var ansible_shell_executable to /bin/sh 44071 1727204741.86260: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204741.86284: variable 'ansible_shell_executable' from source: unknown 44071 1727204741.86287: variable 'ansible_connection' from source: unknown 44071 1727204741.86290: variable 'ansible_module_compression' from source: unknown 44071 1727204741.86293: variable 'ansible_shell_type' from source: unknown 44071 1727204741.86295: variable 'ansible_shell_executable' from source: unknown 44071 1727204741.86298: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204741.86301: variable 'ansible_pipelining' from source: unknown 44071 1727204741.86303: variable 'ansible_timeout' from source: unknown 44071 1727204741.86308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204741.86411: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204741.86424: variable 'omit' from source: magic vars 44071 1727204741.86427: starting attempt loop 44071 1727204741.86430: running the handler 44071 1727204741.86442: _low_level_execute_command(): starting 44071 1727204741.86448: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204741.86994: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204741.86998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204741.87001: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204741.87003: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204741.87061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204741.87064: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204741.87074: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204741.87147: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204741.88911: stdout chunk (state=3): >>>/root <<< 44071 1727204741.89018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204741.89080: stderr chunk (state=3): >>><<< 44071 1727204741.89083: stdout chunk (state=3): >>><<< 44071 1727204741.89106: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204741.89117: _low_level_execute_command(): starting 44071 1727204741.89126: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204741.8910584-53091-37010440328495 `" && echo ansible-tmp-1727204741.8910584-53091-37010440328495="` echo /root/.ansible/tmp/ansible-tmp-1727204741.8910584-53091-37010440328495 `" ) && sleep 0' 44071 1727204741.89625: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204741.89629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204741.89631: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204741.89637: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204741.89639: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204741.89682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204741.89685: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204741.89776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204741.91774: stdout chunk (state=3): >>>ansible-tmp-1727204741.8910584-53091-37010440328495=/root/.ansible/tmp/ansible-tmp-1727204741.8910584-53091-37010440328495 <<< 44071 1727204741.92176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204741.92181: stdout chunk (state=3): >>><<< 44071 1727204741.92184: stderr chunk (state=3): >>><<< 44071 1727204741.92188: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204741.8910584-53091-37010440328495=/root/.ansible/tmp/ansible-tmp-1727204741.8910584-53091-37010440328495 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204741.92191: variable 'ansible_module_compression' from source: unknown 44071 1727204741.92193: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 44071 1727204741.92196: variable 'ansible_facts' from source: unknown 44071 1727204741.92296: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204741.8910584-53091-37010440328495/AnsiballZ_network_connections.py 44071 1727204741.92557: Sending initial data 44071 1727204741.92560: Sent initial data (167 bytes) 44071 1727204741.93159: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204741.93293: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204741.93304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204741.93339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204741.93446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204741.95044: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 44071 1727204741.95051: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204741.95108: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204741.95186: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpym42v91m /root/.ansible/tmp/ansible-tmp-1727204741.8910584-53091-37010440328495/AnsiballZ_network_connections.py <<< 44071 1727204741.95191: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204741.8910584-53091-37010440328495/AnsiballZ_network_connections.py" <<< 44071 1727204741.95250: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpym42v91m" to remote "/root/.ansible/tmp/ansible-tmp-1727204741.8910584-53091-37010440328495/AnsiballZ_network_connections.py" <<< 44071 1727204741.95257: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204741.8910584-53091-37010440328495/AnsiballZ_network_connections.py" <<< 44071 1727204741.96106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204741.96188: stderr chunk (state=3): >>><<< 44071 1727204741.96191: stdout chunk (state=3): >>><<< 44071 1727204741.96212: done transferring module to remote 44071 1727204741.96225: _low_level_execute_command(): starting 44071 1727204741.96235: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204741.8910584-53091-37010440328495/ /root/.ansible/tmp/ansible-tmp-1727204741.8910584-53091-37010440328495/AnsiballZ_network_connections.py && sleep 0' 44071 1727204741.96717: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204741.96721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204741.96723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204741.96725: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204741.96728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204741.96779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204741.96798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204741.96862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204741.98673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204741.98734: stderr chunk (state=3): >>><<< 44071 1727204741.98738: stdout chunk (state=3): >>><<< 44071 1727204741.98754: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204741.98758: _low_level_execute_command(): starting 44071 1727204741.98762: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204741.8910584-53091-37010440328495/AnsiballZ_network_connections.py && sleep 0' 44071 1727204741.99242: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204741.99281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204741.99286: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204741.99288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204741.99291: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204741.99341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204741.99344: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204741.99351: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204741.99426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204742.40704: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_rymza_fl/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back<<< 44071 1727204742.40896: stdout chunk (state=3): >>> File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_rymza_fl/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/bc2e78b9-9d7f-4720-aaef-6b1a6ee99c01: error=unknown <<< 44071 1727204742.41169: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}}<<< 44071 1727204742.41270: stdout chunk (state=3): >>> <<< 44071 1727204742.43932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204742.43999: stderr chunk (state=3): >>><<< 44071 1727204742.44003: stdout chunk (state=3): >>><<< 44071 1727204742.44022: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_rymza_fl/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_rymza_fl/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/bc2e78b9-9d7f-4720-aaef-6b1a6ee99c01: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204742.44054: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204741.8910584-53091-37010440328495/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204742.44062: _low_level_execute_command(): starting 44071 1727204742.44068: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204741.8910584-53091-37010440328495/ > /dev/null 2>&1 && sleep 0' 44071 1727204742.44549: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204742.44553: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204742.44594: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204742.44597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204742.44600: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204742.44603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204742.44605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204742.44661: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204742.44664: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204742.44674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204742.44743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204742.46670: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 44071 1727204742.46723: stderr chunk (state=3): >>><<< 44071 1727204742.46726: stdout chunk (state=3): >>><<< 44071 1727204742.46746: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204742.46751: handler run complete 44071 1727204742.46779: attempt loop complete, returning result 44071 1727204742.46784: _execute() done 44071 1727204742.46787: dumping result to json 44071 1727204742.46789: done dumping result, returning 44071 1727204742.46800: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-c964-7471-0000000024b4] 44071 1727204742.46803: sending task result for task 127b8e07-fff9-c964-7471-0000000024b4 44071 1727204742.46917: done sending task result for task 127b8e07-fff9-c964-7471-0000000024b4 44071 1727204742.46920: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 44071 1727204742.47046: no more pending results, returning what we have 44071 1727204742.47050: results queue empty 44071 1727204742.47050: checking for any_errors_fatal 44071 1727204742.47058: done checking for any_errors_fatal 44071 1727204742.47059: checking for max_fail_percentage 44071 1727204742.47061: done checking for max_fail_percentage 44071 1727204742.47062: checking to see if all hosts have failed and the running result is not ok 44071 1727204742.47062: done checking to see if all hosts have failed 44071 1727204742.47063: getting the remaining hosts for this loop 44071 1727204742.47065: done getting the remaining hosts for this loop 44071 1727204742.47072: getting the next task for host managed-node2 44071 1727204742.47080: done getting next task for host managed-node2 44071 1727204742.47084: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204742.47090: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204742.47105: getting variables 44071 1727204742.47107: in VariableManager get_vars() 44071 1727204742.47159: Calling all_inventory to load vars for managed-node2 44071 1727204742.47162: Calling groups_inventory to load vars for managed-node2 44071 1727204742.47164: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204742.47177: Calling all_plugins_play to load vars for managed-node2 44071 1727204742.47180: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204742.47183: Calling groups_plugins_play to load vars for managed-node2 44071 1727204742.48385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204742.49631: done with get_vars() 44071 1727204742.49663: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:05:42 -0400 (0:00:00.728) 0:02:34.813 ***** 44071 1727204742.49736: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204742.50045: worker is 1 (out of 1 available) 44071 1727204742.50061: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204742.50078: done queuing things up, now waiting for results queue to drain 44071 1727204742.50080: waiting for pending results... 
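The "Configure networking connection profiles" result recorded above (changed: true despite the LsrNetworkNmError traceback on stdout) was produced with the module arguments shown in the logged module_args. Expressed as a standalone task, that invocation amounts to roughly the following; the argument values are copied from the log, the task wrapper itself is only illustrative:

    # Sketch of the logged invocation; arguments are taken from the module_args above,
    # the surrounding task form is an assumption.
    - name: Configure networking connection profiles
      fedora.linux_system_roles.network_connections:
        provider: nm
        connections:
          - name: statebr
            persistent_state: absent
            state: down
        __header: "#\n# Ansible managed\n#\n# system_role:network\n"

Note that the traceback printed to stdout did not fail the task: the module still returned changed: true, and the role surfaces any captured stderr in the debug tasks that run next.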
44071 1727204742.50293: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204742.50425: in run() - task 127b8e07-fff9-c964-7471-0000000024b5 44071 1727204742.50441: variable 'ansible_search_path' from source: unknown 44071 1727204742.50444: variable 'ansible_search_path' from source: unknown 44071 1727204742.50482: calling self._execute() 44071 1727204742.50580: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204742.50584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204742.50594: variable 'omit' from source: magic vars 44071 1727204742.50939: variable 'ansible_distribution_major_version' from source: facts 44071 1727204742.50952: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204742.51053: variable 'network_state' from source: role '' defaults 44071 1727204742.51064: Evaluated conditional (network_state != {}): False 44071 1727204742.51068: when evaluation is False, skipping this task 44071 1727204742.51071: _execute() done 44071 1727204742.51075: dumping result to json 44071 1727204742.51078: done dumping result, returning 44071 1727204742.51088: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-c964-7471-0000000024b5] 44071 1727204742.51091: sending task result for task 127b8e07-fff9-c964-7471-0000000024b5 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204742.51256: no more pending results, returning what we have 44071 1727204742.51260: results queue empty 44071 1727204742.51261: checking for any_errors_fatal 44071 1727204742.51282: done checking for any_errors_fatal 44071 1727204742.51283: checking for max_fail_percentage 44071 1727204742.51284: done checking for max_fail_percentage 44071 1727204742.51285: checking to see if all hosts have failed and the running result is not ok 44071 1727204742.51286: done checking to see if all hosts have failed 44071 1727204742.51287: getting the remaining hosts for this loop 44071 1727204742.51288: done getting the remaining hosts for this loop 44071 1727204742.51293: getting the next task for host managed-node2 44071 1727204742.51311: done getting next task for host managed-node2 44071 1727204742.51315: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204742.51321: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204742.51333: done sending task result for task 127b8e07-fff9-c964-7471-0000000024b5 44071 1727204742.51336: WORKER PROCESS EXITING 44071 1727204742.51357: getting variables 44071 1727204742.51359: in VariableManager get_vars() 44071 1727204742.51413: Calling all_inventory to load vars for managed-node2 44071 1727204742.51417: Calling groups_inventory to load vars for managed-node2 44071 1727204742.51419: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204742.51429: Calling all_plugins_play to load vars for managed-node2 44071 1727204742.51432: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204742.51435: Calling groups_plugins_play to load vars for managed-node2 44071 1727204742.52491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204742.53862: done with get_vars() 44071 1727204742.53888: done getting variables 44071 1727204742.53941: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:05:42 -0400 (0:00:00.042) 0:02:34.856 ***** 44071 1727204742.53974: entering _queue_task() for managed-node2/debug 44071 1727204742.54284: worker is 1 (out of 1 available) 44071 1727204742.54299: exiting _queue_task() for managed-node2/debug 44071 1727204742.54312: done queuing things up, now waiting for results queue to drain 44071 1727204742.54314: waiting for pending results... 
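Just above, "Configure networking state" was skipped because network_state is still the role default ({}), after the distribution-version check passed. Both conditions appear in the log; a sketch of how such gating is typically written (whether the conditions sit on the task itself or on an enclosing block is not visible here, and the module arguments are assumed):

    # Illustrative only; the two conditions are the ones evaluated in the log,
    # the network_state module arguments are a guess.
    - name: Configure networking state
      fedora.linux_system_roles.network_state:
        state: "{{ network_state }}"
      when:
        - ansible_distribution_major_version != '6'
        - network_state != {}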
44071 1727204742.54525: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204742.54663: in run() - task 127b8e07-fff9-c964-7471-0000000024b6 44071 1727204742.54681: variable 'ansible_search_path' from source: unknown 44071 1727204742.54685: variable 'ansible_search_path' from source: unknown 44071 1727204742.54718: calling self._execute() 44071 1727204742.54817: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204742.54823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204742.54835: variable 'omit' from source: magic vars 44071 1727204742.55156: variable 'ansible_distribution_major_version' from source: facts 44071 1727204742.55169: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204742.55175: variable 'omit' from source: magic vars 44071 1727204742.55227: variable 'omit' from source: magic vars 44071 1727204742.55255: variable 'omit' from source: magic vars 44071 1727204742.55294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204742.55328: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204742.55347: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204742.55361: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204742.55374: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204742.55400: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204742.55403: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204742.55406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204742.55485: Set connection var ansible_connection to ssh 44071 1727204742.55491: Set connection var ansible_timeout to 10 44071 1727204742.55497: Set connection var ansible_pipelining to False 44071 1727204742.55502: Set connection var ansible_shell_type to sh 44071 1727204742.55508: Set connection var ansible_shell_executable to /bin/sh 44071 1727204742.55515: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204742.55540: variable 'ansible_shell_executable' from source: unknown 44071 1727204742.55543: variable 'ansible_connection' from source: unknown 44071 1727204742.55546: variable 'ansible_module_compression' from source: unknown 44071 1727204742.55549: variable 'ansible_shell_type' from source: unknown 44071 1727204742.55551: variable 'ansible_shell_executable' from source: unknown 44071 1727204742.55554: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204742.55556: variable 'ansible_pipelining' from source: unknown 44071 1727204742.55558: variable 'ansible_timeout' from source: unknown 44071 1727204742.55560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204742.55678: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 
1727204742.55687: variable 'omit' from source: magic vars 44071 1727204742.55692: starting attempt loop 44071 1727204742.55695: running the handler 44071 1727204742.55801: variable '__network_connections_result' from source: set_fact 44071 1727204742.55845: handler run complete 44071 1727204742.55865: attempt loop complete, returning result 44071 1727204742.55870: _execute() done 44071 1727204742.55873: dumping result to json 44071 1727204742.55876: done dumping result, returning 44071 1727204742.55878: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-c964-7471-0000000024b6] 44071 1727204742.55885: sending task result for task 127b8e07-fff9-c964-7471-0000000024b6 ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 44071 1727204742.56075: no more pending results, returning what we have 44071 1727204742.56080: results queue empty 44071 1727204742.56081: checking for any_errors_fatal 44071 1727204742.56089: done checking for any_errors_fatal 44071 1727204742.56090: checking for max_fail_percentage 44071 1727204742.56092: done checking for max_fail_percentage 44071 1727204742.56093: checking to see if all hosts have failed and the running result is not ok 44071 1727204742.56093: done checking to see if all hosts have failed 44071 1727204742.56094: getting the remaining hosts for this loop 44071 1727204742.56096: done getting the remaining hosts for this loop 44071 1727204742.56100: getting the next task for host managed-node2 44071 1727204742.56108: done getting next task for host managed-node2 44071 1727204742.56112: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204742.56118: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204742.56131: getting variables 44071 1727204742.56134: in VariableManager get_vars() 44071 1727204742.56187: Calling all_inventory to load vars for managed-node2 44071 1727204742.56190: Calling groups_inventory to load vars for managed-node2 44071 1727204742.56192: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204742.56198: done sending task result for task 127b8e07-fff9-c964-7471-0000000024b6 44071 1727204742.56201: WORKER PROCESS EXITING 44071 1727204742.56211: Calling all_plugins_play to load vars for managed-node2 44071 1727204742.56213: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204742.56216: Calling groups_plugins_play to load vars for managed-node2 44071 1727204742.57287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204742.58547: done with get_vars() 44071 1727204742.58584: done getting variables 44071 1727204742.58637: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:05:42 -0400 (0:00:00.046) 0:02:34.903 ***** 44071 1727204742.58675: entering _queue_task() for managed-node2/debug 44071 1727204742.58989: worker is 1 (out of 1 available) 44071 1727204742.59005: exiting _queue_task() for managed-node2/debug 44071 1727204742.59019: done queuing things up, now waiting for results queue to drain 44071 1727204742.59021: waiting for pending results... 
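The next task (main.yml:181) dumps the whole __network_connections_result fact, whose full JSON, including the module_args passed to the nm provider, appears in the result below. A sketch of the likely task, assuming the same plain debug pattern as the stderr task above:

    # Sketch only; grounded in the variable name and the output shown below, not in the role source.
    - name: Show debug messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result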
44071 1727204742.59232: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204742.59360: in run() - task 127b8e07-fff9-c964-7471-0000000024b7 44071 1727204742.59375: variable 'ansible_search_path' from source: unknown 44071 1727204742.59379: variable 'ansible_search_path' from source: unknown 44071 1727204742.59415: calling self._execute() 44071 1727204742.59509: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204742.59513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204742.59524: variable 'omit' from source: magic vars 44071 1727204742.59854: variable 'ansible_distribution_major_version' from source: facts 44071 1727204742.59866: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204742.59873: variable 'omit' from source: magic vars 44071 1727204742.59923: variable 'omit' from source: magic vars 44071 1727204742.59954: variable 'omit' from source: magic vars 44071 1727204742.59991: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204742.60025: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204742.60045: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204742.60060: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204742.60073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204742.60098: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204742.60101: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204742.60104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204742.60186: Set connection var ansible_connection to ssh 44071 1727204742.60193: Set connection var ansible_timeout to 10 44071 1727204742.60199: Set connection var ansible_pipelining to False 44071 1727204742.60204: Set connection var ansible_shell_type to sh 44071 1727204742.60210: Set connection var ansible_shell_executable to /bin/sh 44071 1727204742.60217: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204742.60238: variable 'ansible_shell_executable' from source: unknown 44071 1727204742.60241: variable 'ansible_connection' from source: unknown 44071 1727204742.60246: variable 'ansible_module_compression' from source: unknown 44071 1727204742.60249: variable 'ansible_shell_type' from source: unknown 44071 1727204742.60251: variable 'ansible_shell_executable' from source: unknown 44071 1727204742.60253: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204742.60257: variable 'ansible_pipelining' from source: unknown 44071 1727204742.60259: variable 'ansible_timeout' from source: unknown 44071 1727204742.60261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204742.60381: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 
1727204742.60392: variable 'omit' from source: magic vars 44071 1727204742.60397: starting attempt loop 44071 1727204742.60400: running the handler 44071 1727204742.60444: variable '__network_connections_result' from source: set_fact 44071 1727204742.60515: variable '__network_connections_result' from source: set_fact 44071 1727204742.60608: handler run complete 44071 1727204742.60624: attempt loop complete, returning result 44071 1727204742.60627: _execute() done 44071 1727204742.60630: dumping result to json 44071 1727204742.60637: done dumping result, returning 44071 1727204742.60644: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-c964-7471-0000000024b7] 44071 1727204742.60649: sending task result for task 127b8e07-fff9-c964-7471-0000000024b7 44071 1727204742.60756: done sending task result for task 127b8e07-fff9-c964-7471-0000000024b7 44071 1727204742.60759: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 44071 1727204742.60878: no more pending results, returning what we have 44071 1727204742.60882: results queue empty 44071 1727204742.60882: checking for any_errors_fatal 44071 1727204742.60888: done checking for any_errors_fatal 44071 1727204742.60888: checking for max_fail_percentage 44071 1727204742.60890: done checking for max_fail_percentage 44071 1727204742.60891: checking to see if all hosts have failed and the running result is not ok 44071 1727204742.60892: done checking to see if all hosts have failed 44071 1727204742.60892: getting the remaining hosts for this loop 44071 1727204742.60894: done getting the remaining hosts for this loop 44071 1727204742.60898: getting the next task for host managed-node2 44071 1727204742.60907: done getting next task for host managed-node2 44071 1727204742.60911: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204742.60916: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204742.60930: getting variables 44071 1727204742.60932: in VariableManager get_vars() 44071 1727204742.60986: Calling all_inventory to load vars for managed-node2 44071 1727204742.60989: Calling groups_inventory to load vars for managed-node2 44071 1727204742.60991: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204742.61001: Calling all_plugins_play to load vars for managed-node2 44071 1727204742.61004: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204742.61013: Calling groups_plugins_play to load vars for managed-node2 44071 1727204742.62269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204742.63491: done with get_vars() 44071 1727204742.63522: done getting variables 44071 1727204742.63580: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:05:42 -0400 (0:00:00.049) 0:02:34.952 ***** 44071 1727204742.63610: entering _queue_task() for managed-node2/debug 44071 1727204742.63918: worker is 1 (out of 1 available) 44071 1727204742.63937: exiting _queue_task() for managed-node2/debug 44071 1727204742.63954: done queuing things up, now waiting for results queue to drain 44071 1727204742.63956: waiting for pending results... 44071 1727204742.64167: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204742.64299: in run() - task 127b8e07-fff9-c964-7471-0000000024b8 44071 1727204742.64314: variable 'ansible_search_path' from source: unknown 44071 1727204742.64317: variable 'ansible_search_path' from source: unknown 44071 1727204742.64351: calling self._execute() 44071 1727204742.64449: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204742.64453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204742.64463: variable 'omit' from source: magic vars 44071 1727204742.64785: variable 'ansible_distribution_major_version' from source: facts 44071 1727204742.64796: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204742.64893: variable 'network_state' from source: role '' defaults 44071 1727204742.64903: Evaluated conditional (network_state != {}): False 44071 1727204742.64906: when evaluation is False, skipping this task 44071 1727204742.64909: _execute() done 44071 1727204742.64912: dumping result to json 44071 1727204742.64914: done dumping result, returning 44071 1727204742.64923: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-c964-7471-0000000024b8] 44071 1727204742.64928: sending task result for task 127b8e07-fff9-c964-7471-0000000024b8 44071 1727204742.65038: done sending task result for task 127b8e07-fff9-c964-7471-0000000024b8 44071 1727204742.65040: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 44071 1727204742.65097: no more pending results, returning what we 
have 44071 1727204742.65101: results queue empty 44071 1727204742.65102: checking for any_errors_fatal 44071 1727204742.65113: done checking for any_errors_fatal 44071 1727204742.65114: checking for max_fail_percentage 44071 1727204742.65115: done checking for max_fail_percentage 44071 1727204742.65116: checking to see if all hosts have failed and the running result is not ok 44071 1727204742.65117: done checking to see if all hosts have failed 44071 1727204742.65118: getting the remaining hosts for this loop 44071 1727204742.65119: done getting the remaining hosts for this loop 44071 1727204742.65123: getting the next task for host managed-node2 44071 1727204742.65132: done getting next task for host managed-node2 44071 1727204742.65139: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204742.65145: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204742.65176: getting variables 44071 1727204742.65178: in VariableManager get_vars() 44071 1727204742.65225: Calling all_inventory to load vars for managed-node2 44071 1727204742.65228: Calling groups_inventory to load vars for managed-node2 44071 1727204742.65230: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204742.65243: Calling all_plugins_play to load vars for managed-node2 44071 1727204742.65245: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204742.65248: Calling groups_plugins_play to load vars for managed-node2 44071 1727204742.66325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204742.67723: done with get_vars() 44071 1727204742.67749: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:05:42 -0400 (0:00:00.042) 0:02:34.994 ***** 44071 1727204742.67837: entering _queue_task() for managed-node2/ping 44071 1727204742.68151: worker is 1 (out of 1 available) 44071 1727204742.68167: exiting _queue_task() for managed-node2/ping 44071 1727204742.68181: done queuing things up, now waiting for results queue to drain 44071 1727204742.68183: waiting for pending results... 
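Two more role tasks are visible in this stretch of the trace: main.yml:186 is skipped because the conditional network_state != {} evaluates to False (network_state comes from the role defaults and is empty here), and main.yml:192 re-tests connectivity with the ping module, which the low-level SSH exchange below carries out. Reconstructed shapes for both, with the when: condition taken verbatim from the false_condition in the skip result:

    # Reconstructions; the task names, variable, and condition come from this log, the YAML layout is assumed.
    - name: Show debug messages for the network_state
      ansible.builtin.debug:
        var: network_state
      when: network_state != {}

    - name: Re-test connectivity
      ansible.builtin.ping: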
44071 1727204742.68397: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204742.68527: in run() - task 127b8e07-fff9-c964-7471-0000000024b9 44071 1727204742.68546: variable 'ansible_search_path' from source: unknown 44071 1727204742.68550: variable 'ansible_search_path' from source: unknown 44071 1727204742.68583: calling self._execute() 44071 1727204742.68682: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204742.68686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204742.68696: variable 'omit' from source: magic vars 44071 1727204742.69026: variable 'ansible_distribution_major_version' from source: facts 44071 1727204742.69040: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204742.69046: variable 'omit' from source: magic vars 44071 1727204742.69103: variable 'omit' from source: magic vars 44071 1727204742.69129: variable 'omit' from source: magic vars 44071 1727204742.69170: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204742.69203: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204742.69222: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204742.69241: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204742.69252: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204742.69281: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204742.69285: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204742.69288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204742.69366: Set connection var ansible_connection to ssh 44071 1727204742.69371: Set connection var ansible_timeout to 10 44071 1727204742.69380: Set connection var ansible_pipelining to False 44071 1727204742.69383: Set connection var ansible_shell_type to sh 44071 1727204742.69388: Set connection var ansible_shell_executable to /bin/sh 44071 1727204742.69395: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204742.69419: variable 'ansible_shell_executable' from source: unknown 44071 1727204742.69422: variable 'ansible_connection' from source: unknown 44071 1727204742.69425: variable 'ansible_module_compression' from source: unknown 44071 1727204742.69427: variable 'ansible_shell_type' from source: unknown 44071 1727204742.69430: variable 'ansible_shell_executable' from source: unknown 44071 1727204742.69432: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204742.69434: variable 'ansible_pipelining' from source: unknown 44071 1727204742.69440: variable 'ansible_timeout' from source: unknown 44071 1727204742.69443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204742.69614: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204742.69628: variable 'omit' from source: magic vars 44071 
1727204742.69631: starting attempt loop 44071 1727204742.69634: running the handler 44071 1727204742.69650: _low_level_execute_command(): starting 44071 1727204742.69657: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204742.70211: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204742.70216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204742.70220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204742.70272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204742.70275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204742.70288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204742.70356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204742.72104: stdout chunk (state=3): >>>/root <<< 44071 1727204742.72204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204742.72274: stderr chunk (state=3): >>><<< 44071 1727204742.72278: stdout chunk (state=3): >>><<< 44071 1727204742.72302: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204742.72315: _low_level_execute_command(): starting 44071 1727204742.72323: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204742.7230124-53110-247942493158156 
`" && echo ansible-tmp-1727204742.7230124-53110-247942493158156="` echo /root/.ansible/tmp/ansible-tmp-1727204742.7230124-53110-247942493158156 `" ) && sleep 0' 44071 1727204742.72833: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204742.72837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204742.72840: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204742.72851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204742.72853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204742.72897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204742.72901: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204742.72911: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204742.72992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204742.74995: stdout chunk (state=3): >>>ansible-tmp-1727204742.7230124-53110-247942493158156=/root/.ansible/tmp/ansible-tmp-1727204742.7230124-53110-247942493158156 <<< 44071 1727204742.75103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204742.75170: stderr chunk (state=3): >>><<< 44071 1727204742.75173: stdout chunk (state=3): >>><<< 44071 1727204742.75190: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204742.7230124-53110-247942493158156=/root/.ansible/tmp/ansible-tmp-1727204742.7230124-53110-247942493158156 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 
1727204742.75235: variable 'ansible_module_compression' from source: unknown 44071 1727204742.75281: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 44071 1727204742.75315: variable 'ansible_facts' from source: unknown 44071 1727204742.75377: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204742.7230124-53110-247942493158156/AnsiballZ_ping.py 44071 1727204742.75497: Sending initial data 44071 1727204742.75500: Sent initial data (153 bytes) 44071 1727204742.75993: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204742.75997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204742.76000: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204742.76004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204742.76061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204742.76065: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204742.76072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204742.76140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204742.77759: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 44071 1727204742.77771: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204742.77832: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204742.77899: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpbiaandyb /root/.ansible/tmp/ansible-tmp-1727204742.7230124-53110-247942493158156/AnsiballZ_ping.py <<< 44071 1727204742.77907: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204742.7230124-53110-247942493158156/AnsiballZ_ping.py" <<< 44071 1727204742.77978: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpbiaandyb" to remote "/root/.ansible/tmp/ansible-tmp-1727204742.7230124-53110-247942493158156/AnsiballZ_ping.py" <<< 44071 1727204742.77980: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204742.7230124-53110-247942493158156/AnsiballZ_ping.py" <<< 44071 1727204742.78642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204742.78717: stderr chunk (state=3): >>><<< 44071 1727204742.78721: stdout chunk (state=3): >>><<< 44071 1727204742.78743: done transferring module to remote 44071 1727204742.78755: _low_level_execute_command(): starting 44071 1727204742.78758: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204742.7230124-53110-247942493158156/ /root/.ansible/tmp/ansible-tmp-1727204742.7230124-53110-247942493158156/AnsiballZ_ping.py && sleep 0' 44071 1727204742.79264: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204742.79270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204742.79273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44071 1727204742.79278: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204742.79329: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204742.79332: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204742.79334: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204742.79410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204742.81235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204742.81293: stderr chunk (state=3): >>><<< 44071 1727204742.81299: stdout chunk (state=3): >>><<< 44071 1727204742.81313: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204742.81316: _low_level_execute_command(): starting 44071 1727204742.81321: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204742.7230124-53110-247942493158156/AnsiballZ_ping.py && sleep 0' 44071 1727204742.81824: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204742.81828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204742.81830: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204742.81835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204742.81888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204742.81892: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204742.81898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204742.81979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204742.98207: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 44071 1727204742.99431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204742.99493: stderr chunk (state=3): >>><<< 44071 1727204742.99497: stdout chunk (state=3): >>><<< 44071 1727204742.99515: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204742.99540: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204742.7230124-53110-247942493158156/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204742.99549: _low_level_execute_command(): starting 44071 1727204742.99554: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204742.7230124-53110-247942493158156/ > /dev/null 2>&1 && sleep 0' 44071 1727204743.00054: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204743.00058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204743.00061: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204743.00063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204743.00112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204743.00116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204743.00118: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204743.00194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204743.02136: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204743.02193: stderr chunk (state=3): >>><<< 44071 1727204743.02197: stdout chunk (state=3): >>><<< 44071 1727204743.02214: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204743.02222: handler run complete 44071 1727204743.02238: attempt loop complete, returning result 44071 1727204743.02241: _execute() done 44071 1727204743.02243: dumping result to json 44071 1727204743.02248: done dumping result, returning 44071 1727204743.02257: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-c964-7471-0000000024b9] 44071 1727204743.02262: sending task result for task 127b8e07-fff9-c964-7471-0000000024b9 44071 1727204743.02362: done sending task result for task 127b8e07-fff9-c964-7471-0000000024b9 44071 1727204743.02368: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 44071 1727204743.02453: no more pending results, returning what we have 44071 1727204743.02457: results queue empty 44071 1727204743.02457: checking for any_errors_fatal 44071 1727204743.02469: done checking for any_errors_fatal 44071 1727204743.02470: checking for max_fail_percentage 44071 1727204743.02472: done checking for max_fail_percentage 44071 1727204743.02473: checking to see if all hosts have failed and the running result is not ok 44071 1727204743.02473: done checking to see if all hosts have failed 44071 1727204743.02474: getting the remaining hosts for this loop 44071 1727204743.02476: done getting the remaining hosts for this loop 44071 1727204743.02487: getting the next task for host managed-node2 44071 1727204743.02499: done getting next task for host managed-node2 44071 1727204743.02501: ^ task is: TASK: meta (role_complete) 44071 1727204743.02507: ^ state is: HOST 
STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204743.02522: getting variables 44071 1727204743.02524: in VariableManager get_vars() 44071 1727204743.02576: Calling all_inventory to load vars for managed-node2 44071 1727204743.02579: Calling groups_inventory to load vars for managed-node2 44071 1727204743.02581: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204743.02594: Calling all_plugins_play to load vars for managed-node2 44071 1727204743.02597: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204743.02600: Calling groups_plugins_play to load vars for managed-node2 44071 1727204743.03710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204743.04982: done with get_vars() 44071 1727204743.05016: done getting variables 44071 1727204743.05093: done queuing things up, now waiting for results queue to drain 44071 1727204743.05095: results queue empty 44071 1727204743.05095: checking for any_errors_fatal 44071 1727204743.05099: done checking for any_errors_fatal 44071 1727204743.05099: checking for max_fail_percentage 44071 1727204743.05100: done checking for max_fail_percentage 44071 1727204743.05101: checking to see if all hosts have failed and the running result is not ok 44071 1727204743.05101: done checking to see if all hosts have failed 44071 1727204743.05102: getting the remaining hosts for this loop 44071 1727204743.05102: done getting the remaining hosts for this loop 44071 1727204743.05104: getting the next task for host managed-node2 44071 1727204743.05109: done getting next task for host managed-node2 44071 1727204743.05111: ^ task is: TASK: Test 44071 1727204743.05113: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204743.05115: getting variables 44071 1727204743.05116: in VariableManager get_vars() 44071 1727204743.05127: Calling all_inventory to load vars for managed-node2 44071 1727204743.05129: Calling groups_inventory to load vars for managed-node2 44071 1727204743.05131: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204743.05137: Calling all_plugins_play to load vars for managed-node2 44071 1727204743.05139: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204743.05140: Calling groups_plugins_play to load vars for managed-node2 44071 1727204743.06154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204743.07373: done with get_vars() 44071 1727204743.07400: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Tuesday 24 September 2024 15:05:43 -0400 (0:00:00.396) 0:02:35.391 ***** 44071 1727204743.07469: entering _queue_task() for managed-node2/include_tasks 44071 1727204743.07853: worker is 1 (out of 1 available) 44071 1727204743.07873: exiting _queue_task() for managed-node2/include_tasks 44071 1727204743.07889: done queuing things up, now waiting for results queue to drain 44071 1727204743.07890: waiting for pending results... 44071 1727204743.08100: running TaskExecutor() for managed-node2/TASK: Test 44071 1727204743.08202: in run() - task 127b8e07-fff9-c964-7471-0000000020b1 44071 1727204743.08215: variable 'ansible_search_path' from source: unknown 44071 1727204743.08218: variable 'ansible_search_path' from source: unknown 44071 1727204743.08267: variable 'lsr_test' from source: include params 44071 1727204743.08461: variable 'lsr_test' from source: include params 44071 1727204743.08520: variable 'omit' from source: magic vars 44071 1727204743.08645: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204743.08654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204743.08667: variable 'omit' from source: magic vars 44071 1727204743.08862: variable 'ansible_distribution_major_version' from source: facts 44071 1727204743.08873: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204743.08884: variable 'item' from source: unknown 44071 1727204743.08934: variable 'item' from source: unknown 44071 1727204743.08960: variable 'item' from source: unknown 44071 1727204743.09011: variable 'item' from source: unknown 44071 1727204743.09167: dumping result to json 44071 1727204743.09171: done dumping result, returning 44071 1727204743.09173: done running TaskExecutor() for managed-node2/TASK: Test [127b8e07-fff9-c964-7471-0000000020b1] 44071 1727204743.09175: sending task result for task 127b8e07-fff9-c964-7471-0000000020b1 44071 1727204743.09216: done sending task result for task 127b8e07-fff9-c964-7471-0000000020b1 44071 1727204743.09219: WORKER PROCESS EXITING 44071 1727204743.09246: no more pending results, returning what we have 44071 1727204743.09251: in VariableManager get_vars() 44071 1727204743.09305: Calling all_inventory to load vars for managed-node2 44071 1727204743.09308: Calling groups_inventory to load vars for managed-node2 44071 1727204743.09312: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204743.09326: Calling all_plugins_play to load vars for managed-node2 
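The [Test] task above (run_test.yml:30) is an include_tasks driven by lsr_test, which the log reports as coming from include params and expanding to item=tasks/remove+down_profile.yml. A plausible form, assuming a simple loop over that list (the distribution check seen in the trace may sit on an enclosing block rather than on the task itself):

    # Hypothetical reconstruction of the Test task at run_test.yml:30.
    - name: Test
      ansible.builtin.include_tasks: "{{ item }}"
      loop: "{{ lsr_test }}"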
44071 1727204743.09330: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204743.09336: Calling groups_plugins_play to load vars for managed-node2 44071 1727204743.10445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204743.11820: done with get_vars() 44071 1727204743.11843: variable 'ansible_search_path' from source: unknown 44071 1727204743.11845: variable 'ansible_search_path' from source: unknown 44071 1727204743.11883: we have included files to process 44071 1727204743.11884: generating all_blocks data 44071 1727204743.11886: done generating all_blocks data 44071 1727204743.11891: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 44071 1727204743.11892: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 44071 1727204743.11894: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 44071 1727204743.11993: done processing included file 44071 1727204743.11995: iterating over new_blocks loaded from include file 44071 1727204743.11996: in VariableManager get_vars() 44071 1727204743.12010: done with get_vars() 44071 1727204743.12012: filtering new block on tags 44071 1727204743.12031: done filtering new block on tags 44071 1727204743.12034: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml for managed-node2 => (item=tasks/remove+down_profile.yml) 44071 1727204743.12039: extending task lists for all hosts with included blocks 44071 1727204743.12674: done extending task lists 44071 1727204743.12676: done processing included files 44071 1727204743.12676: results queue empty 44071 1727204743.12677: checking for any_errors_fatal 44071 1727204743.12678: done checking for any_errors_fatal 44071 1727204743.12679: checking for max_fail_percentage 44071 1727204743.12679: done checking for max_fail_percentage 44071 1727204743.12680: checking to see if all hosts have failed and the running result is not ok 44071 1727204743.12681: done checking to see if all hosts have failed 44071 1727204743.12681: getting the remaining hosts for this loop 44071 1727204743.12682: done getting the remaining hosts for this loop 44071 1727204743.12684: getting the next task for host managed-node2 44071 1727204743.12688: done getting next task for host managed-node2 44071 1727204743.12689: ^ task is: TASK: Include network role 44071 1727204743.12691: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 44071 1727204743.12693: getting variables 44071 1727204743.12694: in VariableManager get_vars() 44071 1727204743.12705: Calling all_inventory to load vars for managed-node2 44071 1727204743.12707: Calling groups_inventory to load vars for managed-node2 44071 1727204743.12709: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204743.12716: Calling all_plugins_play to load vars for managed-node2 44071 1727204743.12718: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204743.12720: Calling groups_plugins_play to load vars for managed-node2 44071 1727204743.13672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204743.14914: done with get_vars() 44071 1727204743.14947: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml:3 Tuesday 24 September 2024 15:05:43 -0400 (0:00:00.075) 0:02:35.466 ***** 44071 1727204743.15029: entering _queue_task() for managed-node2/include_role 44071 1727204743.15354: worker is 1 (out of 1 available) 44071 1727204743.15371: exiting _queue_task() for managed-node2/include_role 44071 1727204743.15385: done queuing things up, now waiting for results queue to drain 44071 1727204743.15387: waiting for pending results... 44071 1727204743.15586: running TaskExecutor() for managed-node2/TASK: Include network role 44071 1727204743.15688: in run() - task 127b8e07-fff9-c964-7471-000000002612 44071 1727204743.15701: variable 'ansible_search_path' from source: unknown 44071 1727204743.15705: variable 'ansible_search_path' from source: unknown 44071 1727204743.15743: calling self._execute() 44071 1727204743.15838: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204743.15842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204743.15846: variable 'omit' from source: magic vars 44071 1727204743.16167: variable 'ansible_distribution_major_version' from source: facts 44071 1727204743.16179: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204743.16182: _execute() done 44071 1727204743.16188: dumping result to json 44071 1727204743.16191: done dumping result, returning 44071 1727204743.16198: done running TaskExecutor() for managed-node2/TASK: Include network role [127b8e07-fff9-c964-7471-000000002612] 44071 1727204743.16203: sending task result for task 127b8e07-fff9-c964-7471-000000002612 44071 1727204743.16358: no more pending results, returning what we have 44071 1727204743.16364: in VariableManager get_vars() 44071 1727204743.16424: Calling all_inventory to load vars for managed-node2 44071 1727204743.16427: Calling groups_inventory to load vars for managed-node2 44071 1727204743.16431: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204743.16448: Calling all_plugins_play to load vars for managed-node2 44071 1727204743.16451: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204743.16453: Calling groups_plugins_play to load vars for managed-node2 44071 1727204743.16984: done sending task result for task 127b8e07-fff9-c964-7471-000000002612 44071 1727204743.16988: WORKER PROCESS EXITING 44071 1727204743.17711: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204743.18947: done with get_vars() 44071 1727204743.18981: variable 'ansible_search_path' from source: unknown 44071 1727204743.18983: variable 'ansible_search_path' from source: unknown 44071 1727204743.19090: variable 'omit' from source: magic vars 44071 1727204743.19121: variable 'omit' from source: magic vars 44071 1727204743.19134: variable 'omit' from source: magic vars 44071 1727204743.19137: we have included files to process 44071 1727204743.19137: generating all_blocks data 44071 1727204743.19139: done generating all_blocks data 44071 1727204743.19139: processing included file: fedora.linux_system_roles.network 44071 1727204743.19154: in VariableManager get_vars() 44071 1727204743.19169: done with get_vars() 44071 1727204743.19193: in VariableManager get_vars() 44071 1727204743.19209: done with get_vars() 44071 1727204743.19244: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 44071 1727204743.19333: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 44071 1727204743.19388: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 44071 1727204743.19727: in VariableManager get_vars() 44071 1727204743.19749: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204743.21174: iterating over new_blocks loaded from include file 44071 1727204743.21176: in VariableManager get_vars() 44071 1727204743.21193: done with get_vars() 44071 1727204743.21195: filtering new block on tags 44071 1727204743.21385: done filtering new block on tags 44071 1727204743.21389: in VariableManager get_vars() 44071 1727204743.21401: done with get_vars() 44071 1727204743.21403: filtering new block on tags 44071 1727204743.21414: done filtering new block on tags 44071 1727204743.21416: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 44071 1727204743.21420: extending task lists for all hosts with included blocks 44071 1727204743.21500: done extending task lists 44071 1727204743.21501: done processing included files 44071 1727204743.21501: results queue empty 44071 1727204743.21502: checking for any_errors_fatal 44071 1727204743.21505: done checking for any_errors_fatal 44071 1727204743.21506: checking for max_fail_percentage 44071 1727204743.21506: done checking for max_fail_percentage 44071 1727204743.21507: checking to see if all hosts have failed and the running result is not ok 44071 1727204743.21508: done checking to see if all hosts have failed 44071 1727204743.21508: getting the remaining hosts for this loop 44071 1727204743.21509: done getting the remaining hosts for this loop 44071 1727204743.21511: getting the next task for host managed-node2 44071 1727204743.21514: done getting next task for host managed-node2 44071 1727204743.21516: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204743.21519: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204743.21528: getting variables 44071 1727204743.21529: in VariableManager get_vars() 44071 1727204743.21542: Calling all_inventory to load vars for managed-node2 44071 1727204743.21544: Calling groups_inventory to load vars for managed-node2 44071 1727204743.21545: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204743.21550: Calling all_plugins_play to load vars for managed-node2 44071 1727204743.21551: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204743.21553: Calling groups_plugins_play to load vars for managed-node2 44071 1727204743.22528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204743.23781: done with get_vars() 44071 1727204743.23814: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:05:43 -0400 (0:00:00.088) 0:02:35.555 ***** 44071 1727204743.23888: entering _queue_task() for managed-node2/include_tasks 44071 1727204743.24208: worker is 1 (out of 1 available) 44071 1727204743.24224: exiting _queue_task() for managed-node2/include_tasks 44071 1727204743.24240: done queuing things up, now waiting for results queue to drain 44071 1727204743.24242: waiting for pending results... 
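
For reference, the include chain being resolved in this stretch of the log, tasks/remove+down_profile.yml pulling in the fedora.linux_system_roles.network role under the ansible_distribution_major_version != '6' conditional evaluated above, corresponds roughly to a task of the following shape. Only the task name, the include_role action, and the role name are confirmed by the log banner and paths; the rest is an illustrative sketch, not the collection's exact source.

# Sketch of the "Include network role" task at tasks/remove+down_profile.yml:3
# (approximate reconstruction from the task banner and module shown in the log;
# the conditional seen being evaluated is likely inherited from the play rather
# than written on this task)
- name: Include network role
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network
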
44071 1727204743.24455: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 44071 1727204743.24555: in run() - task 127b8e07-fff9-c964-7471-000000002694 44071 1727204743.24570: variable 'ansible_search_path' from source: unknown 44071 1727204743.24574: variable 'ansible_search_path' from source: unknown 44071 1727204743.24610: calling self._execute() 44071 1727204743.24702: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204743.24706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204743.24717: variable 'omit' from source: magic vars 44071 1727204743.25039: variable 'ansible_distribution_major_version' from source: facts 44071 1727204743.25049: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204743.25056: _execute() done 44071 1727204743.25059: dumping result to json 44071 1727204743.25061: done dumping result, returning 44071 1727204743.25071: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-c964-7471-000000002694] 44071 1727204743.25077: sending task result for task 127b8e07-fff9-c964-7471-000000002694 44071 1727204743.25182: done sending task result for task 127b8e07-fff9-c964-7471-000000002694 44071 1727204743.25185: WORKER PROCESS EXITING 44071 1727204743.25253: no more pending results, returning what we have 44071 1727204743.25259: in VariableManager get_vars() 44071 1727204743.25323: Calling all_inventory to load vars for managed-node2 44071 1727204743.25326: Calling groups_inventory to load vars for managed-node2 44071 1727204743.25328: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204743.25344: Calling all_plugins_play to load vars for managed-node2 44071 1727204743.25347: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204743.25350: Calling groups_plugins_play to load vars for managed-node2 44071 1727204743.26457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204743.27828: done with get_vars() 44071 1727204743.27853: variable 'ansible_search_path' from source: unknown 44071 1727204743.27855: variable 'ansible_search_path' from source: unknown 44071 1727204743.27891: we have included files to process 44071 1727204743.27892: generating all_blocks data 44071 1727204743.27894: done generating all_blocks data 44071 1727204743.27897: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204743.27898: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204743.27899: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 44071 1727204743.28335: done processing included file 44071 1727204743.28337: iterating over new_blocks loaded from include file 44071 1727204743.28338: in VariableManager get_vars() 44071 1727204743.28359: done with get_vars() 44071 1727204743.28360: filtering new block on tags 44071 1727204743.28385: done filtering new block on tags 44071 1727204743.28388: in VariableManager get_vars() 44071 1727204743.28404: done with get_vars() 44071 1727204743.28405: filtering new block on tags 44071 1727204743.28440: done filtering new block on tags 44071 1727204743.28442: in 
VariableManager get_vars() 44071 1727204743.28458: done with get_vars() 44071 1727204743.28460: filtering new block on tags 44071 1727204743.28488: done filtering new block on tags 44071 1727204743.28489: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 44071 1727204743.28494: extending task lists for all hosts with included blocks 44071 1727204743.29651: done extending task lists 44071 1727204743.29653: done processing included files 44071 1727204743.29654: results queue empty 44071 1727204743.29654: checking for any_errors_fatal 44071 1727204743.29657: done checking for any_errors_fatal 44071 1727204743.29658: checking for max_fail_percentage 44071 1727204743.29658: done checking for max_fail_percentage 44071 1727204743.29659: checking to see if all hosts have failed and the running result is not ok 44071 1727204743.29660: done checking to see if all hosts have failed 44071 1727204743.29660: getting the remaining hosts for this loop 44071 1727204743.29661: done getting the remaining hosts for this loop 44071 1727204743.29663: getting the next task for host managed-node2 44071 1727204743.29669: done getting next task for host managed-node2 44071 1727204743.29671: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204743.29674: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204743.29685: getting variables 44071 1727204743.29686: in VariableManager get_vars() 44071 1727204743.29701: Calling all_inventory to load vars for managed-node2 44071 1727204743.29703: Calling groups_inventory to load vars for managed-node2 44071 1727204743.29704: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204743.29709: Calling all_plugins_play to load vars for managed-node2 44071 1727204743.29711: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204743.29713: Calling groups_plugins_play to load vars for managed-node2 44071 1727204743.30638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204743.31895: done with get_vars() 44071 1727204743.31927: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:05:43 -0400 (0:00:00.081) 0:02:35.636 ***** 44071 1727204743.32001: entering _queue_task() for managed-node2/setup 44071 1727204743.32319: worker is 1 (out of 1 available) 44071 1727204743.32335: exiting _queue_task() for managed-node2/setup 44071 1727204743.32350: done queuing things up, now waiting for results queue to drain 44071 1727204743.32352: waiting for pending results... 44071 1727204743.32578: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 44071 1727204743.32697: in run() - task 127b8e07-fff9-c964-7471-0000000026eb 44071 1727204743.32711: variable 'ansible_search_path' from source: unknown 44071 1727204743.32715: variable 'ansible_search_path' from source: unknown 44071 1727204743.32751: calling self._execute() 44071 1727204743.32842: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204743.32846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204743.32856: variable 'omit' from source: magic vars 44071 1727204743.33182: variable 'ansible_distribution_major_version' from source: facts 44071 1727204743.33194: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204743.33370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204743.35260: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204743.35318: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204743.35354: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204743.35384: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204743.35404: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204743.35477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204743.35499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 44071 1727204743.35518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204743.35550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204743.35567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204743.35609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204743.35628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204743.35647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204743.35682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204743.35693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204743.35817: variable '__network_required_facts' from source: role '' defaults 44071 1727204743.35826: variable 'ansible_facts' from source: unknown 44071 1727204743.36456: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 44071 1727204743.36460: when evaluation is False, skipping this task 44071 1727204743.36463: _execute() done 44071 1727204743.36467: dumping result to json 44071 1727204743.36471: done dumping result, returning 44071 1727204743.36477: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-c964-7471-0000000026eb] 44071 1727204743.36482: sending task result for task 127b8e07-fff9-c964-7471-0000000026eb 44071 1727204743.36586: done sending task result for task 127b8e07-fff9-c964-7471-0000000026eb 44071 1727204743.36589: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204743.36642: no more pending results, returning what we have 44071 1727204743.36646: results queue empty 44071 1727204743.36647: checking for any_errors_fatal 44071 1727204743.36649: done checking for any_errors_fatal 44071 1727204743.36650: checking for max_fail_percentage 44071 1727204743.36651: done checking for max_fail_percentage 44071 1727204743.36652: checking to see if all hosts have failed and the running result is not ok 44071 1727204743.36653: done checking to see if all hosts have failed 44071 1727204743.36654: getting the remaining hosts for this loop 44071 1727204743.36655: done getting the remaining hosts for 
this loop 44071 1727204743.36660: getting the next task for host managed-node2 44071 1727204743.36678: done getting next task for host managed-node2 44071 1727204743.36682: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204743.36689: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204743.36723: getting variables 44071 1727204743.36725: in VariableManager get_vars() 44071 1727204743.36785: Calling all_inventory to load vars for managed-node2 44071 1727204743.36789: Calling groups_inventory to load vars for managed-node2 44071 1727204743.36791: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204743.36803: Calling all_plugins_play to load vars for managed-node2 44071 1727204743.36806: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204743.36816: Calling groups_plugins_play to load vars for managed-node2 44071 1727204743.38020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204743.39385: done with get_vars() 44071 1727204743.39422: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:05:43 -0400 (0:00:00.075) 0:02:35.711 ***** 44071 1727204743.39511: entering _queue_task() for managed-node2/stat 44071 1727204743.39835: worker is 1 (out of 1 available) 44071 1727204743.39852: exiting _queue_task() for managed-node2/stat 44071 1727204743.39869: done queuing things up, now waiting for results queue to drain 44071 1727204743.39871: waiting for pending results... 
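
The skip and the censored result above come from the fact-gathering guard at roles/network/tasks/set_facts.yml:3: its when-expression compares the role's __network_required_facts list against the facts already cached for the host, and because nothing is missing the setup module never runs, while no_log hides even the skipped result. A minimal sketch of that task follows; the task name, the no_log setting, and the when-expression are taken from the log, whereas the gather_subset argument is only an assumption for illustration.

# Approximate shape of the guard task at roles/network/tasks/set_facts.yml:3
- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
    gather_subset: min          # assumed for illustration
  no_log: true                  # matches the "output has been hidden" message above
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
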
44071 1727204743.40294: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 44071 1727204743.40314: in run() - task 127b8e07-fff9-c964-7471-0000000026ed 44071 1727204743.40340: variable 'ansible_search_path' from source: unknown 44071 1727204743.40348: variable 'ansible_search_path' from source: unknown 44071 1727204743.40406: calling self._execute() 44071 1727204743.40534: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204743.40605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204743.40609: variable 'omit' from source: magic vars 44071 1727204743.41027: variable 'ansible_distribution_major_version' from source: facts 44071 1727204743.41059: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204743.41282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204743.41628: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204743.41693: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204743.41742: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204743.41787: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204743.41910: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204743.42005: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204743.42009: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204743.42011: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204743.42132: variable '__network_is_ostree' from source: set_fact 44071 1727204743.42136: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204743.42139: when evaluation is False, skipping this task 44071 1727204743.42143: _execute() done 44071 1727204743.42154: dumping result to json 44071 1727204743.42157: done dumping result, returning 44071 1727204743.42165: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-c964-7471-0000000026ed] 44071 1727204743.42172: sending task result for task 127b8e07-fff9-c964-7471-0000000026ed 44071 1727204743.42277: done sending task result for task 127b8e07-fff9-c964-7471-0000000026ed 44071 1727204743.42281: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204743.42338: no more pending results, returning what we have 44071 1727204743.42347: results queue empty 44071 1727204743.42348: checking for any_errors_fatal 44071 1727204743.42361: done checking for any_errors_fatal 44071 1727204743.42362: checking for 
max_fail_percentage 44071 1727204743.42364: done checking for max_fail_percentage 44071 1727204743.42366: checking to see if all hosts have failed and the running result is not ok 44071 1727204743.42367: done checking to see if all hosts have failed 44071 1727204743.42368: getting the remaining hosts for this loop 44071 1727204743.42370: done getting the remaining hosts for this loop 44071 1727204743.42376: getting the next task for host managed-node2 44071 1727204743.42386: done getting next task for host managed-node2 44071 1727204743.42390: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204743.42396: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204743.42427: getting variables 44071 1727204743.42429: in VariableManager get_vars() 44071 1727204743.42491: Calling all_inventory to load vars for managed-node2 44071 1727204743.42494: Calling groups_inventory to load vars for managed-node2 44071 1727204743.42496: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204743.42508: Calling all_plugins_play to load vars for managed-node2 44071 1727204743.42510: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204743.42513: Calling groups_plugins_play to load vars for managed-node2 44071 1727204743.43744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204743.53618: done with get_vars() 44071 1727204743.53669: done getting variables 44071 1727204743.53726: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:05:43 -0400 (0:00:00.142) 0:02:35.854 ***** 44071 1727204743.53769: entering _queue_task() for managed-node2/set_fact 44071 1727204743.54201: worker is 1 (out of 1 available) 44071 1727204743.54219: exiting _queue_task() for managed-node2/set_fact 44071 1727204743.54238: done queuing things up, now waiting for results queue to drain 44071 1727204743.54240: waiting for pending results... 44071 1727204743.54534: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 44071 1727204743.54776: in run() - task 127b8e07-fff9-c964-7471-0000000026ee 44071 1727204743.54782: variable 'ansible_search_path' from source: unknown 44071 1727204743.54784: variable 'ansible_search_path' from source: unknown 44071 1727204743.54809: calling self._execute() 44071 1727204743.54926: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204743.54955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204743.54959: variable 'omit' from source: magic vars 44071 1727204743.55572: variable 'ansible_distribution_major_version' from source: facts 44071 1727204743.55577: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204743.55623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204743.55949: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204743.56010: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204743.56128: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204743.56239: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204743.56287: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204743.56321: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204743.56363: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204743.56399: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204743.56527: variable '__network_is_ostree' from source: set_fact 44071 1727204743.56542: Evaluated conditional (not __network_is_ostree is defined): False 44071 1727204743.56553: when evaluation is False, skipping this task 44071 1727204743.56566: _execute() done 44071 1727204743.56577: dumping result to json 44071 1727204743.56584: done dumping result, returning 44071 1727204743.56674: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-c964-7471-0000000026ee] 44071 1727204743.56678: sending task result for task 127b8e07-fff9-c964-7471-0000000026ee 44071 1727204743.56764: done sending task result for task 127b8e07-fff9-c964-7471-0000000026ee 44071 1727204743.56772: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 44071 1727204743.56826: no more pending results, returning what we have 44071 1727204743.56830: results queue empty 44071 1727204743.56833: checking for any_errors_fatal 44071 1727204743.56841: done checking for any_errors_fatal 44071 1727204743.56842: checking for max_fail_percentage 44071 1727204743.56844: done checking for max_fail_percentage 44071 1727204743.56845: checking to see if all hosts have failed and the running result is not ok 44071 1727204743.56846: done checking to see if all hosts have failed 44071 1727204743.56846: getting the remaining hosts for this loop 44071 1727204743.56848: done getting the remaining hosts for this loop 44071 1727204743.56853: getting the next task for host managed-node2 44071 1727204743.56866: done getting next task for host managed-node2 44071 1727204743.56871: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204743.56878: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204743.56912: getting variables 44071 1727204743.56914: in VariableManager get_vars() 44071 1727204743.57158: Calling all_inventory to load vars for managed-node2 44071 1727204743.57161: Calling groups_inventory to load vars for managed-node2 44071 1727204743.57165: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204743.57181: Calling all_plugins_play to load vars for managed-node2 44071 1727204743.57184: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204743.57188: Calling groups_plugins_play to load vars for managed-node2 44071 1727204743.59127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204743.61359: done with get_vars() 44071 1727204743.61393: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:05:43 -0400 (0:00:00.077) 0:02:35.931 ***** 44071 1727204743.61479: entering _queue_task() for managed-node2/service_facts 44071 1727204743.61791: worker is 1 (out of 1 available) 44071 1727204743.61807: exiting _queue_task() for managed-node2/service_facts 44071 1727204743.61822: done queuing things up, now waiting for results queue to drain 44071 1727204743.61824: waiting for pending results... 
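
Both ostree checks above are skipped because __network_is_ostree was already established by an earlier set_fact (the log reports its source as "set_fact"), so "not __network_is_ostree is defined" evaluates False; the "Check which services are running" task queued next is the one that actually opens the SSH session in the output below. A rough reconstruction of these three tasks from set_facts.yml:12, :17 and :21 follows; the task names, modules and when-expression come from the log, while the stat path, the register name and the fact expression are assumptions made for illustration.

# Hypothetical reconstruction of the ostree-detection pair and the service scan
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted              # assumed marker file
  register: __ostree_booted_stat          # assumed register name
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # assumed expression
  when: not __network_is_ostree is defined

- name: Check which services are running
  ansible.builtin.service_facts:
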
44071 1727204743.62032: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 44071 1727204743.62151: in run() - task 127b8e07-fff9-c964-7471-0000000026f0 44071 1727204743.62168: variable 'ansible_search_path' from source: unknown 44071 1727204743.62172: variable 'ansible_search_path' from source: unknown 44071 1727204743.62208: calling self._execute() 44071 1727204743.62306: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204743.62314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204743.62322: variable 'omit' from source: magic vars 44071 1727204743.62662: variable 'ansible_distribution_major_version' from source: facts 44071 1727204743.62687: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204743.62712: variable 'omit' from source: magic vars 44071 1727204743.62778: variable 'omit' from source: magic vars 44071 1727204743.62872: variable 'omit' from source: magic vars 44071 1727204743.62875: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204743.62910: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204743.62938: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204743.63008: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204743.63012: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204743.63015: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204743.63017: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204743.63019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204743.63117: Set connection var ansible_connection to ssh 44071 1727204743.63121: Set connection var ansible_timeout to 10 44071 1727204743.63142: Set connection var ansible_pipelining to False 44071 1727204743.63146: Set connection var ansible_shell_type to sh 44071 1727204743.63149: Set connection var ansible_shell_executable to /bin/sh 44071 1727204743.63171: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204743.63226: variable 'ansible_shell_executable' from source: unknown 44071 1727204743.63233: variable 'ansible_connection' from source: unknown 44071 1727204743.63237: variable 'ansible_module_compression' from source: unknown 44071 1727204743.63247: variable 'ansible_shell_type' from source: unknown 44071 1727204743.63250: variable 'ansible_shell_executable' from source: unknown 44071 1727204743.63253: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204743.63255: variable 'ansible_pipelining' from source: unknown 44071 1727204743.63258: variable 'ansible_timeout' from source: unknown 44071 1727204743.63260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204743.63554: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204743.63559: variable 'omit' from source: magic vars 44071 
1727204743.63562: starting attempt loop 44071 1727204743.63565: running the handler 44071 1727204743.63573: _low_level_execute_command(): starting 44071 1727204743.63576: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204743.64333: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204743.64339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204743.64344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204743.64398: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204743.64406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204743.64489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204743.66281: stdout chunk (state=3): >>>/root <<< 44071 1727204743.66508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204743.66512: stdout chunk (state=3): >>><<< 44071 1727204743.66515: stderr chunk (state=3): >>><<< 44071 1727204743.66544: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204743.66673: _low_level_execute_command(): starting 44071 1727204743.66677: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204743.665533-53134-136277637079542 `" && echo 
ansible-tmp-1727204743.665533-53134-136277637079542="` echo /root/.ansible/tmp/ansible-tmp-1727204743.665533-53134-136277637079542 `" ) && sleep 0' 44071 1727204743.67326: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204743.67353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204743.67464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204743.67535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204743.67555: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204743.67558: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204743.67655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204743.69648: stdout chunk (state=3): >>>ansible-tmp-1727204743.665533-53134-136277637079542=/root/.ansible/tmp/ansible-tmp-1727204743.665533-53134-136277637079542 <<< 44071 1727204743.69797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204743.69890: stderr chunk (state=3): >>><<< 44071 1727204743.69921: stdout chunk (state=3): >>><<< 44071 1727204743.69984: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204743.665533-53134-136277637079542=/root/.ansible/tmp/ansible-tmp-1727204743.665533-53134-136277637079542 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204743.70027: variable 'ansible_module_compression' from source: unknown 44071 1727204743.70082: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 44071 1727204743.70148: variable 'ansible_facts' from source: unknown 44071 1727204743.70263: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204743.665533-53134-136277637079542/AnsiballZ_service_facts.py 44071 1727204743.70495: Sending initial data 44071 1727204743.70498: Sent initial data (161 bytes) 44071 1727204743.71160: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204743.71192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204743.71298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204743.72905: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204743.72970: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204743.73037: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpgu5t1ytm /root/.ansible/tmp/ansible-tmp-1727204743.665533-53134-136277637079542/AnsiballZ_service_facts.py <<< 44071 1727204743.73046: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204743.665533-53134-136277637079542/AnsiballZ_service_facts.py" <<< 44071 1727204743.73107: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpgu5t1ytm" to remote "/root/.ansible/tmp/ansible-tmp-1727204743.665533-53134-136277637079542/AnsiballZ_service_facts.py" <<< 44071 1727204743.73115: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204743.665533-53134-136277637079542/AnsiballZ_service_facts.py" <<< 44071 1727204743.73922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204743.74089: stderr chunk (state=3): >>><<< 44071 1727204743.74094: stdout chunk (state=3): >>><<< 44071 1727204743.74096: done transferring module to remote 44071 1727204743.74098: _low_level_execute_command(): starting 44071 1727204743.74101: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204743.665533-53134-136277637079542/ /root/.ansible/tmp/ansible-tmp-1727204743.665533-53134-136277637079542/AnsiballZ_service_facts.py && sleep 0' 44071 1727204743.74816: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204743.74929: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204743.74977: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204743.75047: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204743.76876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204743.76938: stderr chunk (state=3): >>><<< 44071 1727204743.76943: stdout chunk (state=3): >>><<< 44071 1727204743.76960: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204743.76970: _low_level_execute_command(): starting 44071 1727204743.76973: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204743.665533-53134-136277637079542/AnsiballZ_service_facts.py && sleep 0' 44071 1727204743.77487: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204743.77492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204743.77497: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204743.77499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204743.77552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204743.77556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204743.77560: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204743.77644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204745.99355: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": 
{"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.servi<<< 44071 1727204745.99377: stdout chunk (state=3): >>>ce", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": 
"systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": 
"blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": 
{"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": 
{"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 44071 1727204746.00918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204746.00983: stderr chunk (state=3): >>><<< 44071 1727204746.00987: stdout chunk (state=3): >>><<< 44071 1727204746.01014: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": 
{"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": 
"systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": 
"systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
44071 1727204746.01974: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204743.665533-53134-136277637079542/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204746.01980: _low_level_execute_command(): starting 44071 1727204746.01986: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204743.665533-53134-136277637079542/ > /dev/null 2>&1 && sleep 0' 44071 1727204746.02500: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204746.02505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204746.02508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204746.02570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204746.02574: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204746.02583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204746.02649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204746.04678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204746.04703: stderr chunk (state=3): >>><<< 44071 1727204746.04715: stdout chunk (state=3): >>><<< 44071 1727204746.04742: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204746.04758: handler run complete 44071 1727204746.05073: variable 'ansible_facts' from source: unknown 44071 1727204746.05269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204746.05906: variable 'ansible_facts' from source: unknown 44071 1727204746.06087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204746.06380: attempt loop complete, returning result 44071 1727204746.06434: _execute() done 44071 1727204746.06438: dumping result to json 44071 1727204746.06488: done dumping result, returning 44071 1727204746.06506: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-c964-7471-0000000026f0] 44071 1727204746.06517: sending task result for task 127b8e07-fff9-c964-7471-0000000026f0 44071 1727204746.08398: done sending task result for task 127b8e07-fff9-c964-7471-0000000026f0 44071 1727204746.08404: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204746.08526: no more pending results, returning what we have 44071 1727204746.08529: results queue empty 44071 1727204746.08530: checking for any_errors_fatal 44071 1727204746.08535: done checking for any_errors_fatal 44071 1727204746.08535: checking for max_fail_percentage 44071 1727204746.08537: done checking for max_fail_percentage 44071 1727204746.08538: checking to see if all hosts have failed and the running result is not ok 44071 1727204746.08539: done checking to see if all hosts have failed 44071 1727204746.08540: getting the remaining hosts for this loop 44071 1727204746.08541: done getting the remaining hosts for this loop 44071 1727204746.08545: getting the next task for host managed-node2 44071 1727204746.08552: done getting next task for host managed-node2 44071 1727204746.08557: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204746.08569: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204746.08584: getting variables 44071 1727204746.08585: in VariableManager get_vars() 44071 1727204746.08624: Calling all_inventory to load vars for managed-node2 44071 1727204746.08627: Calling groups_inventory to load vars for managed-node2 44071 1727204746.08630: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204746.08640: Calling all_plugins_play to load vars for managed-node2 44071 1727204746.08643: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204746.08653: Calling groups_plugins_play to load vars for managed-node2 44071 1727204746.09793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204746.11042: done with get_vars() 44071 1727204746.11075: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:05:46 -0400 (0:00:02.496) 0:02:38.428 ***** 44071 1727204746.11156: entering _queue_task() for managed-node2/package_facts 44071 1727204746.11467: worker is 1 (out of 1 available) 44071 1727204746.11484: exiting _queue_task() for managed-node2/package_facts 44071 1727204746.11500: done queuing things up, now waiting for results queue to drain 44071 1727204746.11501: waiting for pending results... 
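The task just queued is defined at roles/network/tasks/set_facts.yml:26 and runs the package_facts module. A minimal sketch of what such a task could look like follows; it is a hypothetical illustration based only on the module name and the task title in the banner above, not the collection's actual source:

    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto   # assumption: default backend selection; the entries returned below carry "source": "rpm" on this Fedora node

With the rpm backend, the module returns one list per package name, each entry holding name, version, release, epoch, and arch, which is exactly the shape of the stdout chunks that follow.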
44071 1727204746.11720: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 44071 1727204746.11847: in run() - task 127b8e07-fff9-c964-7471-0000000026f1 44071 1727204746.11858: variable 'ansible_search_path' from source: unknown 44071 1727204746.11861: variable 'ansible_search_path' from source: unknown 44071 1727204746.11896: calling self._execute() 44071 1727204746.11995: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204746.12000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204746.12009: variable 'omit' from source: magic vars 44071 1727204746.12369: variable 'ansible_distribution_major_version' from source: facts 44071 1727204746.12383: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204746.12388: variable 'omit' from source: magic vars 44071 1727204746.12460: variable 'omit' from source: magic vars 44071 1727204746.12488: variable 'omit' from source: magic vars 44071 1727204746.12529: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204746.12561: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204746.12581: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204746.12596: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204746.12612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204746.12641: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204746.12645: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204746.12648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204746.12730: Set connection var ansible_connection to ssh 44071 1727204746.12733: Set connection var ansible_timeout to 10 44071 1727204746.12741: Set connection var ansible_pipelining to False 44071 1727204746.12747: Set connection var ansible_shell_type to sh 44071 1727204746.12752: Set connection var ansible_shell_executable to /bin/sh 44071 1727204746.12759: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204746.12781: variable 'ansible_shell_executable' from source: unknown 44071 1727204746.12784: variable 'ansible_connection' from source: unknown 44071 1727204746.12787: variable 'ansible_module_compression' from source: unknown 44071 1727204746.12790: variable 'ansible_shell_type' from source: unknown 44071 1727204746.12792: variable 'ansible_shell_executable' from source: unknown 44071 1727204746.12794: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204746.12799: variable 'ansible_pipelining' from source: unknown 44071 1727204746.12802: variable 'ansible_timeout' from source: unknown 44071 1727204746.12806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204746.12979: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204746.12989: variable 'omit' from source: magic vars 44071 
1727204746.12994: starting attempt loop 44071 1727204746.12997: running the handler 44071 1727204746.13011: _low_level_execute_command(): starting 44071 1727204746.13018: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204746.13599: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204746.13606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204746.13610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204746.13663: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204746.13669: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204746.13675: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204746.13747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204746.15422: stdout chunk (state=3): >>>/root <<< 44071 1727204746.15518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204746.15588: stderr chunk (state=3): >>><<< 44071 1727204746.15592: stdout chunk (state=3): >>><<< 44071 1727204746.15617: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204746.15631: _low_level_execute_command(): starting 44071 1727204746.15635: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204746.1561406-53188-235653541979941 
`" && echo ansible-tmp-1727204746.1561406-53188-235653541979941="` echo /root/.ansible/tmp/ansible-tmp-1727204746.1561406-53188-235653541979941 `" ) && sleep 0' 44071 1727204746.16150: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204746.16154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204746.16157: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204746.16171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204746.16174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204746.16219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204746.16228: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204746.16231: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204746.16297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204746.18280: stdout chunk (state=3): >>>ansible-tmp-1727204746.1561406-53188-235653541979941=/root/.ansible/tmp/ansible-tmp-1727204746.1561406-53188-235653541979941 <<< 44071 1727204746.18388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204746.18449: stderr chunk (state=3): >>><<< 44071 1727204746.18452: stdout chunk (state=3): >>><<< 44071 1727204746.18470: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204746.1561406-53188-235653541979941=/root/.ansible/tmp/ansible-tmp-1727204746.1561406-53188-235653541979941 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 
1727204746.18523: variable 'ansible_module_compression' from source: unknown 44071 1727204746.18564: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 44071 1727204746.18625: variable 'ansible_facts' from source: unknown 44071 1727204746.18746: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204746.1561406-53188-235653541979941/AnsiballZ_package_facts.py 44071 1727204746.18875: Sending initial data 44071 1727204746.18879: Sent initial data (162 bytes) 44071 1727204746.19380: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204746.19384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204746.19387: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204746.19390: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204746.19392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204746.19444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204746.19448: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204746.19450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204746.19530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204746.21137: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204746.21222: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204746.21317: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpk1yyfd09 /root/.ansible/tmp/ansible-tmp-1727204746.1561406-53188-235653541979941/AnsiballZ_package_facts.py <<< 44071 1727204746.21321: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204746.1561406-53188-235653541979941/AnsiballZ_package_facts.py" <<< 44071 1727204746.21389: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpk1yyfd09" to remote "/root/.ansible/tmp/ansible-tmp-1727204746.1561406-53188-235653541979941/AnsiballZ_package_facts.py" <<< 44071 1727204746.21397: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204746.1561406-53188-235653541979941/AnsiballZ_package_facts.py" <<< 44071 1727204746.22613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204746.22693: stderr chunk (state=3): >>><<< 44071 1727204746.22697: stdout chunk (state=3): >>><<< 44071 1727204746.22718: done transferring module to remote 44071 1727204746.22729: _low_level_execute_command(): starting 44071 1727204746.22737: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204746.1561406-53188-235653541979941/ /root/.ansible/tmp/ansible-tmp-1727204746.1561406-53188-235653541979941/AnsiballZ_package_facts.py && sleep 0' 44071 1727204746.23235: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204746.23239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44071 1727204746.23243: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204746.23249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204746.23297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204746.23301: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204746.23307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204746.23387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204746.25198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204746.25261: stderr chunk (state=3): >>><<< 44071 1727204746.25265: stdout chunk (state=3): >>><<< 44071 1727204746.25279: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204746.25282: _low_level_execute_command(): starting 44071 1727204746.25288: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204746.1561406-53188-235653541979941/AnsiballZ_package_facts.py && sleep 0' 44071 1727204746.25803: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204746.25807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204746.25809: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204746.25814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204746.25874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204746.25880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204746.25956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204746.88698: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": 
[{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": 
"gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"na<<< 44071 1727204746.88721: stdout chunk (state=3): >>>me": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40",<<< 44071 1727204746.88783: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", 
"release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source"<<< 44071 1727204746.88802: stdout chunk (state=3): >>>: "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": 
"libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1"<<< 44071 1727204746.88872: stdout chunk (state=3): >>>, "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", 
"release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "<<< 44071 1727204746.88881: stdout chunk (state=3): >>>rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": 
"perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", 
"version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50<<< 44071 1727204746.88907: stdout chunk (state=3): >>>, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", 
"release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "s<<< 44071 1727204746.88936: stdout chunk (state=3): >>>ource": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-t<<< 44071 1727204746.88940: stdout chunk (state=3): >>>ools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 44071 1727204746.90789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204746.90850: stderr chunk (state=3): >>><<< 44071 1727204746.90853: stdout chunk (state=3): >>><<< 44071 1727204746.90899: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": 
"libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", 
"release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": 
[{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": 
"3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", 
"version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", 
"version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": 
"1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": 
"wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": 
"python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"network-scripts": [{"name": "network-scripts", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204746.92744: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204746.1561406-53188-235653541979941/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204746.92760: _low_level_execute_command(): starting 44071 1727204746.92766: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204746.1561406-53188-235653541979941/ > /dev/null 2>&1 && sleep 0' 44071 1727204746.93273: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204746.93278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204746.93281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204746.93283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 44071 1727204746.93344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204746.93348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204746.93350: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204746.93413: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204746.95386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204746.95442: stderr chunk (state=3): >>><<< 44071 1727204746.95446: stdout chunk (state=3): >>><<< 44071 1727204746.95459: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204746.95467: handler run complete 44071 1727204746.96165: variable 'ansible_facts' from source: unknown 44071 1727204746.96537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204746.98046: variable 'ansible_facts' from source: unknown 44071 1727204746.98406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204746.98976: attempt loop complete, returning result 44071 1727204746.98990: _execute() done 44071 1727204746.98993: dumping result to json 44071 1727204746.99149: done dumping result, returning 44071 1727204746.99157: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-c964-7471-0000000026f1] 44071 1727204746.99162: sending task result for task 127b8e07-fff9-c964-7471-0000000026f1 44071 1727204747.01182: done sending task result for task 127b8e07-fff9-c964-7471-0000000026f1 44071 1727204747.01186: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204747.01304: no more pending results, returning what we have 44071 1727204747.01306: results queue empty 44071 1727204747.01307: checking for any_errors_fatal 44071 1727204747.01311: done checking for any_errors_fatal 44071 1727204747.01311: checking for max_fail_percentage 44071 1727204747.01312: done checking for max_fail_percentage 44071 1727204747.01313: checking to see if all hosts have failed and the running result 
is not ok 44071 1727204747.01314: done checking to see if all hosts have failed 44071 1727204747.01314: getting the remaining hosts for this loop 44071 1727204747.01315: done getting the remaining hosts for this loop 44071 1727204747.01318: getting the next task for host managed-node2 44071 1727204747.01323: done getting next task for host managed-node2 44071 1727204747.01326: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204747.01331: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204747.01343: getting variables 44071 1727204747.01344: in VariableManager get_vars() 44071 1727204747.01375: Calling all_inventory to load vars for managed-node2 44071 1727204747.01377: Calling groups_inventory to load vars for managed-node2 44071 1727204747.01379: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204747.01387: Calling all_plugins_play to load vars for managed-node2 44071 1727204747.01389: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204747.01391: Calling groups_plugins_play to load vars for managed-node2 44071 1727204747.02323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204747.03657: done with get_vars() 44071 1727204747.03684: done getting variables 44071 1727204747.03744: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:05:47 -0400 (0:00:00.926) 0:02:39.354 ***** 44071 1727204747.03776: entering _queue_task() for managed-node2/debug 44071 1727204747.04083: worker is 1 (out of 1 available) 44071 1727204747.04099: exiting _queue_task() for managed-node2/debug 44071 1727204747.04114: done queuing things up, now waiting for results queue to drain 44071 1727204747.04116: waiting for pending results... 
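
Editor's note: the result above comes from the package_facts module, invoked with module_args {"manager": ["auto"], "strategy": "first"} and no_log enabled, which is why the task result is reported as "censored" even though the verbose stderr echo earlier in the log still shows the raw package dictionary. A minimal stand-alone sketch of that pattern follows; it is reconstructed from the logged arguments, not copied from the collection's task file, and the openssl lookup afterwards is purely illustrative (openssl 3.2.2 appears in the logged package dump).

- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager:
      - auto          # logged module_args: manager=["auto"]
    strategy: first   # logged module_args: strategy="first"
  no_log: true        # matches the "censored ... no_log: true" result above

- name: Show the installed openssl version (illustrative follow-up, not part of the role)
  ansible.builtin.debug:
    msg: "openssl {{ ansible_facts.packages['openssl'][0].version }} is installed"
  when: "'openssl' in ansible_facts.packages"
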
44071 1727204747.04345: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 44071 1727204747.04470: in run() - task 127b8e07-fff9-c964-7471-000000002695 44071 1727204747.04483: variable 'ansible_search_path' from source: unknown 44071 1727204747.04488: variable 'ansible_search_path' from source: unknown 44071 1727204747.04524: calling self._execute() 44071 1727204747.04616: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204747.04620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204747.04631: variable 'omit' from source: magic vars 44071 1727204747.04977: variable 'ansible_distribution_major_version' from source: facts 44071 1727204747.04988: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204747.04994: variable 'omit' from source: magic vars 44071 1727204747.05051: variable 'omit' from source: magic vars 44071 1727204747.05136: variable 'network_provider' from source: set_fact 44071 1727204747.05152: variable 'omit' from source: magic vars 44071 1727204747.05192: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204747.05225: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204747.05245: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204747.05259: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204747.05273: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204747.05302: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204747.05305: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204747.05308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204747.05389: Set connection var ansible_connection to ssh 44071 1727204747.05393: Set connection var ansible_timeout to 10 44071 1727204747.05402: Set connection var ansible_pipelining to False 44071 1727204747.05405: Set connection var ansible_shell_type to sh 44071 1727204747.05411: Set connection var ansible_shell_executable to /bin/sh 44071 1727204747.05418: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204747.05442: variable 'ansible_shell_executable' from source: unknown 44071 1727204747.05446: variable 'ansible_connection' from source: unknown 44071 1727204747.05449: variable 'ansible_module_compression' from source: unknown 44071 1727204747.05451: variable 'ansible_shell_type' from source: unknown 44071 1727204747.05454: variable 'ansible_shell_executable' from source: unknown 44071 1727204747.05456: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204747.05460: variable 'ansible_pipelining' from source: unknown 44071 1727204747.05467: variable 'ansible_timeout' from source: unknown 44071 1727204747.05471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204747.05592: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 44071 1727204747.05602: variable 'omit' from source: magic vars 44071 1727204747.05608: starting attempt loop 44071 1727204747.05613: running the handler 44071 1727204747.05657: handler run complete 44071 1727204747.05672: attempt loop complete, returning result 44071 1727204747.05675: _execute() done 44071 1727204747.05678: dumping result to json 44071 1727204747.05680: done dumping result, returning 44071 1727204747.05689: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-c964-7471-000000002695] 44071 1727204747.05693: sending task result for task 127b8e07-fff9-c964-7471-000000002695 44071 1727204747.05787: done sending task result for task 127b8e07-fff9-c964-7471-000000002695 44071 1727204747.05790: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 44071 1727204747.05879: no more pending results, returning what we have 44071 1727204747.05882: results queue empty 44071 1727204747.05883: checking for any_errors_fatal 44071 1727204747.05896: done checking for any_errors_fatal 44071 1727204747.05897: checking for max_fail_percentage 44071 1727204747.05905: done checking for max_fail_percentage 44071 1727204747.05906: checking to see if all hosts have failed and the running result is not ok 44071 1727204747.05907: done checking to see if all hosts have failed 44071 1727204747.05908: getting the remaining hosts for this loop 44071 1727204747.05909: done getting the remaining hosts for this loop 44071 1727204747.05914: getting the next task for host managed-node2 44071 1727204747.05922: done getting next task for host managed-node2 44071 1727204747.05926: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204747.05932: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204747.05947: getting variables 44071 1727204747.05948: in VariableManager get_vars() 44071 1727204747.05996: Calling all_inventory to load vars for managed-node2 44071 1727204747.05999: Calling groups_inventory to load vars for managed-node2 44071 1727204747.06001: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204747.06021: Calling all_plugins_play to load vars for managed-node2 44071 1727204747.06024: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204747.06027: Calling groups_plugins_play to load vars for managed-node2 44071 1727204747.07073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204747.08307: done with get_vars() 44071 1727204747.08342: done getting variables 44071 1727204747.08394: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:05:47 -0400 (0:00:00.046) 0:02:39.400 ***** 44071 1727204747.08433: entering _queue_task() for managed-node2/fail 44071 1727204747.08748: worker is 1 (out of 1 available) 44071 1727204747.08763: exiting _queue_task() for managed-node2/fail 44071 1727204747.08780: done queuing things up, now waiting for results queue to drain 44071 1727204747.08782: waiting for pending results... 
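
Editor's note: the "Print network provider" task above resolves network_provider (set earlier via set_fact, per the log) and prints "Using network provider: nm". A debug task producing that message would look roughly like the sketch below; treat it as an approximation rather than the literal contents of roles/network/tasks/main.yml:7.

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
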
44071 1727204747.08993: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 44071 1727204747.09107: in run() - task 127b8e07-fff9-c964-7471-000000002696 44071 1727204747.09272: variable 'ansible_search_path' from source: unknown 44071 1727204747.09276: variable 'ansible_search_path' from source: unknown 44071 1727204747.09279: calling self._execute() 44071 1727204747.09300: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204747.09314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204747.09329: variable 'omit' from source: magic vars 44071 1727204747.09784: variable 'ansible_distribution_major_version' from source: facts 44071 1727204747.09806: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204747.09952: variable 'network_state' from source: role '' defaults 44071 1727204747.09975: Evaluated conditional (network_state != {}): False 44071 1727204747.09986: when evaluation is False, skipping this task 44071 1727204747.09994: _execute() done 44071 1727204747.10004: dumping result to json 44071 1727204747.10015: done dumping result, returning 44071 1727204747.10032: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-c964-7471-000000002696] 44071 1727204747.10046: sending task result for task 127b8e07-fff9-c964-7471-000000002696 44071 1727204747.10327: done sending task result for task 127b8e07-fff9-c964-7471-000000002696 44071 1727204747.10331: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204747.10396: no more pending results, returning what we have 44071 1727204747.10401: results queue empty 44071 1727204747.10402: checking for any_errors_fatal 44071 1727204747.10413: done checking for any_errors_fatal 44071 1727204747.10414: checking for max_fail_percentage 44071 1727204747.10416: done checking for max_fail_percentage 44071 1727204747.10417: checking to see if all hosts have failed and the running result is not ok 44071 1727204747.10418: done checking to see if all hosts have failed 44071 1727204747.10418: getting the remaining hosts for this loop 44071 1727204747.10420: done getting the remaining hosts for this loop 44071 1727204747.10426: getting the next task for host managed-node2 44071 1727204747.10437: done getting next task for host managed-node2 44071 1727204747.10442: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204747.10451: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204747.10494: getting variables 44071 1727204747.10496: in VariableManager get_vars() 44071 1727204747.10559: Calling all_inventory to load vars for managed-node2 44071 1727204747.10563: Calling groups_inventory to load vars for managed-node2 44071 1727204747.10770: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204747.10782: Calling all_plugins_play to load vars for managed-node2 44071 1727204747.10786: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204747.10789: Calling groups_plugins_play to load vars for managed-node2 44071 1727204747.12789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204747.14690: done with get_vars() 44071 1727204747.14724: done getting variables 44071 1727204747.14778: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:05:47 -0400 (0:00:00.063) 0:02:39.464 ***** 44071 1727204747.14809: entering _queue_task() for managed-node2/fail 44071 1727204747.15117: worker is 1 (out of 1 available) 44071 1727204747.15134: exiting _queue_task() for managed-node2/fail 44071 1727204747.15149: done queuing things up, now waiting for results queue to drain 44071 1727204747.15150: waiting for pending results... 
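
Editor's note: the skip above is driven entirely by the guard network_state != {}; the role default for network_state is empty on this run, so the fail task never executes. A hedged sketch of that guard pattern follows. Only the first condition is confirmed by the log (Ansible stops evaluating at the first false condition); the provider check is an assumption inferred from the task name, and the failure message is invented.

- name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying the network state configuration is not supported by the initscripts provider
  when:
    - network_state != {}                  # logged: evaluated to False, so the task is skipped
    - network_provider == "initscripts"    # assumed second guard, inferred from the task name only
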
44071 1727204747.15367: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 44071 1727204747.15481: in run() - task 127b8e07-fff9-c964-7471-000000002697 44071 1727204747.15498: variable 'ansible_search_path' from source: unknown 44071 1727204747.15502: variable 'ansible_search_path' from source: unknown 44071 1727204747.15533: calling self._execute() 44071 1727204747.15626: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204747.15634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204747.15642: variable 'omit' from source: magic vars 44071 1727204747.15975: variable 'ansible_distribution_major_version' from source: facts 44071 1727204747.15986: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204747.16086: variable 'network_state' from source: role '' defaults 44071 1727204747.16097: Evaluated conditional (network_state != {}): False 44071 1727204747.16101: when evaluation is False, skipping this task 44071 1727204747.16104: _execute() done 44071 1727204747.16107: dumping result to json 44071 1727204747.16110: done dumping result, returning 44071 1727204747.16118: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-c964-7471-000000002697] 44071 1727204747.16123: sending task result for task 127b8e07-fff9-c964-7471-000000002697 44071 1727204747.16235: done sending task result for task 127b8e07-fff9-c964-7471-000000002697 44071 1727204747.16239: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204747.16303: no more pending results, returning what we have 44071 1727204747.16309: results queue empty 44071 1727204747.16310: checking for any_errors_fatal 44071 1727204747.16321: done checking for any_errors_fatal 44071 1727204747.16322: checking for max_fail_percentage 44071 1727204747.16323: done checking for max_fail_percentage 44071 1727204747.16324: checking to see if all hosts have failed and the running result is not ok 44071 1727204747.16325: done checking to see if all hosts have failed 44071 1727204747.16326: getting the remaining hosts for this loop 44071 1727204747.16328: done getting the remaining hosts for this loop 44071 1727204747.16332: getting the next task for host managed-node2 44071 1727204747.16343: done getting next task for host managed-node2 44071 1727204747.16348: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204747.16354: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204747.16386: getting variables 44071 1727204747.16388: in VariableManager get_vars() 44071 1727204747.16436: Calling all_inventory to load vars for managed-node2 44071 1727204747.16439: Calling groups_inventory to load vars for managed-node2 44071 1727204747.16441: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204747.16452: Calling all_plugins_play to load vars for managed-node2 44071 1727204747.16455: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204747.16458: Calling groups_plugins_play to load vars for managed-node2 44071 1727204747.17510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204747.18890: done with get_vars() 44071 1727204747.18915: done getting variables 44071 1727204747.18967: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:05:47 -0400 (0:00:00.041) 0:02:39.506 ***** 44071 1727204747.18998: entering _queue_task() for managed-node2/fail 44071 1727204747.19306: worker is 1 (out of 1 available) 44071 1727204747.19322: exiting _queue_task() for managed-node2/fail 44071 1727204747.19337: done queuing things up, now waiting for results queue to drain 44071 1727204747.19339: waiting for pending results... 
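
Editor's note: the minimum-version abort at tasks/main.yml:18 skips for the same reason, the guard short-circuits on network_state != {} before any version comparison is reached. If one wanted to express that kind of version floor explicitly it could look like the sketch below; the "< 8" comparison and the message are assumptions taken from the task title, not from the log.

- name: Abort applying the network state configuration if the system version of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying the network state configuration requires a managed host of major version 8 or later
  when:
    - network_state != {}                              # logged: False on this run
    - ansible_distribution_major_version | int < 8     # assumed guard implied by the task name
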
44071 1727204747.19572: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 44071 1727204747.19689: in run() - task 127b8e07-fff9-c964-7471-000000002698 44071 1727204747.19703: variable 'ansible_search_path' from source: unknown 44071 1727204747.19706: variable 'ansible_search_path' from source: unknown 44071 1727204747.19740: calling self._execute() 44071 1727204747.19833: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204747.19845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204747.19854: variable 'omit' from source: magic vars 44071 1727204747.20189: variable 'ansible_distribution_major_version' from source: facts 44071 1727204747.20200: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204747.20342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204747.22137: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204747.22189: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204747.22221: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204747.22250: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204747.22272: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204747.22343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204747.22377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204747.22396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204747.22428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204747.22445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204747.22528: variable 'ansible_distribution_major_version' from source: facts 44071 1727204747.22548: Evaluated conditional (ansible_distribution_major_version | int > 9): True 44071 1727204747.22646: variable 'ansible_distribution' from source: facts 44071 1727204747.22655: variable '__network_rh_distros' from source: role '' defaults 44071 1727204747.22665: Evaluated conditional (ansible_distribution in __network_rh_distros): False 44071 1727204747.22668: when evaluation is False, skipping this task 44071 1727204747.22672: _execute() done 44071 1727204747.22677: dumping result to json 44071 1727204747.22680: done dumping result, returning 44071 1727204747.22689: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-c964-7471-000000002698] 44071 1727204747.22693: sending task result for task 127b8e07-fff9-c964-7471-000000002698 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 44071 1727204747.22852: no more pending results, returning what we have 44071 1727204747.22856: results queue empty 44071 1727204747.22857: checking for any_errors_fatal 44071 1727204747.22867: done checking for any_errors_fatal 44071 1727204747.22868: checking for max_fail_percentage 44071 1727204747.22870: done checking for max_fail_percentage 44071 1727204747.22871: checking to see if all hosts have failed and the running result is not ok 44071 1727204747.22871: done checking to see if all hosts have failed 44071 1727204747.22872: getting the remaining hosts for this loop 44071 1727204747.22874: done getting the remaining hosts for this loop 44071 1727204747.22879: getting the next task for host managed-node2 44071 1727204747.22890: done getting next task for host managed-node2 44071 1727204747.22895: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204747.22900: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204747.22936: getting variables 44071 1727204747.22938: in VariableManager get_vars() 44071 1727204747.22992: Calling all_inventory to load vars for managed-node2 44071 1727204747.22996: Calling groups_inventory to load vars for managed-node2 44071 1727204747.22998: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204747.23004: done sending task result for task 127b8e07-fff9-c964-7471-000000002698 44071 1727204747.23007: WORKER PROCESS EXITING 44071 1727204747.23017: Calling all_plugins_play to load vars for managed-node2 44071 1727204747.23020: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204747.23022: Calling groups_plugins_play to load vars for managed-node2 44071 1727204747.24107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204747.25362: done with get_vars() 44071 1727204747.25399: done getting variables 44071 1727204747.25453: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:05:47 -0400 (0:00:00.064) 0:02:39.571 ***** 44071 1727204747.25484: entering _queue_task() for managed-node2/dnf 44071 1727204747.25802: worker is 1 (out of 1 available) 44071 1727204747.25819: exiting _queue_task() for managed-node2/dnf 44071 1727204747.25838: done queuing things up, now waiting for results queue to drain 44071 1727204747.25840: waiting for pending results... 
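
Editor's note: the teaming abort at tasks/main.yml:25 is evaluated in two steps in the log: ansible_distribution_major_version | int > 9 is True on this Fedora 40 host, but ansible_distribution in __network_rh_distros (a role-default list, per the log) is False, so the task skips. A sketch with those same two conditions, in the order the log evaluates them; the failure message is made up for illustration.

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later
  when:
    - ansible_distribution_major_version | int > 9    # logged: True (major version 40)
    - ansible_distribution in __network_rh_distros    # logged: False on Fedora, so the task is skipped
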
44071 1727204747.26055: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 44071 1727204747.26169: in run() - task 127b8e07-fff9-c964-7471-000000002699 44071 1727204747.26183: variable 'ansible_search_path' from source: unknown 44071 1727204747.26188: variable 'ansible_search_path' from source: unknown 44071 1727204747.26222: calling self._execute() 44071 1727204747.26314: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204747.26320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204747.26331: variable 'omit' from source: magic vars 44071 1727204747.26660: variable 'ansible_distribution_major_version' from source: facts 44071 1727204747.26672: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204747.26836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204747.28968: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204747.29023: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204747.29053: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204747.29082: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204747.29105: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204747.29174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204747.29195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204747.29217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204747.29251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204747.29262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204747.29357: variable 'ansible_distribution' from source: facts 44071 1727204747.29361: variable 'ansible_distribution_major_version' from source: facts 44071 1727204747.29370: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 44071 1727204747.29464: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204747.29562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204747.29583: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204747.29601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204747.29630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204747.29643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204747.29681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204747.29699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204747.29717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204747.29746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204747.29757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204747.29794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204747.29810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204747.29828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204747.29858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204747.29872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204747.29992: variable 'network_connections' from source: include params 44071 1727204747.30006: variable 'interface' from source: play vars 44071 1727204747.30058: variable 'interface' from source: play vars 44071 1727204747.30170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204747.30281: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204747.30312: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204747.30341: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204747.30365: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204747.30404: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204747.30423: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204747.30450: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204747.30471: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204747.30513: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204747.30698: variable 'network_connections' from source: include params 44071 1727204747.30704: variable 'interface' from source: play vars 44071 1727204747.30754: variable 'interface' from source: play vars 44071 1727204747.30778: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204747.30782: when evaluation is False, skipping this task 44071 1727204747.30784: _execute() done 44071 1727204747.30787: dumping result to json 44071 1727204747.30791: done dumping result, returning 44071 1727204747.30799: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-000000002699] 44071 1727204747.30804: sending task result for task 127b8e07-fff9-c964-7471-000000002699 44071 1727204747.30915: done sending task result for task 127b8e07-fff9-c964-7471-000000002699 44071 1727204747.30918: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204747.30976: no more pending results, returning what we have 44071 1727204747.30980: results queue empty 44071 1727204747.30981: checking for any_errors_fatal 44071 1727204747.30988: done checking for any_errors_fatal 44071 1727204747.30990: checking for max_fail_percentage 44071 1727204747.30992: done checking for max_fail_percentage 44071 1727204747.30993: checking to see if all hosts have failed and the running result is not ok 44071 1727204747.30994: done checking to see if all hosts have failed 44071 1727204747.30994: getting the remaining hosts for this loop 44071 1727204747.30996: done getting the remaining hosts for this loop 44071 1727204747.31001: getting the next task for host managed-node2 44071 1727204747.31010: done getting next task for host managed-node2 44071 1727204747.31016: ^ task is: TASK: fedora.linux_system_roles.network : Check if 
updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204747.31021: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204747.31053: getting variables 44071 1727204747.31055: in VariableManager get_vars() 44071 1727204747.31113: Calling all_inventory to load vars for managed-node2 44071 1727204747.31115: Calling groups_inventory to load vars for managed-node2 44071 1727204747.31117: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204747.31128: Calling all_plugins_play to load vars for managed-node2 44071 1727204747.31131: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204747.31134: Calling groups_plugins_play to load vars for managed-node2 44071 1727204747.32404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204747.33621: done with get_vars() 44071 1727204747.33654: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 44071 1727204747.33720: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:05:47 -0400 (0:00:00.082) 0:02:39.653 ***** 44071 1727204747.33747: entering _queue_task() for managed-node2/yum 44071 1727204747.34053: worker is 1 (out of 1 available) 44071 1727204747.34070: exiting _queue_task() for managed-node2/yum 44071 1727204747.34085: done queuing things up, now waiting for results queue to drain 44071 1727204747.34087: waiting for pending results... 
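
Note: the trace above skips the DNF-based update check (no wireless or team connections are defined) and then queues the YUM-based variant from tasks/main.yml:48, redirecting ansible.builtin.yum to ansible.builtin.dnf. The role's exact task bodies are not part of this log; the following is a minimal sketch of what that pair of checks plausibly looks like, reconstructed only from the task names, the action modules loaded, and the false_condition strings reported here. The package list, state, and check_mode usage are assumptions.

    # Sketch only -- not the role's verbatim source.
    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"   # role default referenced later in this trace; its use here is assumed
        state: latest
      check_mode: true                   # assumption: availability check only, no changes
      when:
        - __network_wireless_connections_defined or __network_team_connections_defined
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:               # redirected to ansible.builtin.dnf at runtime, as logged above
        name: "{{ network_packages }}"
        state: latest
      check_mode: true
      when:
        - __network_wireless_connections_defined or __network_team_connections_defined   # assumed to mirror the DNF variant
        - ansible_distribution_major_version | int < 8
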
44071 1727204747.34305: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 44071 1727204747.34424: in run() - task 127b8e07-fff9-c964-7471-00000000269a 44071 1727204747.34441: variable 'ansible_search_path' from source: unknown 44071 1727204747.34445: variable 'ansible_search_path' from source: unknown 44071 1727204747.34479: calling self._execute() 44071 1727204747.34578: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204747.34582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204747.34593: variable 'omit' from source: magic vars 44071 1727204747.34929: variable 'ansible_distribution_major_version' from source: facts 44071 1727204747.34942: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204747.35090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204747.36934: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204747.36995: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204747.37025: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204747.37056: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204747.37081: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204747.37149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204747.37185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204747.37205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204747.37237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204747.37249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204747.37332: variable 'ansible_distribution_major_version' from source: facts 44071 1727204747.37350: Evaluated conditional (ansible_distribution_major_version | int < 8): False 44071 1727204747.37353: when evaluation is False, skipping this task 44071 1727204747.37356: _execute() done 44071 1727204747.37358: dumping result to json 44071 1727204747.37361: done dumping result, returning 44071 1727204747.37371: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-c964-7471-00000000269a] 44071 
1727204747.37374: sending task result for task 127b8e07-fff9-c964-7471-00000000269a 44071 1727204747.37485: done sending task result for task 127b8e07-fff9-c964-7471-00000000269a 44071 1727204747.37488: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 44071 1727204747.37550: no more pending results, returning what we have 44071 1727204747.37554: results queue empty 44071 1727204747.37555: checking for any_errors_fatal 44071 1727204747.37563: done checking for any_errors_fatal 44071 1727204747.37563: checking for max_fail_percentage 44071 1727204747.37567: done checking for max_fail_percentage 44071 1727204747.37568: checking to see if all hosts have failed and the running result is not ok 44071 1727204747.37568: done checking to see if all hosts have failed 44071 1727204747.37569: getting the remaining hosts for this loop 44071 1727204747.37571: done getting the remaining hosts for this loop 44071 1727204747.37576: getting the next task for host managed-node2 44071 1727204747.37586: done getting next task for host managed-node2 44071 1727204747.37590: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204747.37596: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204747.37631: getting variables 44071 1727204747.37633: in VariableManager get_vars() 44071 1727204747.37689: Calling all_inventory to load vars for managed-node2 44071 1727204747.37692: Calling groups_inventory to load vars for managed-node2 44071 1727204747.37694: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204747.37705: Calling all_plugins_play to load vars for managed-node2 44071 1727204747.37708: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204747.37711: Calling groups_plugins_play to load vars for managed-node2 44071 1727204747.38797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204747.40168: done with get_vars() 44071 1727204747.40193: done getting variables 44071 1727204747.40244: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:05:47 -0400 (0:00:00.065) 0:02:39.719 ***** 44071 1727204747.40276: entering _queue_task() for managed-node2/fail 44071 1727204747.40902: worker is 1 (out of 1 available) 44071 1727204747.40914: exiting _queue_task() for managed-node2/fail 44071 1727204747.40927: done queuing things up, now waiting for results queue to drain 44071 1727204747.40928: waiting for pending results... 
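
Note: the consent gate queued above (tasks/main.yml:60) is implemented with the fail action module, as the ActionModule load in this trace shows, and is guarded by the wireless/team condition reported as the false_condition just below. A minimal hedged sketch follows; the real task very likely also honours a confirmation variable whose name is not visible in this log, so it is omitted here.

    # Sketch only -- message wording is an assumption.
    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: NetworkManager must be restarted to apply wireless or team connections.
      when: __network_wireless_connections_defined or __network_team_connections_defined
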
44071 1727204747.41063: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 44071 1727204747.41266: in run() - task 127b8e07-fff9-c964-7471-00000000269b 44071 1727204747.41271: variable 'ansible_search_path' from source: unknown 44071 1727204747.41274: variable 'ansible_search_path' from source: unknown 44071 1727204747.41297: calling self._execute() 44071 1727204747.41426: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204747.41440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204747.41455: variable 'omit' from source: magic vars 44071 1727204747.41896: variable 'ansible_distribution_major_version' from source: facts 44071 1727204747.41907: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204747.42010: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204747.42167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204747.44024: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204747.44273: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204747.44277: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204747.44279: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204747.44281: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204747.44305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204747.44350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204747.44383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204747.44434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204747.44454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204747.44526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204747.44557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204747.44588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204747.44631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204747.44649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204747.44695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204747.44724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204747.44754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204747.44803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204747.44824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204747.45015: variable 'network_connections' from source: include params 44071 1727204747.45034: variable 'interface' from source: play vars 44071 1727204747.45116: variable 'interface' from source: play vars 44071 1727204747.45202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204747.45391: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204747.45435: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204747.45496: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204747.45532: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204747.45609: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204747.45654: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204747.45729: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204747.45918: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204747.45921: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204747.46279: variable 'network_connections' 
from source: include params 44071 1727204747.46293: variable 'interface' from source: play vars 44071 1727204747.46380: variable 'interface' from source: play vars 44071 1727204747.46434: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204747.46443: when evaluation is False, skipping this task 44071 1727204747.46450: _execute() done 44071 1727204747.46460: dumping result to json 44071 1727204747.46470: done dumping result, returning 44071 1727204747.46482: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-00000000269b] 44071 1727204747.46491: sending task result for task 127b8e07-fff9-c964-7471-00000000269b skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204747.46683: no more pending results, returning what we have 44071 1727204747.46686: results queue empty 44071 1727204747.46687: checking for any_errors_fatal 44071 1727204747.46698: done checking for any_errors_fatal 44071 1727204747.46698: checking for max_fail_percentage 44071 1727204747.46700: done checking for max_fail_percentage 44071 1727204747.46701: checking to see if all hosts have failed and the running result is not ok 44071 1727204747.46702: done checking to see if all hosts have failed 44071 1727204747.46703: getting the remaining hosts for this loop 44071 1727204747.46704: done getting the remaining hosts for this loop 44071 1727204747.46709: getting the next task for host managed-node2 44071 1727204747.46719: done getting next task for host managed-node2 44071 1727204747.46725: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 44071 1727204747.46732: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204747.46768: getting variables 44071 1727204747.46770: in VariableManager get_vars() 44071 1727204747.46826: Calling all_inventory to load vars for managed-node2 44071 1727204747.46831: Calling groups_inventory to load vars for managed-node2 44071 1727204747.46833: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204747.46844: Calling all_plugins_play to load vars for managed-node2 44071 1727204747.46847: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204747.46850: Calling groups_plugins_play to load vars for managed-node2 44071 1727204747.47436: done sending task result for task 127b8e07-fff9-c964-7471-00000000269b 44071 1727204747.47441: WORKER PROCESS EXITING 44071 1727204747.48760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204747.50949: done with get_vars() 44071 1727204747.50998: done getting variables 44071 1727204747.51072: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:05:47 -0400 (0:00:00.108) 0:02:39.827 ***** 44071 1727204747.51116: entering _queue_task() for managed-node2/package 44071 1727204747.51544: worker is 1 (out of 1 available) 44071 1727204747.51558: exiting _queue_task() for managed-node2/package 44071 1727204747.51575: done queuing things up, now waiting for results queue to drain 44071 1727204747.51577: waiting for pending results... 
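
Note: the "Install packages" task queued above (tasks/main.yml:73) loads the generic package action module, and the trace further down reports its skip condition verbatim: not network_packages is subset(ansible_facts.packages.keys()). A minimal sketch consistent with that, assuming state: present and no retry handling:

    # Sketch only -- grounded in the 'package' action load and the logged false_condition.
    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())
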
44071 1727204747.51913: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 44071 1727204747.52089: in run() - task 127b8e07-fff9-c964-7471-00000000269c 44071 1727204747.52113: variable 'ansible_search_path' from source: unknown 44071 1727204747.52121: variable 'ansible_search_path' from source: unknown 44071 1727204747.52176: calling self._execute() 44071 1727204747.52297: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204747.52309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204747.52320: variable 'omit' from source: magic vars 44071 1727204747.52762: variable 'ansible_distribution_major_version' from source: facts 44071 1727204747.52783: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204747.53016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204747.53343: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204747.53400: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204747.53440: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204747.53531: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204747.53671: variable 'network_packages' from source: role '' defaults 44071 1727204747.53795: variable '__network_provider_setup' from source: role '' defaults 44071 1727204747.53812: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204747.53887: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204747.53970: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204747.53973: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204747.54195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204747.57008: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204747.57089: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204747.57132: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204747.57173: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204747.57201: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204747.57296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204747.57326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204747.57359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204747.57416: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204747.57472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204747.57498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204747.57530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204747.57563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204747.57618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204747.57639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204747.57970: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204747.58052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204747.58084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204747.58117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204747.58172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204747.58193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204747.58308: variable 'ansible_python' from source: facts 44071 1727204747.58335: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204747.58438: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204747.58536: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204747.58771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204747.58774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 44071 1727204747.58779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204747.58805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204747.58826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204747.58886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204747.58929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204747.58960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204747.59013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204747.59270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204747.59274: variable 'network_connections' from source: include params 44071 1727204747.59277: variable 'interface' from source: play vars 44071 1727204747.59330: variable 'interface' from source: play vars 44071 1727204747.59423: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204747.59458: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204747.59501: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204747.59539: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204747.59597: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204747.59948: variable 'network_connections' from source: include params 44071 1727204747.59960: variable 'interface' from source: play vars 44071 1727204747.60085: variable 'interface' from source: play vars 44071 1727204747.60124: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204747.60223: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204747.60616: variable 'network_connections' from source: include params 44071 
1727204747.60627: variable 'interface' from source: play vars 44071 1727204747.60704: variable 'interface' from source: play vars 44071 1727204747.60734: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204747.60831: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204747.61193: variable 'network_connections' from source: include params 44071 1727204747.61204: variable 'interface' from source: play vars 44071 1727204747.61283: variable 'interface' from source: play vars 44071 1727204747.61347: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204747.61423: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204747.61437: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204747.61509: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204747.61768: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204747.62305: variable 'network_connections' from source: include params 44071 1727204747.62316: variable 'interface' from source: play vars 44071 1727204747.62393: variable 'interface' from source: play vars 44071 1727204747.62442: variable 'ansible_distribution' from source: facts 44071 1727204747.62446: variable '__network_rh_distros' from source: role '' defaults 44071 1727204747.62448: variable 'ansible_distribution_major_version' from source: facts 44071 1727204747.62450: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204747.62639: variable 'ansible_distribution' from source: facts 44071 1727204747.62648: variable '__network_rh_distros' from source: role '' defaults 44071 1727204747.62662: variable 'ansible_distribution_major_version' from source: facts 44071 1727204747.62675: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204747.62872: variable 'ansible_distribution' from source: facts 44071 1727204747.62875: variable '__network_rh_distros' from source: role '' defaults 44071 1727204747.62881: variable 'ansible_distribution_major_version' from source: facts 44071 1727204747.62983: variable 'network_provider' from source: set_fact 44071 1727204747.62987: variable 'ansible_facts' from source: unknown 44071 1727204747.63860: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 44071 1727204747.63872: when evaluation is False, skipping this task 44071 1727204747.63879: _execute() done 44071 1727204747.63886: dumping result to json 44071 1727204747.63894: done dumping result, returning 44071 1727204747.63908: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-c964-7471-00000000269c] 44071 1727204747.63917: sending task result for task 127b8e07-fff9-c964-7471-00000000269c skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 44071 1727204747.64116: no more pending results, returning what we have 44071 1727204747.64120: results queue empty 44071 1727204747.64121: checking for any_errors_fatal 44071 1727204747.64130: done checking for any_errors_fatal 44071 1727204747.64131: checking for max_fail_percentage 44071 1727204747.64132: done checking for max_fail_percentage 44071 
1727204747.64133: checking to see if all hosts have failed and the running result is not ok 44071 1727204747.64134: done checking to see if all hosts have failed 44071 1727204747.64135: getting the remaining hosts for this loop 44071 1727204747.64137: done getting the remaining hosts for this loop 44071 1727204747.64142: getting the next task for host managed-node2 44071 1727204747.64153: done getting next task for host managed-node2 44071 1727204747.64158: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204747.64163: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204747.64200: getting variables 44071 1727204747.64202: in VariableManager get_vars() 44071 1727204747.64252: Calling all_inventory to load vars for managed-node2 44071 1727204747.64255: Calling groups_inventory to load vars for managed-node2 44071 1727204747.64263: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204747.64581: Calling all_plugins_play to load vars for managed-node2 44071 1727204747.64586: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204747.64589: Calling groups_plugins_play to load vars for managed-node2 44071 1727204747.65285: done sending task result for task 127b8e07-fff9-c964-7471-00000000269c 44071 1727204747.65290: WORKER PROCESS EXITING 44071 1727204747.66603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204747.68759: done with get_vars() 44071 1727204747.68808: done getting variables 44071 1727204747.68883: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:05:47 -0400 (0:00:00.178) 0:02:40.005 ***** 44071 1727204747.68925: entering _queue_task() for managed-node2/package 44071 1727204747.69347: worker is 1 (out of 1 available) 44071 1727204747.69361: exiting _queue_task() for managed-node2/package 44071 
1727204747.69480: done queuing things up, now waiting for results queue to drain 44071 1727204747.69483: waiting for pending results... 44071 1727204747.69726: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 44071 1727204747.69925: in run() - task 127b8e07-fff9-c964-7471-00000000269d 44071 1727204747.69950: variable 'ansible_search_path' from source: unknown 44071 1727204747.69962: variable 'ansible_search_path' from source: unknown 44071 1727204747.70013: calling self._execute() 44071 1727204747.70131: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204747.70145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204747.70161: variable 'omit' from source: magic vars 44071 1727204747.70596: variable 'ansible_distribution_major_version' from source: facts 44071 1727204747.70621: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204747.70758: variable 'network_state' from source: role '' defaults 44071 1727204747.70778: Evaluated conditional (network_state != {}): False 44071 1727204747.70787: when evaluation is False, skipping this task 44071 1727204747.70795: _execute() done 44071 1727204747.70803: dumping result to json 44071 1727204747.70812: done dumping result, returning 44071 1727204747.70830: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-c964-7471-00000000269d] 44071 1727204747.70842: sending task result for task 127b8e07-fff9-c964-7471-00000000269d 44071 1727204747.71077: done sending task result for task 127b8e07-fff9-c964-7471-00000000269d 44071 1727204747.71080: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204747.71137: no more pending results, returning what we have 44071 1727204747.71142: results queue empty 44071 1727204747.71143: checking for any_errors_fatal 44071 1727204747.71151: done checking for any_errors_fatal 44071 1727204747.71152: checking for max_fail_percentage 44071 1727204747.71154: done checking for max_fail_percentage 44071 1727204747.71156: checking to see if all hosts have failed and the running result is not ok 44071 1727204747.71156: done checking to see if all hosts have failed 44071 1727204747.71157: getting the remaining hosts for this loop 44071 1727204747.71159: done getting the remaining hosts for this loop 44071 1727204747.71164: getting the next task for host managed-node2 44071 1727204747.71180: done getting next task for host managed-node2 44071 1727204747.71184: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204747.71191: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204747.71233: getting variables 44071 1727204747.71235: in VariableManager get_vars() 44071 1727204747.71400: Calling all_inventory to load vars for managed-node2 44071 1727204747.71403: Calling groups_inventory to load vars for managed-node2 44071 1727204747.71406: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204747.71421: Calling all_plugins_play to load vars for managed-node2 44071 1727204747.71425: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204747.71428: Calling groups_plugins_play to load vars for managed-node2 44071 1727204747.73405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204747.75779: done with get_vars() 44071 1727204747.75824: done getting variables 44071 1727204747.75896: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:05:47 -0400 (0:00:00.070) 0:02:40.075 ***** 44071 1727204747.75936: entering _queue_task() for managed-node2/package 44071 1727204747.76349: worker is 1 (out of 1 available) 44071 1727204747.76363: exiting _queue_task() for managed-node2/package 44071 1727204747.76578: done queuing things up, now waiting for results queue to drain 44071 1727204747.76581: waiting for pending results... 
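
Note: both network_state-gated installs (tasks/main.yml:85 and :96) are skipped in this run because network_state is empty. A minimal sketch of the pair, inferred from the task names, the package action loads, and the shared false_condition "network_state != {}"; the exact package spellings and state are assumptions.

    # Sketch only -- package names taken from the task titles.
    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager
          - nmstate
        state: present
      when: network_state != {}

    - name: Install python3-libnmstate when using network_state variable
      ansible.builtin.package:
        name: python3-libnmstate
        state: present
      when: network_state != {}
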
44071 1727204747.76713: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 44071 1727204747.76876: in run() - task 127b8e07-fff9-c964-7471-00000000269e 44071 1727204747.76900: variable 'ansible_search_path' from source: unknown 44071 1727204747.76913: variable 'ansible_search_path' from source: unknown 44071 1727204747.77022: calling self._execute() 44071 1727204747.77089: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204747.77102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204747.77117: variable 'omit' from source: magic vars 44071 1727204747.77566: variable 'ansible_distribution_major_version' from source: facts 44071 1727204747.77591: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204747.77736: variable 'network_state' from source: role '' defaults 44071 1727204747.77754: Evaluated conditional (network_state != {}): False 44071 1727204747.77763: when evaluation is False, skipping this task 44071 1727204747.77775: _execute() done 44071 1727204747.77973: dumping result to json 44071 1727204747.77976: done dumping result, returning 44071 1727204747.77980: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-c964-7471-00000000269e] 44071 1727204747.77983: sending task result for task 127b8e07-fff9-c964-7471-00000000269e 44071 1727204747.78078: done sending task result for task 127b8e07-fff9-c964-7471-00000000269e 44071 1727204747.78082: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204747.78139: no more pending results, returning what we have 44071 1727204747.78144: results queue empty 44071 1727204747.78145: checking for any_errors_fatal 44071 1727204747.78154: done checking for any_errors_fatal 44071 1727204747.78156: checking for max_fail_percentage 44071 1727204747.78157: done checking for max_fail_percentage 44071 1727204747.78159: checking to see if all hosts have failed and the running result is not ok 44071 1727204747.78159: done checking to see if all hosts have failed 44071 1727204747.78160: getting the remaining hosts for this loop 44071 1727204747.78162: done getting the remaining hosts for this loop 44071 1727204747.78169: getting the next task for host managed-node2 44071 1727204747.78181: done getting next task for host managed-node2 44071 1727204747.78186: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204747.78193: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204747.78236: getting variables 44071 1727204747.78239: in VariableManager get_vars() 44071 1727204747.78497: Calling all_inventory to load vars for managed-node2 44071 1727204747.78500: Calling groups_inventory to load vars for managed-node2 44071 1727204747.78503: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204747.78515: Calling all_plugins_play to load vars for managed-node2 44071 1727204747.78518: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204747.78521: Calling groups_plugins_play to load vars for managed-node2 44071 1727204747.80331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204747.82529: done with get_vars() 44071 1727204747.82579: done getting variables 44071 1727204747.82645: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:05:47 -0400 (0:00:00.067) 0:02:40.143 ***** 44071 1727204747.82689: entering _queue_task() for managed-node2/service 44071 1727204747.83097: worker is 1 (out of 1 available) 44071 1727204747.83113: exiting _queue_task() for managed-node2/service 44071 1727204747.83128: done queuing things up, now waiting for results queue to drain 44071 1727204747.83130: waiting for pending results... 
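
Note: the restart task queued above (tasks/main.yml:109) uses the service action module and, as the false_condition reported below confirms, is gated on the same wireless/team check. A minimal sketch; the service name follows the task title and state: restarted is the obvious reading of "Restart NetworkManager", but the exact parameters are assumptions.

    # Sketch only -- not the role's verbatim source.
    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined
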
44071 1727204747.83359: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 44071 1727204747.83507: in run() - task 127b8e07-fff9-c964-7471-00000000269f 44071 1727204747.83513: variable 'ansible_search_path' from source: unknown 44071 1727204747.83516: variable 'ansible_search_path' from source: unknown 44071 1727204747.83535: calling self._execute() 44071 1727204747.83627: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204747.83634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204747.83641: variable 'omit' from source: magic vars 44071 1727204747.83961: variable 'ansible_distribution_major_version' from source: facts 44071 1727204747.83974: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204747.84070: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204747.84220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204747.86769: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204747.86826: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204747.86856: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204747.86884: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204747.86907: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204747.86977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204747.86999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204747.87022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204747.87053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204747.87064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204747.87103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204747.87123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204747.87145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 44071 1727204747.87174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204747.87185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204747.87217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204747.87238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204747.87258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204747.87286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204747.87298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204747.87427: variable 'network_connections' from source: include params 44071 1727204747.87440: variable 'interface' from source: play vars 44071 1727204747.87498: variable 'interface' from source: play vars 44071 1727204747.87556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204747.95660: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204747.95699: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204747.95722: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204747.95746: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204747.95796: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204747.95812: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204747.95831: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204747.95857: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204747.95897: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204747.96089: variable 'network_connections' from source: include params 44071 1727204747.96095: variable 'interface' 
from source: play vars 44071 1727204747.96149: variable 'interface' from source: play vars 44071 1727204747.96174: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 44071 1727204747.96179: when evaluation is False, skipping this task 44071 1727204747.96182: _execute() done 44071 1727204747.96184: dumping result to json 44071 1727204747.96186: done dumping result, returning 44071 1727204747.96190: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-c964-7471-00000000269f] 44071 1727204747.96192: sending task result for task 127b8e07-fff9-c964-7471-00000000269f 44071 1727204747.96292: done sending task result for task 127b8e07-fff9-c964-7471-00000000269f 44071 1727204747.96304: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 44071 1727204747.96358: no more pending results, returning what we have 44071 1727204747.96361: results queue empty 44071 1727204747.96362: checking for any_errors_fatal 44071 1727204747.96376: done checking for any_errors_fatal 44071 1727204747.96377: checking for max_fail_percentage 44071 1727204747.96379: done checking for max_fail_percentage 44071 1727204747.96380: checking to see if all hosts have failed and the running result is not ok 44071 1727204747.96380: done checking to see if all hosts have failed 44071 1727204747.96381: getting the remaining hosts for this loop 44071 1727204747.96383: done getting the remaining hosts for this loop 44071 1727204747.96387: getting the next task for host managed-node2 44071 1727204747.96395: done getting next task for host managed-node2 44071 1727204747.96399: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204747.96404: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204747.96438: getting variables 44071 1727204747.96440: in VariableManager get_vars() 44071 1727204747.96488: Calling all_inventory to load vars for managed-node2 44071 1727204747.96491: Calling groups_inventory to load vars for managed-node2 44071 1727204747.96493: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204747.96503: Calling all_plugins_play to load vars for managed-node2 44071 1727204747.96506: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204747.96509: Calling groups_plugins_play to load vars for managed-node2 44071 1727204748.04699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204748.05896: done with get_vars() 44071 1727204748.05932: done getting variables 44071 1727204748.05977: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:05:48 -0400 (0:00:00.233) 0:02:40.376 ***** 44071 1727204748.06001: entering _queue_task() for managed-node2/service 44071 1727204748.06326: worker is 1 (out of 1 available) 44071 1727204748.06340: exiting _queue_task() for managed-node2/service 44071 1727204748.06356: done queuing things up, now waiting for results queue to drain 44071 1727204748.06360: waiting for pending results... 44071 1727204748.06581: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 44071 1727204748.06719: in run() - task 127b8e07-fff9-c964-7471-0000000026a0 44071 1727204748.06733: variable 'ansible_search_path' from source: unknown 44071 1727204748.06737: variable 'ansible_search_path' from source: unknown 44071 1727204748.06775: calling self._execute() 44071 1727204748.06867: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204748.06872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204748.06882: variable 'omit' from source: magic vars 44071 1727204748.07225: variable 'ansible_distribution_major_version' from source: facts 44071 1727204748.07240: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204748.07374: variable 'network_provider' from source: set_fact 44071 1727204748.07379: variable 'network_state' from source: role '' defaults 44071 1727204748.07391: Evaluated conditional (network_provider == "nm" or network_state != {}): True 44071 1727204748.07396: variable 'omit' from source: magic vars 44071 1727204748.07447: variable 'omit' from source: magic vars 44071 1727204748.07474: variable 'network_service_name' from source: role '' defaults 44071 1727204748.07524: variable 'network_service_name' from source: role '' defaults 44071 1727204748.07605: variable '__network_provider_setup' from source: role '' defaults 44071 1727204748.07609: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204748.07658: variable '__network_service_name_default_nm' from source: role '' defaults 44071 1727204748.07667: variable '__network_packages_default_nm' from source: role '' defaults 
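This second task, 'Enable and start NetworkManager' (tasks/main.yml:122), passes its conditional (network_provider == "nm" or network_state != {}) and, as the module_args recorded near the end of this trace show, invokes the systemd module with name=NetworkManager, state=started and enabled=true. A minimal standalone equivalent, given purely as an illustration of those recorded parameters rather than as the role's own YAML, would be:

  # illustrative only; parameter values copied from the module_args later in this trace
  - name: Enable and start NetworkManager
    ansible.builtin.systemd:
      name: NetworkManager
      state: started
      enabled: true
    when: network_provider == "nm" or network_state != {}

The 'service' action plugin loaded at the start of this task is what selects the systemd backend on this host, which is why the module payload transferred below is AnsiballZ_systemd.py.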
44071 1727204748.07718: variable '__network_packages_default_nm' from source: role '' defaults 44071 1727204748.07889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204748.09640: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204748.09700: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204748.09734: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204748.09764: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204748.09787: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204748.09852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204748.09880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204748.09899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204748.09927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204748.09939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204748.09981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204748.09999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204748.10017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204748.10046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204748.10057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204748.10240: variable '__network_packages_default_gobject_packages' from source: role '' defaults 44071 1727204748.10335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204748.10352: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204748.10372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204748.10399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204748.10415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204748.10486: variable 'ansible_python' from source: facts 44071 1727204748.10501: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 44071 1727204748.10570: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204748.10628: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204748.10728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204748.10752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204748.10776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204748.10804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204748.10816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204748.10856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204748.10895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204748.10913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204748.10943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204748.10953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204748.11060: variable 'network_connections' from 
source: include params 44071 1727204748.11071: variable 'interface' from source: play vars 44071 1727204748.11128: variable 'interface' from source: play vars 44071 1727204748.11217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204748.11373: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204748.11414: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204748.11451: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204748.11485: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204748.11541: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204748.11563: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204748.11589: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204748.11615: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 44071 1727204748.11662: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204748.11877: variable 'network_connections' from source: include params 44071 1727204748.11883: variable 'interface' from source: play vars 44071 1727204748.11946: variable 'interface' from source: play vars 44071 1727204748.11975: variable '__network_packages_default_wireless' from source: role '' defaults 44071 1727204748.12036: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204748.12248: variable 'network_connections' from source: include params 44071 1727204748.12251: variable 'interface' from source: play vars 44071 1727204748.12308: variable 'interface' from source: play vars 44071 1727204748.12326: variable '__network_packages_default_team' from source: role '' defaults 44071 1727204748.12388: variable '__network_team_connections_defined' from source: role '' defaults 44071 1727204748.12596: variable 'network_connections' from source: include params 44071 1727204748.12600: variable 'interface' from source: play vars 44071 1727204748.12657: variable 'interface' from source: play vars 44071 1727204748.12697: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204748.12745: variable '__network_service_name_default_initscripts' from source: role '' defaults 44071 1727204748.12751: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204748.12798: variable '__network_packages_default_initscripts' from source: role '' defaults 44071 1727204748.12952: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 44071 1727204748.13315: variable 'network_connections' from source: include params 44071 1727204748.13319: variable 'interface' from source: play vars 44071 1727204748.13365: variable 'interface' from 
source: play vars 44071 1727204748.13374: variable 'ansible_distribution' from source: facts 44071 1727204748.13377: variable '__network_rh_distros' from source: role '' defaults 44071 1727204748.13384: variable 'ansible_distribution_major_version' from source: facts 44071 1727204748.13400: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 44071 1727204748.13524: variable 'ansible_distribution' from source: facts 44071 1727204748.13527: variable '__network_rh_distros' from source: role '' defaults 44071 1727204748.13532: variable 'ansible_distribution_major_version' from source: facts 44071 1727204748.13538: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 44071 1727204748.13661: variable 'ansible_distribution' from source: facts 44071 1727204748.13664: variable '__network_rh_distros' from source: role '' defaults 44071 1727204748.13670: variable 'ansible_distribution_major_version' from source: facts 44071 1727204748.13697: variable 'network_provider' from source: set_fact 44071 1727204748.13719: variable 'omit' from source: magic vars 44071 1727204748.13744: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204748.13768: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204748.13784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204748.13799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204748.13809: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204748.13837: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204748.13841: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204748.13843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204748.13916: Set connection var ansible_connection to ssh 44071 1727204748.13921: Set connection var ansible_timeout to 10 44071 1727204748.13939: Set connection var ansible_pipelining to False 44071 1727204748.13942: Set connection var ansible_shell_type to sh 44071 1727204748.13945: Set connection var ansible_shell_executable to /bin/sh 44071 1727204748.13947: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204748.13967: variable 'ansible_shell_executable' from source: unknown 44071 1727204748.13970: variable 'ansible_connection' from source: unknown 44071 1727204748.13972: variable 'ansible_module_compression' from source: unknown 44071 1727204748.13975: variable 'ansible_shell_type' from source: unknown 44071 1727204748.13977: variable 'ansible_shell_executable' from source: unknown 44071 1727204748.13981: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204748.13985: variable 'ansible_pipelining' from source: unknown 44071 1727204748.13988: variable 'ansible_timeout' from source: unknown 44071 1727204748.13992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204748.14078: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204748.14089: variable 'omit' from source: magic vars 44071 1727204748.14095: starting attempt loop 44071 1727204748.14098: running the handler 44071 1727204748.14159: variable 'ansible_facts' from source: unknown 44071 1727204748.15425: _low_level_execute_command(): starting 44071 1727204748.15467: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204748.16336: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204748.16365: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204748.16447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204748.16513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204748.16601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204748.18360: stdout chunk (state=3): >>>/root <<< 44071 1727204748.18473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204748.18535: stderr chunk (state=3): >>><<< 44071 1727204748.18539: stdout chunk (state=3): >>><<< 44071 1727204748.18564: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204748.18578: _low_level_execute_command(): starting 44071 
1727204748.18585: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204748.1856377-53224-57585346771983 `" && echo ansible-tmp-1727204748.1856377-53224-57585346771983="` echo /root/.ansible/tmp/ansible-tmp-1727204748.1856377-53224-57585346771983 `" ) && sleep 0' 44071 1727204748.19075: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204748.19080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204748.19099: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204748.19150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204748.19154: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204748.19156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204748.19235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204748.21218: stdout chunk (state=3): >>>ansible-tmp-1727204748.1856377-53224-57585346771983=/root/.ansible/tmp/ansible-tmp-1727204748.1856377-53224-57585346771983 <<< 44071 1727204748.21330: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204748.21398: stderr chunk (state=3): >>><<< 44071 1727204748.21401: stdout chunk (state=3): >>><<< 44071 1727204748.21416: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204748.1856377-53224-57585346771983=/root/.ansible/tmp/ansible-tmp-1727204748.1856377-53224-57585346771983 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204748.21447: variable 'ansible_module_compression' from source: unknown 44071 1727204748.21497: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 44071 1727204748.21558: variable 'ansible_facts' from source: unknown 44071 1727204748.21701: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204748.1856377-53224-57585346771983/AnsiballZ_systemd.py 44071 1727204748.21826: Sending initial data 44071 1727204748.21830: Sent initial data (155 bytes) 44071 1727204748.22353: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204748.22359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204748.22418: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204748.22421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204748.22429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204748.22511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204748.24118: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204748.24188: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204748.24257: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpi5w6oxt5 /root/.ansible/tmp/ansible-tmp-1727204748.1856377-53224-57585346771983/AnsiballZ_systemd.py <<< 44071 1727204748.24266: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204748.1856377-53224-57585346771983/AnsiballZ_systemd.py" <<< 44071 1727204748.24332: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpi5w6oxt5" to remote "/root/.ansible/tmp/ansible-tmp-1727204748.1856377-53224-57585346771983/AnsiballZ_systemd.py" <<< 44071 1727204748.24334: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204748.1856377-53224-57585346771983/AnsiballZ_systemd.py" <<< 44071 1727204748.25588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204748.25668: stderr chunk (state=3): >>><<< 44071 1727204748.25672: stdout chunk (state=3): >>><<< 44071 1727204748.25691: done transferring module to remote 44071 1727204748.25706: _low_level_execute_command(): starting 44071 1727204748.25711: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204748.1856377-53224-57585346771983/ /root/.ansible/tmp/ansible-tmp-1727204748.1856377-53224-57585346771983/AnsiballZ_systemd.py && sleep 0' 44071 1727204748.26208: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204748.26212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204748.26215: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204748.26217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204748.26219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204748.26274: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204748.26282: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204748.26357: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204748.28193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204748.28254: stderr chunk (state=3): >>><<< 44071 1727204748.28258: stdout chunk (state=3): >>><<< 44071 1727204748.28273: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204748.28276: _low_level_execute_command(): starting 44071 1727204748.28282: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204748.1856377-53224-57585346771983/AnsiballZ_systemd.py && sleep 0' 44071 1727204748.28761: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204748.28768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204748.28794: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204748.28797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204748.28847: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204748.28859: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204748.28941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204748.60632: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", 
"FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4530176", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3530723328", "CPUUsageNSec": "1754832000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitC<<< 44071 1727204748.60646: stdout chunk (state=3): >>>ORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", 
"LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": 
"NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext":<<< 44071 1727204748.60657: stdout chunk (state=3): >>> "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 44071 1727204748.62628: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204748.62692: stderr chunk (state=3): >>><<< 44071 1727204748.62696: stdout chunk (state=3): >>><<< 44071 1727204748.62714: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4530176", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3530723328", "CPUUsageNSec": "1754832000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": 
"infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service shutdown.target cloud-init.service multi-user.target network.target", "After": "basic.target dbus-broker.service sysinit.target systemd-journald.socket dbus.socket system.slice cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:15 EDT", "StateChangeTimestampMonotonic": "822258182", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204748.62862: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204748.1856377-53224-57585346771983/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204748.62880: _low_level_execute_command(): starting 44071 1727204748.62886: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204748.1856377-53224-57585346771983/ > /dev/null 2>&1 && sleep 0' 44071 1727204748.63372: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204748.63377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204748.63401: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204748.63405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204748.63472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204748.63475: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204748.63477: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 44071 1727204748.63557: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204748.65505: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204748.65569: stderr chunk (state=3): >>><<< 44071 1727204748.65573: stdout chunk (state=3): >>><<< 44071 1727204748.65591: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204748.65598: handler run complete 44071 1727204748.65646: attempt loop complete, returning result 44071 1727204748.65649: _execute() done 44071 1727204748.65652: dumping result to json 44071 1727204748.65669: done dumping result, returning 44071 1727204748.65679: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-c964-7471-0000000026a0] 44071 1727204748.65682: sending task result for task 127b8e07-fff9-c964-7471-0000000026a0 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 1727204748.66037: no more pending results, returning what we have 44071 1727204748.66040: results queue empty 44071 1727204748.66041: checking for any_errors_fatal 44071 1727204748.66049: done checking for any_errors_fatal 44071 1727204748.66050: checking for max_fail_percentage 44071 1727204748.66051: done checking for max_fail_percentage 44071 1727204748.66052: checking to see if all hosts have failed and the running result is not ok 44071 1727204748.66053: done checking to see if all hosts have failed 44071 1727204748.66054: getting the remaining hosts for this loop 44071 1727204748.66055: done getting the remaining hosts for this loop 44071 1727204748.66059: getting the next task for host managed-node2 44071 1727204748.66069: done getting next task for host managed-node2 44071 1727204748.66073: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204748.66078: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204748.66091: getting variables 44071 1727204748.66093: in VariableManager get_vars() 44071 1727204748.66135: Calling all_inventory to load vars for managed-node2 44071 1727204748.66138: Calling groups_inventory to load vars for managed-node2 44071 1727204748.66140: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204748.66150: Calling all_plugins_play to load vars for managed-node2 44071 1727204748.66152: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204748.66154: Calling groups_plugins_play to load vars for managed-node2 44071 1727204748.66685: done sending task result for task 127b8e07-fff9-c964-7471-0000000026a0 44071 1727204748.66689: WORKER PROCESS EXITING 44071 1727204748.67357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204748.68608: done with get_vars() 44071 1727204748.68641: done getting variables 44071 1727204748.68697: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:05:48 -0400 (0:00:00.627) 0:02:41.003 ***** 44071 1727204748.68738: entering _queue_task() for managed-node2/service 44071 1727204748.69057: worker is 1 (out of 1 available) 44071 1727204748.69076: exiting _queue_task() for managed-node2/service 44071 1727204748.69092: done queuing things up, now waiting for results queue to drain 44071 1727204748.69094: waiting for pending results... 
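The entries above complete the "Enable and start NetworkManager" step: ansible.legacy.systemd ran with name=NetworkManager, state=started, enabled=true and came back rc=0 with changed=false, which matches the dumped unit properties (LoadState=loaded, ActiveState=active, UnitFileState=enabled), and the on-screen result is censored because no_log: true applies to this task. As a hedged reconstruction (the role's actual task file is not reproduced in this log), a standalone task that mirrors the module_args shown in the result would be:

    # Sketch only: mirrors the module_args logged above, not the role's source.
    - name: Enable and start NetworkManager
      ansible.legacy.systemd:
        name: NetworkManager
        state: started
        enabled: true
      no_log: true

Because the unit was already running and enabled, the module had nothing to change, and the play simply cleans up its temp directory and queues the next task.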
44071 1727204748.69316: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 44071 1727204748.69456: in run() - task 127b8e07-fff9-c964-7471-0000000026a1 44071 1727204748.69472: variable 'ansible_search_path' from source: unknown 44071 1727204748.69475: variable 'ansible_search_path' from source: unknown 44071 1727204748.69513: calling self._execute() 44071 1727204748.69618: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204748.69622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204748.69635: variable 'omit' from source: magic vars 44071 1727204748.69974: variable 'ansible_distribution_major_version' from source: facts 44071 1727204748.69985: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204748.70083: variable 'network_provider' from source: set_fact 44071 1727204748.70088: Evaluated conditional (network_provider == "nm"): True 44071 1727204748.70163: variable '__network_wpa_supplicant_required' from source: role '' defaults 44071 1727204748.70314: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 44071 1727204748.70374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204748.72083: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204748.72138: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204748.72168: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204748.72205: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204748.72228: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204748.72325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204748.72349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204748.72370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204748.72403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204748.72415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204748.72456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204748.72477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 44071 1727204748.72498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204748.72526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204748.72540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204748.72576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204748.72594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204748.72616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204748.72645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204748.72655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204748.72778: variable 'network_connections' from source: include params 44071 1727204748.72790: variable 'interface' from source: play vars 44071 1727204748.72855: variable 'interface' from source: play vars 44071 1727204748.72915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 44071 1727204748.73052: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 44071 1727204748.73085: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 44071 1727204748.73109: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 44071 1727204748.73132: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 44071 1727204748.73173: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 44071 1727204748.73190: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 44071 1727204748.73208: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204748.73229: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
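At this point the executor is working through the "Enable and start wpa_supplicant" task from roles/network/tasks/main.yml:133. The surrounding entries evaluate its conditional chain: ansible_distribution_major_version != '6' and network_provider == "nm" both come back True, while __network_wpa_supplicant_required (which the role defaults appear to derive from __network_ieee802_1x_connections_defined and __network_wireless_connections_defined over the network_connections list, judging by the variables resolved here) evaluates to False just below, so the task is skipped. The task definition itself is not shown in this log; in the sketch that follows, only the when: chain is taken from the logged evaluations, and the module and its arguments are assumptions:

    # Illustrative sketch, not the role's actual task (main.yml:133 is not reproduced in this log).
    # The when: chain mirrors the conditionals evaluated in the surrounding entries.
    - name: Enable and start wpa_supplicant
      ansible.legacy.systemd:        # assumed module; only the 'service' action plugin load is visible here
        name: wpa_supplicant
        state: started
        enabled: true
      when:
        - ansible_distribution_major_version != '6'
        - network_provider == "nm"
        - __network_wpa_supplicant_required

Since the single network_connections entry built from the play's interface variable defines neither wireless nor IEEE 802.1X settings, wpa_supplicant is not required and the skip is expected.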
44071 1727204748.73276: variable '__network_wireless_connections_defined' from source: role '' defaults 44071 1727204748.73462: variable 'network_connections' from source: include params 44071 1727204748.73466: variable 'interface' from source: play vars 44071 1727204748.73520: variable 'interface' from source: play vars 44071 1727204748.73546: Evaluated conditional (__network_wpa_supplicant_required): False 44071 1727204748.73550: when evaluation is False, skipping this task 44071 1727204748.73552: _execute() done 44071 1727204748.73555: dumping result to json 44071 1727204748.73557: done dumping result, returning 44071 1727204748.73568: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-c964-7471-0000000026a1] 44071 1727204748.73582: sending task result for task 127b8e07-fff9-c964-7471-0000000026a1 44071 1727204748.73680: done sending task result for task 127b8e07-fff9-c964-7471-0000000026a1 44071 1727204748.73683: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 44071 1727204748.73740: no more pending results, returning what we have 44071 1727204748.73744: results queue empty 44071 1727204748.73745: checking for any_errors_fatal 44071 1727204748.73776: done checking for any_errors_fatal 44071 1727204748.73777: checking for max_fail_percentage 44071 1727204748.73778: done checking for max_fail_percentage 44071 1727204748.73779: checking to see if all hosts have failed and the running result is not ok 44071 1727204748.73780: done checking to see if all hosts have failed 44071 1727204748.73781: getting the remaining hosts for this loop 44071 1727204748.73782: done getting the remaining hosts for this loop 44071 1727204748.73787: getting the next task for host managed-node2 44071 1727204748.73797: done getting next task for host managed-node2 44071 1727204748.73801: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204748.73807: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204748.73836: getting variables 44071 1727204748.73838: in VariableManager get_vars() 44071 1727204748.73895: Calling all_inventory to load vars for managed-node2 44071 1727204748.73898: Calling groups_inventory to load vars for managed-node2 44071 1727204748.73900: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204748.73911: Calling all_plugins_play to load vars for managed-node2 44071 1727204748.73914: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204748.73917: Calling groups_plugins_play to load vars for managed-node2 44071 1727204748.74996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204748.76254: done with get_vars() 44071 1727204748.76292: done getting variables 44071 1727204748.76343: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:05:48 -0400 (0:00:00.076) 0:02:41.080 ***** 44071 1727204748.76377: entering _queue_task() for managed-node2/service 44071 1727204748.76689: worker is 1 (out of 1 available) 44071 1727204748.76705: exiting _queue_task() for managed-node2/service 44071 1727204748.76721: done queuing things up, now waiting for results queue to drain 44071 1727204748.76723: waiting for pending results... 44071 1727204748.76939: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 44071 1727204748.77062: in run() - task 127b8e07-fff9-c964-7471-0000000026a2 44071 1727204748.77087: variable 'ansible_search_path' from source: unknown 44071 1727204748.77090: variable 'ansible_search_path' from source: unknown 44071 1727204748.77124: calling self._execute() 44071 1727204748.77223: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204748.77231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204748.77239: variable 'omit' from source: magic vars 44071 1727204748.77577: variable 'ansible_distribution_major_version' from source: facts 44071 1727204748.77588: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204748.77687: variable 'network_provider' from source: set_fact 44071 1727204748.77692: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204748.77695: when evaluation is False, skipping this task 44071 1727204748.77698: _execute() done 44071 1727204748.77701: dumping result to json 44071 1727204748.77706: done dumping result, returning 44071 1727204748.77715: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-c964-7471-0000000026a2] 44071 1727204748.77717: sending task result for task 127b8e07-fff9-c964-7471-0000000026a2 44071 1727204748.77823: done sending task result for task 127b8e07-fff9-c964-7471-0000000026a2 44071 1727204748.77831: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 44071 
1727204748.77886: no more pending results, returning what we have 44071 1727204748.77890: results queue empty 44071 1727204748.77891: checking for any_errors_fatal 44071 1727204748.77904: done checking for any_errors_fatal 44071 1727204748.77905: checking for max_fail_percentage 44071 1727204748.77907: done checking for max_fail_percentage 44071 1727204748.77908: checking to see if all hosts have failed and the running result is not ok 44071 1727204748.77908: done checking to see if all hosts have failed 44071 1727204748.77909: getting the remaining hosts for this loop 44071 1727204748.77911: done getting the remaining hosts for this loop 44071 1727204748.77916: getting the next task for host managed-node2 44071 1727204748.77925: done getting next task for host managed-node2 44071 1727204748.77931: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204748.77937: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204748.77977: getting variables 44071 1727204748.77979: in VariableManager get_vars() 44071 1727204748.78030: Calling all_inventory to load vars for managed-node2 44071 1727204748.78033: Calling groups_inventory to load vars for managed-node2 44071 1727204748.78036: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204748.78046: Calling all_plugins_play to load vars for managed-node2 44071 1727204748.78049: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204748.78051: Calling groups_plugins_play to load vars for managed-node2 44071 1727204748.79275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204748.80507: done with get_vars() 44071 1727204748.80540: done getting variables 44071 1727204748.80596: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:05:48 -0400 (0:00:00.042) 0:02:41.122 ***** 44071 1727204748.80627: entering _queue_task() for managed-node2/copy 44071 1727204748.80946: worker is 1 (out of 1 available) 44071 1727204748.80962: exiting _queue_task() for managed-node2/copy 44071 1727204748.80978: done queuing things up, now waiting for results queue to drain 44071 1727204748.80980: waiting for pending results... 44071 1727204748.81193: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 44071 1727204748.81326: in run() - task 127b8e07-fff9-c964-7471-0000000026a3 44071 1727204748.81340: variable 'ansible_search_path' from source: unknown 44071 1727204748.81343: variable 'ansible_search_path' from source: unknown 44071 1727204748.81378: calling self._execute() 44071 1727204748.81470: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204748.81478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204748.81486: variable 'omit' from source: magic vars 44071 1727204748.81827: variable 'ansible_distribution_major_version' from source: facts 44071 1727204748.81839: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204748.81978: variable 'network_provider' from source: set_fact 44071 1727204748.81982: Evaluated conditional (network_provider == "initscripts"): False 44071 1727204748.81985: when evaluation is False, skipping this task 44071 1727204748.81988: _execute() done 44071 1727204748.81991: dumping result to json 44071 1727204748.81994: done dumping result, returning 44071 1727204748.81997: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-c964-7471-0000000026a3] 44071 1727204748.82000: sending task result for task 127b8e07-fff9-c964-7471-0000000026a3 44071 1727204748.82088: done sending task result for task 127b8e07-fff9-c964-7471-0000000026a3 44071 1727204748.82091: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 44071 1727204748.82149: no more pending results, returning what we have 44071 1727204748.82153: results queue empty 44071 1727204748.82154: checking for any_errors_fatal 44071 1727204748.82162: done checking for any_errors_fatal 44071 1727204748.82163: checking for max_fail_percentage 44071 1727204748.82165: done checking for max_fail_percentage 44071 1727204748.82167: checking to see if all hosts have failed and the running result is not ok 44071 1727204748.82168: done checking to see if all hosts have failed 44071 1727204748.82169: getting the remaining hosts for this loop 44071 1727204748.82170: done getting the remaining hosts for this loop 44071 1727204748.82175: getting the next task for host managed-node2 44071 1727204748.82185: done getting next task for host managed-node2 44071 1727204748.82189: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204748.82195: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204748.82234: getting variables 44071 1727204748.82236: in VariableManager get_vars() 44071 1727204748.82282: Calling all_inventory to load vars for managed-node2 44071 1727204748.82285: Calling groups_inventory to load vars for managed-node2 44071 1727204748.82287: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204748.82297: Calling all_plugins_play to load vars for managed-node2 44071 1727204748.82299: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204748.82302: Calling groups_plugins_play to load vars for managed-node2 44071 1727204748.83491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204748.84726: done with get_vars() 44071 1727204748.84758: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:05:48 -0400 (0:00:00.042) 0:02:41.164 ***** 44071 1727204748.84838: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204748.85149: worker is 1 (out of 1 available) 44071 1727204748.85168: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 44071 1727204748.85182: done queuing things up, now waiting for results queue to drain 44071 1727204748.85184: waiting for pending results... 44071 1727204748.85402: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 44071 1727204748.85526: in run() - task 127b8e07-fff9-c964-7471-0000000026a4 44071 1727204748.85536: variable 'ansible_search_path' from source: unknown 44071 1727204748.85539: variable 'ansible_search_path' from source: unknown 44071 1727204748.85576: calling self._execute() 44071 1727204748.85676: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204748.85682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204748.85691: variable 'omit' from source: magic vars 44071 1727204748.86027: variable 'ansible_distribution_major_version' from source: facts 44071 1727204748.86039: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204748.86044: variable 'omit' from source: magic vars 44071 1727204748.86106: variable 'omit' from source: magic vars 44071 1727204748.86237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204748.87971: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204748.88026: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204748.88056: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204748.88085: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204748.88109: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204748.88181: variable 'network_provider' from source: set_fact 44071 1727204748.88293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 44071 1727204748.88314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204748.88340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204748.88369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204748.88381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204748.88446: variable 'omit' from source: magic vars 44071 1727204748.88532: variable 'omit' from source: magic vars 44071 1727204748.88610: variable 'network_connections' from source: include params 44071 1727204748.88621: variable 'interface' from source: play vars 44071 1727204748.88671: variable 'interface' from source: play vars 44071 1727204748.88787: variable 'omit' from source: magic vars 44071 1727204748.88794: variable '__lsr_ansible_managed' from source: task vars 44071 1727204748.88840: variable '__lsr_ansible_managed' from source: task vars 44071 1727204748.88998: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 44071 1727204748.89156: Loaded config def from plugin (lookup/template) 44071 1727204748.89160: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 44071 1727204748.89186: File lookup term: get_ansible_managed.j2 44071 1727204748.89189: variable 'ansible_search_path' from source: unknown 44071 1727204748.89196: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 44071 1727204748.89209: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 44071 1727204748.89226: variable 'ansible_search_path' from source: unknown 44071 1727204748.93939: variable 'ansible_managed' from source: unknown 44071 1727204748.94064: variable 'omit' from source: magic vars 44071 1727204748.94101: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204748.94125: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204748.94145: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204748.94159: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204748.94193: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204748.94198: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204748.94201: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204748.94204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204748.94278: Set connection var ansible_connection to ssh 44071 1727204748.94284: Set connection var ansible_timeout to 10 44071 1727204748.94290: Set connection var ansible_pipelining to False 44071 1727204748.94296: Set connection var ansible_shell_type to sh 44071 1727204748.94302: Set connection var ansible_shell_executable to /bin/sh 44071 1727204748.94309: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204748.94330: variable 'ansible_shell_executable' from source: unknown 44071 1727204748.94335: variable 'ansible_connection' from source: unknown 44071 1727204748.94338: variable 'ansible_module_compression' from source: unknown 44071 1727204748.94340: variable 'ansible_shell_type' from source: unknown 44071 1727204748.94343: variable 'ansible_shell_executable' from source: unknown 44071 1727204748.94347: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204748.94351: variable 'ansible_pipelining' from source: unknown 44071 1727204748.94354: variable 'ansible_timeout' from source: unknown 44071 1727204748.94358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204748.94471: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204748.94483: variable 'omit' from source: magic vars 44071 1727204748.94486: starting attempt loop 44071 1727204748.94489: running the handler 44071 1727204748.94504: _low_level_execute_command(): starting 44071 1727204748.94510: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204748.95061: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204748.95068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204748.95072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204748.95139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204748.95142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204748.95143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204748.95214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204748.96976: stdout chunk (state=3): >>>/root <<< 44071 1727204748.97079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204748.97148: stderr chunk (state=3): >>><<< 44071 1727204748.97152: stdout chunk (state=3): >>><<< 44071 1727204748.97179: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204748.97187: _low_level_execute_command(): starting 44071 1727204748.97193: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204748.9717426-53242-617900218313 `" && echo ansible-tmp-1727204748.9717426-53242-617900218313="` echo /root/.ansible/tmp/ansible-tmp-1727204748.9717426-53242-617900218313 `" ) && sleep 0' 44071 1727204748.97696: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204748.97700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204748.97703: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204748.97705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204748.97770: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204748.97774: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204748.97778: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204748.97849: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204748.99839: stdout chunk (state=3): >>>ansible-tmp-1727204748.9717426-53242-617900218313=/root/.ansible/tmp/ansible-tmp-1727204748.9717426-53242-617900218313 <<< 44071 1727204748.99952: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204749.00021: stderr chunk (state=3): >>><<< 44071 1727204749.00024: stdout chunk (state=3): >>><<< 44071 1727204749.00044: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204748.9717426-53242-617900218313=/root/.ansible/tmp/ansible-tmp-1727204748.9717426-53242-617900218313 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204749.00092: variable 'ansible_module_compression' from source: unknown 44071 1727204749.00130: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 44071 1727204749.00160: variable 'ansible_facts' from source: unknown 44071 1727204749.00231: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204748.9717426-53242-617900218313/AnsiballZ_network_connections.py 44071 1727204749.00348: Sending initial data 44071 1727204749.00352: Sent initial data (165 bytes) 44071 1727204749.00836: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204749.00841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204749.00869: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204749.00915: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204749.00919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204749.00921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204749.00996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204749.02627: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204749.02692: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204749.02759: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp19s5ll8h /root/.ansible/tmp/ansible-tmp-1727204748.9717426-53242-617900218313/AnsiballZ_network_connections.py <<< 44071 1727204749.02768: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204748.9717426-53242-617900218313/AnsiballZ_network_connections.py" <<< 44071 1727204749.02832: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp19s5ll8h" to remote "/root/.ansible/tmp/ansible-tmp-1727204748.9717426-53242-617900218313/AnsiballZ_network_connections.py" <<< 44071 1727204749.02835: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204748.9717426-53242-617900218313/AnsiballZ_network_connections.py" <<< 44071 1727204749.03716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204749.03792: stderr chunk (state=3): >>><<< 44071 1727204749.03795: stdout chunk (state=3): >>><<< 44071 1727204749.03818: done transferring module to remote 44071 1727204749.03832: _low_level_execute_command(): starting 44071 1727204749.03835: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204748.9717426-53242-617900218313/ /root/.ansible/tmp/ansible-tmp-1727204748.9717426-53242-617900218313/AnsiballZ_network_connections.py && sleep 0' 44071 1727204749.04319: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204749.04325: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204749.04330: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204749.04333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204749.04392: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204749.04399: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204749.04402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204749.04474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204749.06304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204749.06363: stderr chunk (state=3): >>><<< 44071 1727204749.06369: stdout chunk (state=3): >>><<< 44071 1727204749.06382: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204749.06389: _low_level_execute_command(): starting 44071 1727204749.06396: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204748.9717426-53242-617900218313/AnsiballZ_network_connections.py && sleep 0' 44071 1727204749.06889: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204749.06893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 
1727204749.06895: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204749.06898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204749.06961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204749.06964: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204749.06970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204749.07043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204749.35439: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 44071 1727204749.37197: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204749.37255: stderr chunk (state=3): >>><<< 44071 1727204749.37259: stdout chunk (state=3): >>><<< 44071 1727204749.37281: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
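The stdout captured in the entry above is the single JSON line that the AnsiballZ payload for fedora.linux_system_roles.network_connections prints when it finishes; the controller parses that line to build the task result rendered a few entries later. As a minimal, illustrative sketch (not Ansible's actual result handling), the same decoding can be done with Python's json module; the string below is abridged to the fields of interest:

import json

# One-line result as printed on the remote host's stdout (abridged).
raw = ('{"changed": false, "warnings": [], '
       '"stderr": "[002] #0, state:down persistent_state:absent, '
       '\'statebr\': no connection matches \'statebr\' to delete\\n"}')

result = json.loads(raw)
assert result["changed"] is False   # the profile was already absent, so nothing changed
print(result["stderr"].strip())     # later surfaced by the role's "Show stderr messages" task

Because "changed" is false and no "failed" key is present, the callback renders this task as ok: rather than changed: or fatal:.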
44071 1727204749.37316: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204748.9717426-53242-617900218313/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204749.37324: _low_level_execute_command(): starting 44071 1727204749.37329: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204748.9717426-53242-617900218313/ > /dev/null 2>&1 && sleep 0' 44071 1727204749.37826: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204749.37833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204749.37836: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204749.37838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204749.37893: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204749.37897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204749.37903: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204749.37979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204749.39905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204749.39961: stderr chunk (state=3): >>><<< 44071 1727204749.39967: stdout chunk (state=3): >>><<< 44071 1727204749.39980: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204749.39987: handler run complete 44071 1727204749.40013: attempt loop complete, returning result 44071 1727204749.40016: _execute() done 44071 1727204749.40018: dumping result to json 44071 1727204749.40028: done dumping result, returning 44071 1727204749.40037: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-c964-7471-0000000026a4] 44071 1727204749.40042: sending task result for task 127b8e07-fff9-c964-7471-0000000026a4 44071 1727204749.40159: done sending task result for task 127b8e07-fff9-c964-7471-0000000026a4 44071 1727204749.40162: WORKER PROCESS EXITING ok: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete 44071 1727204749.40315: no more pending results, returning what we have 44071 1727204749.40319: results queue empty 44071 1727204749.40320: checking for any_errors_fatal 44071 1727204749.40325: done checking for any_errors_fatal 44071 1727204749.40326: checking for max_fail_percentage 44071 1727204749.40328: done checking for max_fail_percentage 44071 1727204749.40330: checking to see if all hosts have failed and the running result is not ok 44071 1727204749.40331: done checking to see if all hosts have failed 44071 1727204749.40332: getting the remaining hosts for this loop 44071 1727204749.40333: done getting the remaining hosts for this loop 44071 1727204749.40338: getting the next task for host managed-node2 44071 1727204749.40345: done getting next task for host managed-node2 44071 1727204749.40349: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204749.40367: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204749.40381: getting variables 44071 1727204749.40383: in VariableManager get_vars() 44071 1727204749.40436: Calling all_inventory to load vars for managed-node2 44071 1727204749.40439: Calling groups_inventory to load vars for managed-node2 44071 1727204749.40440: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204749.40450: Calling all_plugins_play to load vars for managed-node2 44071 1727204749.40453: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204749.40456: Calling groups_plugins_play to load vars for managed-node2 44071 1727204749.41496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204749.42738: done with get_vars() 44071 1727204749.42773: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:05:49 -0400 (0:00:00.580) 0:02:41.744 ***** 44071 1727204749.42849: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204749.43162: worker is 1 (out of 1 available) 44071 1727204749.43179: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 44071 1727204749.43193: done queuing things up, now waiting for results queue to drain 44071 1727204749.43195: waiting for pending results... 
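The _queue_task() / "worker is 1 (out of 1 available)" / "waiting for pending results..." messages that bracket each task describe the strategy handing the task to a worker and then blocking on a results queue. Ansible itself uses forked worker processes and multiprocessing queues; the thread-based sketch below is only a schematic of that hand-off and drain pattern as the log narrates it, not Ansible's implementation:

import queue
import threading

task_q, result_q = queue.Queue(), queue.Queue()

def worker():
    while True:
        task = task_q.get()                        # blocks until the strategy queues a task
        result_q.put({"task": task, "status": "ok"})
        task_q.task_done()

threading.Thread(target=worker, daemon=True).start()

task_q.put("Configure networking state")           # _queue_task(): hand the task to the worker
task_q.join()                                      # "waiting for pending results..."
print(result_q.get())                              # drain the results queue, then pick the next task

With a single worker available, as in this run, tasks are effectively processed one at a time.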
44071 1727204749.43403: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 44071 1727204749.43532: in run() - task 127b8e07-fff9-c964-7471-0000000026a5 44071 1727204749.43547: variable 'ansible_search_path' from source: unknown 44071 1727204749.43550: variable 'ansible_search_path' from source: unknown 44071 1727204749.43583: calling self._execute() 44071 1727204749.43673: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204749.43679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204749.43689: variable 'omit' from source: magic vars 44071 1727204749.44019: variable 'ansible_distribution_major_version' from source: facts 44071 1727204749.44032: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204749.44124: variable 'network_state' from source: role '' defaults 44071 1727204749.44136: Evaluated conditional (network_state != {}): False 44071 1727204749.44139: when evaluation is False, skipping this task 44071 1727204749.44142: _execute() done 44071 1727204749.44144: dumping result to json 44071 1727204749.44147: done dumping result, returning 44071 1727204749.44155: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-c964-7471-0000000026a5] 44071 1727204749.44160: sending task result for task 127b8e07-fff9-c964-7471-0000000026a5 44071 1727204749.44269: done sending task result for task 127b8e07-fff9-c964-7471-0000000026a5 44071 1727204749.44272: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 44071 1727204749.44333: no more pending results, returning what we have 44071 1727204749.44338: results queue empty 44071 1727204749.44340: checking for any_errors_fatal 44071 1727204749.44359: done checking for any_errors_fatal 44071 1727204749.44360: checking for max_fail_percentage 44071 1727204749.44362: done checking for max_fail_percentage 44071 1727204749.44363: checking to see if all hosts have failed and the running result is not ok 44071 1727204749.44363: done checking to see if all hosts have failed 44071 1727204749.44364: getting the remaining hosts for this loop 44071 1727204749.44367: done getting the remaining hosts for this loop 44071 1727204749.44374: getting the next task for host managed-node2 44071 1727204749.44385: done getting next task for host managed-node2 44071 1727204749.44389: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204749.44394: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204749.44423: getting variables 44071 1727204749.44425: in VariableManager get_vars() 44071 1727204749.44481: Calling all_inventory to load vars for managed-node2 44071 1727204749.44484: Calling groups_inventory to load vars for managed-node2 44071 1727204749.44486: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204749.44498: Calling all_plugins_play to load vars for managed-node2 44071 1727204749.44500: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204749.44503: Calling groups_plugins_play to load vars for managed-node2 44071 1727204749.45662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204749.46883: done with get_vars() 44071 1727204749.46909: done getting variables 44071 1727204749.46960: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:05:49 -0400 (0:00:00.041) 0:02:41.786 ***** 44071 1727204749.46994: entering _queue_task() for managed-node2/debug 44071 1727204749.47303: worker is 1 (out of 1 available) 44071 1727204749.47318: exiting _queue_task() for managed-node2/debug 44071 1727204749.47336: done queuing things up, now waiting for results queue to drain 44071 1727204749.47338: waiting for pending results... 
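The two "Evaluated conditional" entries above show how when: clauses decide whether a task runs at all: ansible_distribution_major_version != '6' is true, but with the role default network_state: {} the test network_state != {} is false, so "Configure networking state" was skipped without contacting the host. These conditionals are ordinary Jinja2 expressions evaluated against the task's variables; a minimal sketch using the jinja2 package directly (Ansible layers its own templar, filters, and tests on top, and the distribution version below is a hypothetical placeholder):

from jinja2 import Environment

env = Environment()
check_distro = env.compile_expression("ansible_distribution_major_version != '6'")
check_state = env.compile_expression("network_state != {}")

# '9' is a hypothetical value; any major version other than '6' satisfies the first test.
print(check_distro(ansible_distribution_major_version="9"))   # True  -> keep evaluating
print(check_state(network_state={}))                          # False -> task is skipped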
44071 1727204749.47548: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 44071 1727204749.47672: in run() - task 127b8e07-fff9-c964-7471-0000000026a6 44071 1727204749.47688: variable 'ansible_search_path' from source: unknown 44071 1727204749.47692: variable 'ansible_search_path' from source: unknown 44071 1727204749.47726: calling self._execute() 44071 1727204749.47817: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204749.47823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204749.47835: variable 'omit' from source: magic vars 44071 1727204749.48162: variable 'ansible_distribution_major_version' from source: facts 44071 1727204749.48175: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204749.48182: variable 'omit' from source: magic vars 44071 1727204749.48238: variable 'omit' from source: magic vars 44071 1727204749.48266: variable 'omit' from source: magic vars 44071 1727204749.48305: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204749.48341: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204749.48358: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204749.48375: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204749.48386: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204749.48412: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204749.48416: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204749.48420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204749.48504: Set connection var ansible_connection to ssh 44071 1727204749.48510: Set connection var ansible_timeout to 10 44071 1727204749.48516: Set connection var ansible_pipelining to False 44071 1727204749.48521: Set connection var ansible_shell_type to sh 44071 1727204749.48527: Set connection var ansible_shell_executable to /bin/sh 44071 1727204749.48535: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204749.48557: variable 'ansible_shell_executable' from source: unknown 44071 1727204749.48561: variable 'ansible_connection' from source: unknown 44071 1727204749.48564: variable 'ansible_module_compression' from source: unknown 44071 1727204749.48568: variable 'ansible_shell_type' from source: unknown 44071 1727204749.48572: variable 'ansible_shell_executable' from source: unknown 44071 1727204749.48574: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204749.48576: variable 'ansible_pipelining' from source: unknown 44071 1727204749.48579: variable 'ansible_timeout' from source: unknown 44071 1727204749.48585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204749.48703: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 
1727204749.48714: variable 'omit' from source: magic vars 44071 1727204749.48719: starting attempt loop 44071 1727204749.48722: running the handler 44071 1727204749.48840: variable '__network_connections_result' from source: set_fact 44071 1727204749.48889: handler run complete 44071 1727204749.48906: attempt loop complete, returning result 44071 1727204749.48909: _execute() done 44071 1727204749.48912: dumping result to json 44071 1727204749.48915: done dumping result, returning 44071 1727204749.48925: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-c964-7471-0000000026a6] 44071 1727204749.48928: sending task result for task 127b8e07-fff9-c964-7471-0000000026a6 44071 1727204749.49032: done sending task result for task 127b8e07-fff9-c964-7471-0000000026a6 44071 1727204749.49035: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } 44071 1727204749.49123: no more pending results, returning what we have 44071 1727204749.49127: results queue empty 44071 1727204749.49127: checking for any_errors_fatal 44071 1727204749.49139: done checking for any_errors_fatal 44071 1727204749.49140: checking for max_fail_percentage 44071 1727204749.49141: done checking for max_fail_percentage 44071 1727204749.49143: checking to see if all hosts have failed and the running result is not ok 44071 1727204749.49144: done checking to see if all hosts have failed 44071 1727204749.49144: getting the remaining hosts for this loop 44071 1727204749.49146: done getting the remaining hosts for this loop 44071 1727204749.49150: getting the next task for host managed-node2 44071 1727204749.49158: done getting next task for host managed-node2 44071 1727204749.49162: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204749.49175: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204749.49189: getting variables 44071 1727204749.49191: in VariableManager get_vars() 44071 1727204749.49240: Calling all_inventory to load vars for managed-node2 44071 1727204749.49243: Calling groups_inventory to load vars for managed-node2 44071 1727204749.49245: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204749.49255: Calling all_plugins_play to load vars for managed-node2 44071 1727204749.49258: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204749.49260: Calling groups_plugins_play to load vars for managed-node2 44071 1727204749.50490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204749.51754: done with get_vars() 44071 1727204749.51790: done getting variables 44071 1727204749.51843: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:05:49 -0400 (0:00:00.048) 0:02:41.835 ***** 44071 1727204749.51883: entering _queue_task() for managed-node2/debug 44071 1727204749.52202: worker is 1 (out of 1 available) 44071 1727204749.52219: exiting _queue_task() for managed-node2/debug 44071 1727204749.52236: done queuing things up, now waiting for results queue to drain 44071 1727204749.52238: waiting for pending results... 44071 1727204749.52462: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 44071 1727204749.52581: in run() - task 127b8e07-fff9-c964-7471-0000000026a7 44071 1727204749.52596: variable 'ansible_search_path' from source: unknown 44071 1727204749.52600: variable 'ansible_search_path' from source: unknown 44071 1727204749.52635: calling self._execute() 44071 1727204749.52727: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204749.52734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204749.52742: variable 'omit' from source: magic vars 44071 1727204749.53070: variable 'ansible_distribution_major_version' from source: facts 44071 1727204749.53081: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204749.53087: variable 'omit' from source: magic vars 44071 1727204749.53134: variable 'omit' from source: magic vars 44071 1727204749.53168: variable 'omit' from source: magic vars 44071 1727204749.53205: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204749.53235: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204749.53256: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204749.53273: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204749.53284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204749.53309: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204749.53313: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204749.53315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204749.53397: Set connection var ansible_connection to ssh 44071 1727204749.53403: Set connection var ansible_timeout to 10 44071 1727204749.53409: Set connection var ansible_pipelining to False 44071 1727204749.53414: Set connection var ansible_shell_type to sh 44071 1727204749.53420: Set connection var ansible_shell_executable to /bin/sh 44071 1727204749.53426: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204749.53448: variable 'ansible_shell_executable' from source: unknown 44071 1727204749.53451: variable 'ansible_connection' from source: unknown 44071 1727204749.53455: variable 'ansible_module_compression' from source: unknown 44071 1727204749.53460: variable 'ansible_shell_type' from source: unknown 44071 1727204749.53463: variable 'ansible_shell_executable' from source: unknown 44071 1727204749.53465: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204749.53467: variable 'ansible_pipelining' from source: unknown 44071 1727204749.53471: variable 'ansible_timeout' from source: unknown 44071 1727204749.53473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204749.53595: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204749.53602: variable 'omit' from source: magic vars 44071 1727204749.53608: starting attempt loop 44071 1727204749.53612: running the handler 44071 1727204749.53656: variable '__network_connections_result' from source: set_fact 44071 1727204749.53728: variable '__network_connections_result' from source: set_fact 44071 1727204749.53818: handler run complete 44071 1727204749.53839: attempt loop complete, returning result 44071 1727204749.53842: _execute() done 44071 1727204749.53845: dumping result to json 44071 1727204749.53850: done dumping result, returning 44071 1727204749.53859: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-c964-7471-0000000026a7] 44071 1727204749.53863: sending task result for task 127b8e07-fff9-c964-7471-0000000026a7 44071 1727204749.53970: done sending task result for task 127b8e07-fff9-c964-7471-0000000026a7 44071 1727204749.53973: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "stderr_lines": [ "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } } 44071 1727204749.54082: no more pending results, returning what we have 44071 1727204749.54086: results queue empty 44071 1727204749.54087: checking for any_errors_fatal 44071 
1727204749.54095: done checking for any_errors_fatal 44071 1727204749.54096: checking for max_fail_percentage 44071 1727204749.54097: done checking for max_fail_percentage 44071 1727204749.54098: checking to see if all hosts have failed and the running result is not ok 44071 1727204749.54099: done checking to see if all hosts have failed 44071 1727204749.54100: getting the remaining hosts for this loop 44071 1727204749.54101: done getting the remaining hosts for this loop 44071 1727204749.54106: getting the next task for host managed-node2 44071 1727204749.54114: done getting next task for host managed-node2 44071 1727204749.54118: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204749.54122: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204749.54138: getting variables 44071 1727204749.54139: in VariableManager get_vars() 44071 1727204749.54192: Calling all_inventory to load vars for managed-node2 44071 1727204749.54195: Calling groups_inventory to load vars for managed-node2 44071 1727204749.54197: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204749.54212: Calling all_plugins_play to load vars for managed-node2 44071 1727204749.54215: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204749.54218: Calling groups_plugins_play to load vars for managed-node2 44071 1727204749.55277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204749.56538: done with get_vars() 44071 1727204749.56578: done getting variables 44071 1727204749.56628: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:05:49 -0400 (0:00:00.047) 0:02:41.883 ***** 44071 1727204749.56660: entering _queue_task() for managed-node2/debug 44071 1727204749.56978: worker is 1 (out of 1 available) 44071 1727204749.56995: exiting _queue_task() for managed-node2/debug 44071 1727204749.57010: done queuing things up, now waiting for results queue to drain 44071 1727204749.57012: waiting for pending results... 44071 1727204749.57235: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 44071 1727204749.57353: in run() - task 127b8e07-fff9-c964-7471-0000000026a8 44071 1727204749.57375: variable 'ansible_search_path' from source: unknown 44071 1727204749.57379: variable 'ansible_search_path' from source: unknown 44071 1727204749.57407: calling self._execute() 44071 1727204749.57499: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204749.57504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204749.57513: variable 'omit' from source: magic vars 44071 1727204749.57844: variable 'ansible_distribution_major_version' from source: facts 44071 1727204749.57855: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204749.57954: variable 'network_state' from source: role '' defaults 44071 1727204749.57962: Evaluated conditional (network_state != {}): False 44071 1727204749.57967: when evaluation is False, skipping this task 44071 1727204749.57971: _execute() done 44071 1727204749.57973: dumping result to json 44071 1727204749.57976: done dumping result, returning 44071 1727204749.57985: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-c964-7471-0000000026a8] 44071 1727204749.57990: sending task result for task 127b8e07-fff9-c964-7471-0000000026a8 44071 1727204749.58100: done sending task result for task 127b8e07-fff9-c964-7471-0000000026a8 44071 1727204749.58103: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 44071 1727204749.58183: no more pending results, returning what we 
have 44071 1727204749.58187: results queue empty 44071 1727204749.58190: checking for any_errors_fatal 44071 1727204749.58199: done checking for any_errors_fatal 44071 1727204749.58200: checking for max_fail_percentage 44071 1727204749.58201: done checking for max_fail_percentage 44071 1727204749.58202: checking to see if all hosts have failed and the running result is not ok 44071 1727204749.58203: done checking to see if all hosts have failed 44071 1727204749.58204: getting the remaining hosts for this loop 44071 1727204749.58205: done getting the remaining hosts for this loop 44071 1727204749.58210: getting the next task for host managed-node2 44071 1727204749.58219: done getting next task for host managed-node2 44071 1727204749.58223: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204749.58227: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204749.58258: getting variables 44071 1727204749.58259: in VariableManager get_vars() 44071 1727204749.58311: Calling all_inventory to load vars for managed-node2 44071 1727204749.58314: Calling groups_inventory to load vars for managed-node2 44071 1727204749.58316: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204749.58326: Calling all_plugins_play to load vars for managed-node2 44071 1727204749.58331: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204749.58334: Calling groups_plugins_play to load vars for managed-node2 44071 1727204749.59516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204749.60756: done with get_vars() 44071 1727204749.60789: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:05:49 -0400 (0:00:00.042) 0:02:41.925 ***** 44071 1727204749.60876: entering _queue_task() for managed-node2/ping 44071 1727204749.61191: worker is 1 (out of 1 available) 44071 1727204749.61209: exiting _queue_task() for managed-node2/ping 44071 1727204749.61223: done queuing things up, now waiting for results queue to drain 44071 1727204749.61225: waiting for pending results... 
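The "Re-test connectivity" task runs the ping module through the same pipeline shown for network_connections above: create a remote temp directory, sftp the AnsiballZ_ping.py payload, chmod it, execute it with /usr/bin/python3.12, parse the one-line JSON it prints, then remove the temp directory. As a rough stand-in for the payload's observable contract (not the actual ansible.builtin.ping source), the wrapped module just echoes its data argument, "pong" by default, as a single JSON document and exits 0:

import json
import sys

def main():
    # The real module accepts an optional 'data' argument (default "pong") and echoes it
    # back; here the role only cares that the round trip still works after the network
    # profile change.
    data = "pong"
    print(json.dumps({"ping": data, "invocation": {"module_args": {"data": data}}}))
    sys.exit(0)

if __name__ == "__main__":
    main()

That rc=0 plus the single stdout line {"ping": "pong", ...} is exactly what _low_level_execute_command() reports a few entries below, after which the temporary directory is removed.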
44071 1727204749.61439: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 44071 1727204749.61568: in run() - task 127b8e07-fff9-c964-7471-0000000026a9 44071 1727204749.61587: variable 'ansible_search_path' from source: unknown 44071 1727204749.61591: variable 'ansible_search_path' from source: unknown 44071 1727204749.61627: calling self._execute() 44071 1727204749.61719: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204749.61725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204749.61737: variable 'omit' from source: magic vars 44071 1727204749.62063: variable 'ansible_distribution_major_version' from source: facts 44071 1727204749.62086: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204749.62092: variable 'omit' from source: magic vars 44071 1727204749.62151: variable 'omit' from source: magic vars 44071 1727204749.62181: variable 'omit' from source: magic vars 44071 1727204749.62220: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204749.62252: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204749.62270: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204749.62286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204749.62297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204749.62323: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204749.62327: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204749.62329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204749.62412: Set connection var ansible_connection to ssh 44071 1727204749.62418: Set connection var ansible_timeout to 10 44071 1727204749.62424: Set connection var ansible_pipelining to False 44071 1727204749.62429: Set connection var ansible_shell_type to sh 44071 1727204749.62437: Set connection var ansible_shell_executable to /bin/sh 44071 1727204749.62450: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204749.62467: variable 'ansible_shell_executable' from source: unknown 44071 1727204749.62470: variable 'ansible_connection' from source: unknown 44071 1727204749.62474: variable 'ansible_module_compression' from source: unknown 44071 1727204749.62476: variable 'ansible_shell_type' from source: unknown 44071 1727204749.62478: variable 'ansible_shell_executable' from source: unknown 44071 1727204749.62481: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204749.62485: variable 'ansible_pipelining' from source: unknown 44071 1727204749.62488: variable 'ansible_timeout' from source: unknown 44071 1727204749.62492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204749.62670: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204749.62680: variable 'omit' from source: magic vars 44071 
1727204749.62683: starting attempt loop 44071 1727204749.62685: running the handler 44071 1727204749.62699: _low_level_execute_command(): starting 44071 1727204749.62706: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204749.63277: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204749.63283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204749.63287: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204749.63347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204749.63351: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204749.63418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204749.65106: stdout chunk (state=3): >>>/root <<< 44071 1727204749.65211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204749.65279: stderr chunk (state=3): >>><<< 44071 1727204749.65283: stdout chunk (state=3): >>><<< 44071 1727204749.65311: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204749.65324: _low_level_execute_command(): starting 44071 1727204749.65333: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204749.6531043-53261-264218518593383 `" && echo 
ansible-tmp-1727204749.6531043-53261-264218518593383="` echo /root/.ansible/tmp/ansible-tmp-1727204749.6531043-53261-264218518593383 `" ) && sleep 0' 44071 1727204749.65853: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204749.65857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44071 1727204749.65861: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204749.65880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204749.65918: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204749.65921: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204749.65924: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204749.66001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204749.68000: stdout chunk (state=3): >>>ansible-tmp-1727204749.6531043-53261-264218518593383=/root/.ansible/tmp/ansible-tmp-1727204749.6531043-53261-264218518593383 <<< 44071 1727204749.68121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204749.68173: stderr chunk (state=3): >>><<< 44071 1727204749.68176: stdout chunk (state=3): >>><<< 44071 1727204749.68194: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204749.6531043-53261-264218518593383=/root/.ansible/tmp/ansible-tmp-1727204749.6531043-53261-264218518593383 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204749.68247: variable 'ansible_module_compression' from source: 
unknown 44071 1727204749.68285: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 44071 1727204749.68321: variable 'ansible_facts' from source: unknown 44071 1727204749.68377: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204749.6531043-53261-264218518593383/AnsiballZ_ping.py 44071 1727204749.68494: Sending initial data 44071 1727204749.68497: Sent initial data (153 bytes) 44071 1727204749.69015: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204749.69019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204749.69023: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204749.69025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204749.69073: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204749.69076: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204749.69090: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204749.69155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204749.70764: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204749.70830: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204749.70900: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp_3tw9bfb /root/.ansible/tmp/ansible-tmp-1727204749.6531043-53261-264218518593383/AnsiballZ_ping.py <<< 44071 1727204749.70903: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204749.6531043-53261-264218518593383/AnsiballZ_ping.py" <<< 44071 1727204749.70969: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp_3tw9bfb" to remote "/root/.ansible/tmp/ansible-tmp-1727204749.6531043-53261-264218518593383/AnsiballZ_ping.py" <<< 44071 1727204749.70973: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204749.6531043-53261-264218518593383/AnsiballZ_ping.py" <<< 44071 1727204749.71618: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204749.71696: stderr chunk (state=3): >>><<< 44071 1727204749.71700: stdout chunk (state=3): >>><<< 44071 1727204749.71722: done transferring module to remote 44071 1727204749.71738: _low_level_execute_command(): starting 44071 1727204749.71741: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204749.6531043-53261-264218518593383/ /root/.ansible/tmp/ansible-tmp-1727204749.6531043-53261-264218518593383/AnsiballZ_ping.py && sleep 0' 44071 1727204749.72239: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204749.72243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204749.72246: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204749.72249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204749.72251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204749.72313: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204749.72320: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204749.72322: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204749.72384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204749.74216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204749.74283: stderr chunk (state=3): >>><<< 44071 1727204749.74287: stdout chunk (state=3): >>><<< 44071 1727204749.74302: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204749.74306: _low_level_execute_command(): starting 44071 1727204749.74311: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204749.6531043-53261-264218518593383/AnsiballZ_ping.py && sleep 0' 44071 1727204749.74825: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204749.74829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204749.74832: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204749.74835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204749.74892: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204749.74896: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204749.74980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204749.91086: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 44071 1727204749.92440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204749.92445: stdout chunk (state=3): >>><<< 44071 1727204749.92448: stderr chunk (state=3): >>><<< 44071 1727204749.92610: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 44071 1727204749.92616: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204749.6531043-53261-264218518593383/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204749.92619: _low_level_execute_command(): starting 44071 1727204749.92621: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204749.6531043-53261-264218518593383/ > /dev/null 2>&1 && sleep 0' 44071 1727204749.93293: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204749.93425: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 
setting O_NONBLOCK <<< 44071 1727204749.93482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204749.93550: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204749.95586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204749.95602: stdout chunk (state=3): >>><<< 44071 1727204749.95616: stderr chunk (state=3): >>><<< 44071 1727204749.95773: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204749.95782: handler run complete 44071 1727204749.95785: attempt loop complete, returning result 44071 1727204749.95787: _execute() done 44071 1727204749.95790: dumping result to json 44071 1727204749.95792: done dumping result, returning 44071 1727204749.95794: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-c964-7471-0000000026a9] 44071 1727204749.95796: sending task result for task 127b8e07-fff9-c964-7471-0000000026a9 44071 1727204749.95876: done sending task result for task 127b8e07-fff9-c964-7471-0000000026a9 44071 1727204749.95879: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 44071 1727204749.95975: no more pending results, returning what we have 44071 1727204749.95980: results queue empty 44071 1727204749.95981: checking for any_errors_fatal 44071 1727204749.96173: done checking for any_errors_fatal 44071 1727204749.96174: checking for max_fail_percentage 44071 1727204749.96176: done checking for max_fail_percentage 44071 1727204749.96178: checking to see if all hosts have failed and the running result is not ok 44071 1727204749.96179: done checking to see if all hosts have failed 44071 1727204749.96179: getting the remaining hosts for this loop 44071 1727204749.96181: done getting the remaining hosts for this loop 44071 1727204749.96186: getting the next task for host managed-node2 44071 1727204749.96199: done getting next task for host managed-node2 44071 1727204749.96202: ^ task is: TASK: meta (role_complete) 44071 1727204749.96208: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204749.96225: getting variables 44071 1727204749.96227: in VariableManager get_vars() 44071 1727204749.96289: Calling all_inventory to load vars for managed-node2 44071 1727204749.96292: Calling groups_inventory to load vars for managed-node2 44071 1727204749.96294: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204749.96307: Calling all_plugins_play to load vars for managed-node2 44071 1727204749.96310: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204749.96313: Calling groups_plugins_play to load vars for managed-node2 44071 1727204749.98589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204750.00973: done with get_vars() 44071 1727204750.01020: done getting variables 44071 1727204750.01132: done queuing things up, now waiting for results queue to drain 44071 1727204750.01134: results queue empty 44071 1727204750.01135: checking for any_errors_fatal 44071 1727204750.01139: done checking for any_errors_fatal 44071 1727204750.01140: checking for max_fail_percentage 44071 1727204750.01141: done checking for max_fail_percentage 44071 1727204750.01142: checking to see if all hosts have failed and the running result is not ok 44071 1727204750.01143: done checking to see if all hosts have failed 44071 1727204750.01144: getting the remaining hosts for this loop 44071 1727204750.01145: done getting the remaining hosts for this loop 44071 1727204750.01148: getting the next task for host managed-node2 44071 1727204750.01154: done getting next task for host managed-node2 44071 1727204750.01157: ^ task is: TASK: Asserts 44071 1727204750.01159: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204750.01163: getting variables 44071 1727204750.01164: in VariableManager get_vars() 44071 1727204750.01181: Calling all_inventory to load vars for managed-node2 44071 1727204750.01184: Calling groups_inventory to load vars for managed-node2 44071 1727204750.01187: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204750.01192: Calling all_plugins_play to load vars for managed-node2 44071 1727204750.01200: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204750.01204: Calling groups_plugins_play to load vars for managed-node2 44071 1727204750.02914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204750.04890: done with get_vars() 44071 1727204750.04920: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Tuesday 24 September 2024 15:05:50 -0400 (0:00:00.441) 0:02:42.366 ***** 44071 1727204750.04990: entering _queue_task() for managed-node2/include_tasks 44071 1727204750.05308: worker is 1 (out of 1 available) 44071 1727204750.05322: exiting _queue_task() for managed-node2/include_tasks 44071 1727204750.05341: done queuing things up, now waiting for results queue to drain 44071 1727204750.05343: waiting for pending results... 44071 1727204750.05552: running TaskExecutor() for managed-node2/TASK: Asserts 44071 1727204750.05647: in run() - task 127b8e07-fff9-c964-7471-0000000020b2 44071 1727204750.05660: variable 'ansible_search_path' from source: unknown 44071 1727204750.05667: variable 'ansible_search_path' from source: unknown 44071 1727204750.05710: variable 'lsr_assert' from source: include params 44071 1727204750.05906: variable 'lsr_assert' from source: include params 44071 1727204750.05971: variable 'omit' from source: magic vars 44071 1727204750.06095: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204750.06104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204750.06116: variable 'omit' from source: magic vars 44071 1727204750.06477: variable 'ansible_distribution_major_version' from source: facts 44071 1727204750.06480: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204750.06483: variable 'item' from source: unknown 44071 1727204750.06486: variable 'item' from source: unknown 44071 1727204750.06522: variable 'item' from source: unknown 44071 1727204750.06605: variable 'item' from source: unknown 44071 1727204750.06913: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204750.06917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204750.06919: variable 'omit' from source: magic vars 44071 1727204750.07104: variable 'ansible_distribution_major_version' from source: facts 44071 1727204750.07115: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204750.07136: variable 'item' from source: unknown 44071 1727204750.07239: variable 'item' from source: unknown 44071 1727204750.07262: variable 'item' from source: unknown 44071 1727204750.07337: variable 'item' from source: unknown 44071 1727204750.07497: dumping result to json 44071 1727204750.07501: done dumping result, returning 44071 1727204750.07504: done running TaskExecutor() for managed-node2/TASK: Asserts 
[127b8e07-fff9-c964-7471-0000000020b2] 44071 1727204750.07507: sending task result for task 127b8e07-fff9-c964-7471-0000000020b2 44071 1727204750.07562: done sending task result for task 127b8e07-fff9-c964-7471-0000000020b2 44071 1727204750.07567: WORKER PROCESS EXITING 44071 1727204750.07610: no more pending results, returning what we have 44071 1727204750.07617: in VariableManager get_vars() 44071 1727204750.07684: Calling all_inventory to load vars for managed-node2 44071 1727204750.07687: Calling groups_inventory to load vars for managed-node2 44071 1727204750.07691: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204750.07707: Calling all_plugins_play to load vars for managed-node2 44071 1727204750.07711: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204750.07713: Calling groups_plugins_play to load vars for managed-node2 44071 1727204750.08967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204750.10557: done with get_vars() 44071 1727204750.10594: variable 'ansible_search_path' from source: unknown 44071 1727204750.10596: variable 'ansible_search_path' from source: unknown 44071 1727204750.10649: variable 'ansible_search_path' from source: unknown 44071 1727204750.10650: variable 'ansible_search_path' from source: unknown 44071 1727204750.10687: we have included files to process 44071 1727204750.10688: generating all_blocks data 44071 1727204750.10690: done generating all_blocks data 44071 1727204750.10697: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 44071 1727204750.10699: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 44071 1727204750.10701: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 44071 1727204750.10827: in VariableManager get_vars() 44071 1727204750.10855: done with get_vars() 44071 1727204750.10984: done processing included file 44071 1727204750.10987: iterating over new_blocks loaded from include file 44071 1727204750.10989: in VariableManager get_vars() 44071 1727204750.11009: done with get_vars() 44071 1727204750.11011: filtering new block on tags 44071 1727204750.11054: done filtering new block on tags 44071 1727204750.11057: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node2 => (item=tasks/assert_profile_absent.yml) 44071 1727204750.11062: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml 44071 1727204750.11063: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml 44071 1727204750.11068: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml 44071 1727204750.11503: done processing included file 44071 1727204750.11506: iterating over new_blocks loaded from include file 44071 1727204750.11507: in VariableManager get_vars() 44071 1727204750.11531: done with get_vars() 44071 1727204750.11533: filtering new block on tags 44071 
1727204750.11586: done filtering new block on tags 44071 1727204750.11589: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml for managed-node2 => (item=tasks/get_NetworkManager_NVR.yml) 44071 1727204750.11594: extending task lists for all hosts with included blocks 44071 1727204750.12912: done extending task lists 44071 1727204750.12914: done processing included files 44071 1727204750.12915: results queue empty 44071 1727204750.12916: checking for any_errors_fatal 44071 1727204750.12918: done checking for any_errors_fatal 44071 1727204750.12919: checking for max_fail_percentage 44071 1727204750.12920: done checking for max_fail_percentage 44071 1727204750.12921: checking to see if all hosts have failed and the running result is not ok 44071 1727204750.12922: done checking to see if all hosts have failed 44071 1727204750.12923: getting the remaining hosts for this loop 44071 1727204750.12924: done getting the remaining hosts for this loop 44071 1727204750.12927: getting the next task for host managed-node2 44071 1727204750.12934: done getting next task for host managed-node2 44071 1727204750.12937: ^ task is: TASK: Include the task 'get_profile_stat.yml' 44071 1727204750.12940: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204750.12943: getting variables 44071 1727204750.12951: in VariableManager get_vars() 44071 1727204750.12972: Calling all_inventory to load vars for managed-node2 44071 1727204750.12975: Calling groups_inventory to load vars for managed-node2 44071 1727204750.12978: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204750.12986: Calling all_plugins_play to load vars for managed-node2 44071 1727204750.12989: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204750.12992: Calling groups_plugins_play to load vars for managed-node2 44071 1727204750.14668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204750.16951: done with get_vars() 44071 1727204750.16994: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Tuesday 24 September 2024 15:05:50 -0400 (0:00:00.120) 0:02:42.487 ***** 44071 1727204750.17091: entering _queue_task() for managed-node2/include_tasks 44071 1727204750.17527: worker is 1 (out of 1 available) 44071 1727204750.17543: exiting _queue_task() for managed-node2/include_tasks 44071 1727204750.17558: done queuing things up, now waiting for results queue to drain 44071 1727204750.17560: waiting for pending results... 44071 1727204750.17987: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 44071 1727204750.18077: in run() - task 127b8e07-fff9-c964-7471-000000002804 44071 1727204750.18169: variable 'ansible_search_path' from source: unknown 44071 1727204750.18179: variable 'ansible_search_path' from source: unknown 44071 1727204750.18182: calling self._execute() 44071 1727204750.18288: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204750.18303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204750.18326: variable 'omit' from source: magic vars 44071 1727204750.18803: variable 'ansible_distribution_major_version' from source: facts 44071 1727204750.18827: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204750.18842: _execute() done 44071 1727204750.18850: dumping result to json 44071 1727204750.18858: done dumping result, returning 44071 1727204750.18944: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [127b8e07-fff9-c964-7471-000000002804] 44071 1727204750.18948: sending task result for task 127b8e07-fff9-c964-7471-000000002804 44071 1727204750.19045: done sending task result for task 127b8e07-fff9-c964-7471-000000002804 44071 1727204750.19050: WORKER PROCESS EXITING 44071 1727204750.19085: no more pending results, returning what we have 44071 1727204750.19173: in VariableManager get_vars() 44071 1727204750.19239: Calling all_inventory to load vars for managed-node2 44071 1727204750.19243: Calling groups_inventory to load vars for managed-node2 44071 1727204750.19247: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204750.19264: Calling all_plugins_play to load vars for managed-node2 44071 1727204750.19270: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204750.19274: Calling groups_plugins_play to load vars for managed-node2 44071 1727204750.21480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 44071 1727204750.26513: done with get_vars() 44071 1727204750.26556: variable 'ansible_search_path' from source: unknown 44071 1727204750.26557: variable 'ansible_search_path' from source: unknown 44071 1727204750.26572: variable 'item' from source: include params 44071 1727204750.26913: variable 'item' from source: include params 44071 1727204750.26951: we have included files to process 44071 1727204750.26952: generating all_blocks data 44071 1727204750.26954: done generating all_blocks data 44071 1727204750.26955: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 44071 1727204750.26956: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 44071 1727204750.26959: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 44071 1727204750.28447: done processing included file 44071 1727204750.28451: iterating over new_blocks loaded from include file 44071 1727204750.28452: in VariableManager get_vars() 44071 1727204750.28480: done with get_vars() 44071 1727204750.28483: filtering new block on tags 44071 1727204750.28580: done filtering new block on tags 44071 1727204750.28584: in VariableManager get_vars() 44071 1727204750.28606: done with get_vars() 44071 1727204750.28607: filtering new block on tags 44071 1727204750.28692: done filtering new block on tags 44071 1727204750.28695: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 44071 1727204750.28702: extending task lists for all hosts with included blocks 44071 1727204750.29034: done extending task lists 44071 1727204750.29036: done processing included files 44071 1727204750.29037: results queue empty 44071 1727204750.29037: checking for any_errors_fatal 44071 1727204750.29042: done checking for any_errors_fatal 44071 1727204750.29043: checking for max_fail_percentage 44071 1727204750.29044: done checking for max_fail_percentage 44071 1727204750.29045: checking to see if all hosts have failed and the running result is not ok 44071 1727204750.29046: done checking to see if all hosts have failed 44071 1727204750.29047: getting the remaining hosts for this loop 44071 1727204750.29048: done getting the remaining hosts for this loop 44071 1727204750.29051: getting the next task for host managed-node2 44071 1727204750.29057: done getting next task for host managed-node2 44071 1727204750.29067: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 44071 1727204750.29074: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204750.29078: getting variables 44071 1727204750.29079: in VariableManager get_vars() 44071 1727204750.29094: Calling all_inventory to load vars for managed-node2 44071 1727204750.29097: Calling groups_inventory to load vars for managed-node2 44071 1727204750.29100: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204750.29107: Calling all_plugins_play to load vars for managed-node2 44071 1727204750.29110: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204750.29113: Calling groups_plugins_play to load vars for managed-node2 44071 1727204750.30883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204750.33258: done with get_vars() 44071 1727204750.33303: done getting variables 44071 1727204750.33365: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 15:05:50 -0400 (0:00:00.163) 0:02:42.650 ***** 44071 1727204750.33406: entering _queue_task() for managed-node2/set_fact 44071 1727204750.34067: worker is 1 (out of 1 available) 44071 1727204750.34080: exiting _queue_task() for managed-node2/set_fact 44071 1727204750.34092: done queuing things up, now waiting for results queue to drain 44071 1727204750.34094: waiting for pending results... 
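Every record in the trace above follows the same shape: a worker PID, a Unix timestamp with fractional seconds, a colon, and a free-form message (for example '44071 1727204750.33406: entering _queue_task() for managed-node2/set_fact'), with per-task wall-clock timing additionally printed in the task banners, such as '(0:00:00.163) 0:02:42.650'. When the stream is captured as long wrapped blocks like these, the individual records can still be recovered by splitting on that prefix. The following is a minimal, illustrative Python sketch of such a splitter; the regular expression and the LogRecord container are assumptions of this sketch, not anything Ansible itself provides.

import re
from dataclasses import dataclass

# One record of ansible-playbook debug output: "<pid> <epoch>: <message>".
# Heuristic: assumes the "<pid> <epoch>: " prefix never occurs inside a message.
RECORD_RE = re.compile(r"(\d+) (\d+\.\d+): ")

@dataclass
class LogRecord:
    pid: int
    timestamp: float
    message: str

def split_records(blob: str) -> list[LogRecord]:
    """Split a flattened debug stream into individual records."""
    records = []
    matches = list(RECORD_RE.finditer(blob))
    for i, m in enumerate(matches):
        end = matches[i + 1].start() if i + 1 < len(matches) else len(blob)
        records.append(LogRecord(pid=int(m.group(1)),
                                 timestamp=float(m.group(2)),
                                 message=blob[m.end():end].strip()))
    return records

if __name__ == "__main__":
    sample = ("44071 1727204750.33406: entering _queue_task() for managed-node2/set_fact "
              "44071 1727204750.34067: worker is 1 (out of 1 available)")
    for rec in split_records(sample):
        print(f"{rec.timestamp:.5f}  {rec.message}")

Differences between consecutive timestamps recovered this way can be compared against the per-task durations in the banners to see where the run spends its time.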
44071 1727204750.34337: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 44071 1727204750.34406: in run() - task 127b8e07-fff9-c964-7471-000000002888 44071 1727204750.34439: variable 'ansible_search_path' from source: unknown 44071 1727204750.34492: variable 'ansible_search_path' from source: unknown 44071 1727204750.34498: calling self._execute() 44071 1727204750.34622: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204750.34635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204750.34659: variable 'omit' from source: magic vars 44071 1727204750.35123: variable 'ansible_distribution_major_version' from source: facts 44071 1727204750.35147: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204750.35193: variable 'omit' from source: magic vars 44071 1727204750.35235: variable 'omit' from source: magic vars 44071 1727204750.35281: variable 'omit' from source: magic vars 44071 1727204750.35337: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204750.35389: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204750.35472: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204750.35476: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204750.35478: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204750.35502: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204750.35515: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204750.35526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204750.35651: Set connection var ansible_connection to ssh 44071 1727204750.35664: Set connection var ansible_timeout to 10 44071 1727204750.35678: Set connection var ansible_pipelining to False 44071 1727204750.35694: Set connection var ansible_shell_type to sh 44071 1727204750.35737: Set connection var ansible_shell_executable to /bin/sh 44071 1727204750.35740: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204750.35758: variable 'ansible_shell_executable' from source: unknown 44071 1727204750.35769: variable 'ansible_connection' from source: unknown 44071 1727204750.35779: variable 'ansible_module_compression' from source: unknown 44071 1727204750.35787: variable 'ansible_shell_type' from source: unknown 44071 1727204750.35846: variable 'ansible_shell_executable' from source: unknown 44071 1727204750.35849: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204750.35852: variable 'ansible_pipelining' from source: unknown 44071 1727204750.35854: variable 'ansible_timeout' from source: unknown 44071 1727204750.35856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204750.36005: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204750.36029: variable 
'omit' from source: magic vars 44071 1727204750.36040: starting attempt loop 44071 1727204750.36048: running the handler 44071 1727204750.36077: handler run complete 44071 1727204750.36127: attempt loop complete, returning result 44071 1727204750.36130: _execute() done 44071 1727204750.36133: dumping result to json 44071 1727204750.36135: done dumping result, returning 44071 1727204750.36137: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [127b8e07-fff9-c964-7471-000000002888] 44071 1727204750.36139: sending task result for task 127b8e07-fff9-c964-7471-000000002888 ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 44071 1727204750.36442: no more pending results, returning what we have 44071 1727204750.36454: results queue empty 44071 1727204750.36455: checking for any_errors_fatal 44071 1727204750.36458: done checking for any_errors_fatal 44071 1727204750.36459: checking for max_fail_percentage 44071 1727204750.36461: done checking for max_fail_percentage 44071 1727204750.36462: checking to see if all hosts have failed and the running result is not ok 44071 1727204750.36463: done checking to see if all hosts have failed 44071 1727204750.36464: getting the remaining hosts for this loop 44071 1727204750.36467: done getting the remaining hosts for this loop 44071 1727204750.36473: getting the next task for host managed-node2 44071 1727204750.36484: done getting next task for host managed-node2 44071 1727204750.36487: ^ task is: TASK: Stat profile file 44071 1727204750.36494: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204750.36499: getting variables 44071 1727204750.36501: in VariableManager get_vars() 44071 1727204750.36679: Calling all_inventory to load vars for managed-node2 44071 1727204750.36682: Calling groups_inventory to load vars for managed-node2 44071 1727204750.36687: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204750.36694: done sending task result for task 127b8e07-fff9-c964-7471-000000002888 44071 1727204750.36698: WORKER PROCESS EXITING 44071 1727204750.36711: Calling all_plugins_play to load vars for managed-node2 44071 1727204750.36715: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204750.36719: Calling groups_plugins_play to load vars for managed-node2 44071 1727204750.47069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204750.49414: done with get_vars() 44071 1727204750.49465: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 15:05:50 -0400 (0:00:00.161) 0:02:42.812 ***** 44071 1727204750.49569: entering _queue_task() for managed-node2/stat 44071 1727204750.50000: worker is 1 (out of 1 available) 44071 1727204750.50017: exiting _queue_task() for managed-node2/stat 44071 1727204750.50031: done queuing things up, now waiting for results queue to drain 44071 1727204750.50034: waiting for pending results... 44071 1727204750.50440: running TaskExecutor() for managed-node2/TASK: Stat profile file 44071 1727204750.50556: in run() - task 127b8e07-fff9-c964-7471-000000002889 44071 1727204750.50582: variable 'ansible_search_path' from source: unknown 44071 1727204750.50591: variable 'ansible_search_path' from source: unknown 44071 1727204750.50644: calling self._execute() 44071 1727204750.50770: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204750.50785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204750.50799: variable 'omit' from source: magic vars 44071 1727204750.51270: variable 'ansible_distribution_major_version' from source: facts 44071 1727204750.51406: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204750.51409: variable 'omit' from source: magic vars 44071 1727204750.51413: variable 'omit' from source: magic vars 44071 1727204750.51519: variable 'profile' from source: play vars 44071 1727204750.51530: variable 'interface' from source: play vars 44071 1727204750.51612: variable 'interface' from source: play vars 44071 1727204750.51648: variable 'omit' from source: magic vars 44071 1727204750.51703: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204750.51761: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204750.51794: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204750.51818: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204750.51845: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204750.51886: variable 'inventory_hostname' from source: host vars for 
'managed-node2' 44071 1727204750.51895: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204750.51903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204750.52028: Set connection var ansible_connection to ssh 44071 1727204750.52044: Set connection var ansible_timeout to 10 44071 1727204750.52167: Set connection var ansible_pipelining to False 44071 1727204750.52173: Set connection var ansible_shell_type to sh 44071 1727204750.52176: Set connection var ansible_shell_executable to /bin/sh 44071 1727204750.52180: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204750.52183: variable 'ansible_shell_executable' from source: unknown 44071 1727204750.52185: variable 'ansible_connection' from source: unknown 44071 1727204750.52188: variable 'ansible_module_compression' from source: unknown 44071 1727204750.52191: variable 'ansible_shell_type' from source: unknown 44071 1727204750.52194: variable 'ansible_shell_executable' from source: unknown 44071 1727204750.52196: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204750.52199: variable 'ansible_pipelining' from source: unknown 44071 1727204750.52203: variable 'ansible_timeout' from source: unknown 44071 1727204750.52206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204750.52496: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204750.52501: variable 'omit' from source: magic vars 44071 1727204750.52505: starting attempt loop 44071 1727204750.52507: running the handler 44071 1727204750.52604: _low_level_execute_command(): starting 44071 1727204750.52607: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204750.53429: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204750.53493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204750.53557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204750.53588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204750.53604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204750.53729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204750.55516: stdout chunk (state=3): >>>/root <<< 44071 1727204750.55633: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 44071 1727204750.55695: stderr chunk (state=3): >>><<< 44071 1727204750.55699: stdout chunk (state=3): >>><<< 44071 1727204750.55720: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204750.55739: _low_level_execute_command(): starting 44071 1727204750.55746: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204750.5572228-53285-171318998410166 `" && echo ansible-tmp-1727204750.5572228-53285-171318998410166="` echo /root/.ansible/tmp/ansible-tmp-1727204750.5572228-53285-171318998410166 `" ) && sleep 0' 44071 1727204750.56248: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204750.56252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204750.56263: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204750.56268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204750.56310: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204750.56317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204750.56319: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204750.56393: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204750.58376: stdout chunk (state=3): 
>>>ansible-tmp-1727204750.5572228-53285-171318998410166=/root/.ansible/tmp/ansible-tmp-1727204750.5572228-53285-171318998410166 <<< 44071 1727204750.58479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204750.58622: stderr chunk (state=3): >>><<< 44071 1727204750.58627: stdout chunk (state=3): >>><<< 44071 1727204750.58633: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204750.5572228-53285-171318998410166=/root/.ansible/tmp/ansible-tmp-1727204750.5572228-53285-171318998410166 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204750.58653: variable 'ansible_module_compression' from source: unknown 44071 1727204750.58723: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 44071 1727204750.58770: variable 'ansible_facts' from source: unknown 44071 1727204750.58889: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204750.5572228-53285-171318998410166/AnsiballZ_stat.py 44071 1727204750.59123: Sending initial data 44071 1727204750.59126: Sent initial data (153 bytes) 44071 1727204750.59832: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204750.59856: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204750.59869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204750.59905: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 
1727204750.59989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204750.61598: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204750.61662: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204750.61725: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpcjfm1s4e /root/.ansible/tmp/ansible-tmp-1727204750.5572228-53285-171318998410166/AnsiballZ_stat.py <<< 44071 1727204750.61733: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204750.5572228-53285-171318998410166/AnsiballZ_stat.py" <<< 44071 1727204750.61795: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpcjfm1s4e" to remote "/root/.ansible/tmp/ansible-tmp-1727204750.5572228-53285-171318998410166/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204750.5572228-53285-171318998410166/AnsiballZ_stat.py" <<< 44071 1727204750.62814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204750.62818: stdout chunk (state=3): >>><<< 44071 1727204750.62822: stderr chunk (state=3): >>><<< 44071 1727204750.62852: done transferring module to remote 44071 1727204750.62864: _low_level_execute_command(): starting 44071 1727204750.62869: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204750.5572228-53285-171318998410166/ /root/.ansible/tmp/ansible-tmp-1727204750.5572228-53285-171318998410166/AnsiballZ_stat.py && sleep 0' 44071 1727204750.63372: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204750.63376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204750.63378: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204750.63381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 
1727204750.63431: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204750.63435: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204750.63437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204750.63517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204750.65333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204750.65389: stderr chunk (state=3): >>><<< 44071 1727204750.65393: stdout chunk (state=3): >>><<< 44071 1727204750.65411: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204750.65414: _low_level_execute_command(): starting 44071 1727204750.65417: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204750.5572228-53285-171318998410166/AnsiballZ_stat.py && sleep 0' 44071 1727204750.65923: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204750.65927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204750.65933: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204750.65936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204750.65994: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204750.65998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204750.66001: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 44071 1727204750.66085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204750.82555: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 44071 1727204750.83860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204750.83928: stderr chunk (state=3): >>><<< 44071 1727204750.83935: stdout chunk (state=3): >>><<< 44071 1727204750.83947: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
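The JSON returned above is the stat module's result for /etc/sysconfig/network-scripts/ifcfg-statebr with get_attributes, get_checksum and get_mime all disabled and follow=false. As a rough local illustration only (not the ansible.builtin.stat source, which returns many more fields when the file exists), the check the log reports reduces to an lstat-based existence test:

    # Minimal sketch of the logged check: follow=False means the path itself is
    # lstat()ed, and with checksum/mime/attributes disabled the result collapses
    # to an "exists" flag, matching {"changed": false, "stat": {"exists": false}}.
    import os

    def stat_profile(path, follow=False):
        stat_fn = os.stat if follow else os.lstat
        try:
            stat_fn(path)
        except FileNotFoundError:
            return {"changed": False, "stat": {"exists": False}}
        return {"changed": False, "stat": {"exists": True}}

    print(stat_profile("/etc/sysconfig/network-scripts/ifcfg-statebr"))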
44071 1727204750.83977: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204750.5572228-53285-171318998410166/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204750.83989: _low_level_execute_command(): starting 44071 1727204750.83994: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204750.5572228-53285-171318998410166/ > /dev/null 2>&1 && sleep 0' 44071 1727204750.84499: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204750.84504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 44071 1727204750.84507: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204750.84509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204750.84573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204750.84583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204750.84586: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204750.84653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204750.86618: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204750.86677: stderr chunk (state=3): >>><<< 44071 1727204750.86682: stdout chunk (state=3): >>><<< 44071 1727204750.86695: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204750.86701: handler run complete 44071 1727204750.86720: attempt loop complete, returning result 44071 1727204750.86723: _execute() done 44071 1727204750.86726: dumping result to json 44071 1727204750.86729: done dumping result, returning 44071 1727204750.86743: done running TaskExecutor() for managed-node2/TASK: Stat profile file [127b8e07-fff9-c964-7471-000000002889] 44071 1727204750.86747: sending task result for task 127b8e07-fff9-c964-7471-000000002889 44071 1727204750.86853: done sending task result for task 127b8e07-fff9-c964-7471-000000002889 44071 1727204750.86856: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 44071 1727204750.86919: no more pending results, returning what we have 44071 1727204750.86922: results queue empty 44071 1727204750.86923: checking for any_errors_fatal 44071 1727204750.86932: done checking for any_errors_fatal 44071 1727204750.86933: checking for max_fail_percentage 44071 1727204750.86934: done checking for max_fail_percentage 44071 1727204750.86935: checking to see if all hosts have failed and the running result is not ok 44071 1727204750.86936: done checking to see if all hosts have failed 44071 1727204750.86936: getting the remaining hosts for this loop 44071 1727204750.86938: done getting the remaining hosts for this loop 44071 1727204750.86943: getting the next task for host managed-node2 44071 1727204750.86953: done getting next task for host managed-node2 44071 1727204750.86956: ^ task is: TASK: Set NM profile exist flag based on the profile files 44071 1727204750.86962: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204750.86977: getting variables 44071 1727204750.86978: in VariableManager get_vars() 44071 1727204750.87027: Calling all_inventory to load vars for managed-node2 44071 1727204750.87030: Calling groups_inventory to load vars for managed-node2 44071 1727204750.87033: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204750.87046: Calling all_plugins_play to load vars for managed-node2 44071 1727204750.87049: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204750.87051: Calling groups_plugins_play to load vars for managed-node2 44071 1727204750.88134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204750.89512: done with get_vars() 44071 1727204750.89540: done getting variables 44071 1727204750.89596: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:05:50 -0400 (0:00:00.400) 0:02:43.212 ***** 44071 1727204750.89624: entering _queue_task() for managed-node2/set_fact 44071 1727204750.89927: worker is 1 (out of 1 available) 44071 1727204750.89943: exiting _queue_task() for managed-node2/set_fact 44071 1727204750.89957: done queuing things up, now waiting for results queue to drain 44071 1727204750.89959: waiting for pending results... 
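The task banner just printed appears to follow the profile_tasks callback format: timestamp, then the elapsed time of the previous task in parentheses (0:00:00.400 for "Stat profile file"), then the cumulative playbook runtime (0:02:43.212). That reading is an interpretation of the output, not something stated in the log; the duration strings themselves are just timedeltas truncated to milliseconds, as this small Python sketch shows:

    # Hypothetical formatter reproducing the "H:MM:SS.mmm" durations in the banner.
    from datetime import timedelta

    def fmt(td):
        s = str(td)
        # str(timedelta) prints microseconds; drop the last three digits.
        return s[:-3] if td.microseconds else s + ".000"

    print(fmt(timedelta(seconds=0.400)))              # 0:00:00.400
    print(fmt(timedelta(minutes=2, seconds=43.212)))  # 0:02:43.212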
44071 1727204750.90173: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 44071 1727204750.90301: in run() - task 127b8e07-fff9-c964-7471-00000000288a 44071 1727204750.90315: variable 'ansible_search_path' from source: unknown 44071 1727204750.90319: variable 'ansible_search_path' from source: unknown 44071 1727204750.90354: calling self._execute() 44071 1727204750.90450: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204750.90456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204750.90467: variable 'omit' from source: magic vars 44071 1727204750.90809: variable 'ansible_distribution_major_version' from source: facts 44071 1727204750.90824: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204750.90930: variable 'profile_stat' from source: set_fact 44071 1727204750.90942: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204750.90946: when evaluation is False, skipping this task 44071 1727204750.90949: _execute() done 44071 1727204750.90952: dumping result to json 44071 1727204750.90956: done dumping result, returning 44071 1727204750.90967: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [127b8e07-fff9-c964-7471-00000000288a] 44071 1727204750.90970: sending task result for task 127b8e07-fff9-c964-7471-00000000288a 44071 1727204750.91076: done sending task result for task 127b8e07-fff9-c964-7471-00000000288a 44071 1727204750.91079: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204750.91130: no more pending results, returning what we have 44071 1727204750.91134: results queue empty 44071 1727204750.91135: checking for any_errors_fatal 44071 1727204750.91145: done checking for any_errors_fatal 44071 1727204750.91146: checking for max_fail_percentage 44071 1727204750.91148: done checking for max_fail_percentage 44071 1727204750.91149: checking to see if all hosts have failed and the running result is not ok 44071 1727204750.91150: done checking to see if all hosts have failed 44071 1727204750.91150: getting the remaining hosts for this loop 44071 1727204750.91152: done getting the remaining hosts for this loop 44071 1727204750.91157: getting the next task for host managed-node2 44071 1727204750.91175: done getting next task for host managed-node2 44071 1727204750.91178: ^ task is: TASK: Get NM profile info 44071 1727204750.91183: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204750.91188: getting variables 44071 1727204750.91189: in VariableManager get_vars() 44071 1727204750.91241: Calling all_inventory to load vars for managed-node2 44071 1727204750.91244: Calling groups_inventory to load vars for managed-node2 44071 1727204750.91247: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204750.91261: Calling all_plugins_play to load vars for managed-node2 44071 1727204750.91264: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204750.91268: Calling groups_plugins_play to load vars for managed-node2 44071 1727204750.92345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204750.93598: done with get_vars() 44071 1727204750.93632: done getting variables 44071 1727204750.93690: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:05:50 -0400 (0:00:00.040) 0:02:43.253 ***** 44071 1727204750.93720: entering _queue_task() for managed-node2/shell 44071 1727204750.94030: worker is 1 (out of 1 available) 44071 1727204750.94045: exiting _queue_task() for managed-node2/shell 44071 1727204750.94059: done queuing things up, now waiting for results queue to drain 44071 1727204750.94060: waiting for pending results... 
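The skip above ("Set NM profile exist flag based on the profile files") follows the pattern used throughout this log: the conditionals on a task are evaluated in order against facts and registered results, and the first one that evaluates False is reported as false_condition and the task is skipped. A minimal standalone illustration of that flow (the fact value and variable names are assumptions for the sketch, not Ansible internals):

    # Evaluate `when` clauses in order; the first False short-circuits and the
    # task is skipped with the same fields the log shows.
    facts = {
        "ansible_distribution_major_version": "40",          # hypothetical value
        "profile_stat": {"stat": {"exists": False}},          # registered by "Stat profile file"
    }

    conditions = [
        ("ansible_distribution_major_version != '6'",
         facts["ansible_distribution_major_version"] != "6"),
        ("profile_stat.stat.exists",
         facts["profile_stat"]["stat"]["exists"]),
    ]

    for expr, value in conditions:
        print(f"Evaluated conditional ({expr}): {value}")
        if not value:
            print({"changed": False, "false_condition": expr,
                   "skip_reason": "Conditional result was False"})
            break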
44071 1727204750.94268: running TaskExecutor() for managed-node2/TASK: Get NM profile info 44071 1727204750.94393: in run() - task 127b8e07-fff9-c964-7471-00000000288b 44071 1727204750.94410: variable 'ansible_search_path' from source: unknown 44071 1727204750.94415: variable 'ansible_search_path' from source: unknown 44071 1727204750.94449: calling self._execute() 44071 1727204750.94541: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204750.94547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204750.94558: variable 'omit' from source: magic vars 44071 1727204750.94900: variable 'ansible_distribution_major_version' from source: facts 44071 1727204750.94911: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204750.94918: variable 'omit' from source: magic vars 44071 1727204750.94963: variable 'omit' from source: magic vars 44071 1727204750.95046: variable 'profile' from source: play vars 44071 1727204750.95050: variable 'interface' from source: play vars 44071 1727204750.95105: variable 'interface' from source: play vars 44071 1727204750.95120: variable 'omit' from source: magic vars 44071 1727204750.95158: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204750.95194: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204750.95213: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204750.95227: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204750.95241: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204750.95268: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204750.95271: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204750.95274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204750.95353: Set connection var ansible_connection to ssh 44071 1727204750.95357: Set connection var ansible_timeout to 10 44071 1727204750.95363: Set connection var ansible_pipelining to False 44071 1727204750.95370: Set connection var ansible_shell_type to sh 44071 1727204750.95376: Set connection var ansible_shell_executable to /bin/sh 44071 1727204750.95386: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204750.95405: variable 'ansible_shell_executable' from source: unknown 44071 1727204750.95409: variable 'ansible_connection' from source: unknown 44071 1727204750.95412: variable 'ansible_module_compression' from source: unknown 44071 1727204750.95414: variable 'ansible_shell_type' from source: unknown 44071 1727204750.95417: variable 'ansible_shell_executable' from source: unknown 44071 1727204750.95419: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204750.95422: variable 'ansible_pipelining' from source: unknown 44071 1727204750.95425: variable 'ansible_timeout' from source: unknown 44071 1727204750.95432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204750.95548: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204750.95558: variable 'omit' from source: magic vars 44071 1727204750.95563: starting attempt loop 44071 1727204750.95567: running the handler 44071 1727204750.95577: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204750.95594: _low_level_execute_command(): starting 44071 1727204750.95601: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204750.96184: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204750.96189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204750.96193: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204750.96195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204750.96241: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204750.96244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204750.96246: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204750.96326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204750.98014: stdout chunk (state=3): >>>/root <<< 44071 1727204750.98105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204750.98173: stderr chunk (state=3): >>><<< 44071 1727204750.98177: stdout chunk (state=3): >>><<< 44071 1727204750.98200: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204750.98213: _low_level_execute_command(): starting 44071 1727204750.98223: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204750.9819956-53298-232654591789939 `" && echo ansible-tmp-1727204750.9819956-53298-232654591789939="` echo /root/.ansible/tmp/ansible-tmp-1727204750.9819956-53298-232654591789939 `" ) && sleep 0' 44071 1727204750.98734: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204750.98739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204750.98742: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204750.98744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204750.98794: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204750.98798: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204750.98893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204751.00857: stdout chunk (state=3): >>>ansible-tmp-1727204750.9819956-53298-232654591789939=/root/.ansible/tmp/ansible-tmp-1727204750.9819956-53298-232654591789939 <<< 44071 1727204751.00987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204751.01039: stderr chunk (state=3): >>><<< 44071 1727204751.01043: stdout chunk (state=3): >>><<< 44071 1727204751.01059: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204750.9819956-53298-232654591789939=/root/.ansible/tmp/ansible-tmp-1727204750.9819956-53298-232654591789939 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204751.01090: variable 'ansible_module_compression' from source: unknown 44071 1727204751.01138: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44071 1727204751.01176: variable 'ansible_facts' from source: unknown 44071 1727204751.01236: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204750.9819956-53298-232654591789939/AnsiballZ_command.py 44071 1727204751.01353: Sending initial data 44071 1727204751.01357: Sent initial data (156 bytes) 44071 1727204751.01839: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204751.01848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204751.01870: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204751.01873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204751.01933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204751.01941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204751.02009: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204751.03621: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204751.03689: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204751.03757: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp87ejht05 /root/.ansible/tmp/ansible-tmp-1727204750.9819956-53298-232654591789939/AnsiballZ_command.py <<< 44071 1727204751.03764: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204750.9819956-53298-232654591789939/AnsiballZ_command.py" <<< 44071 1727204751.03831: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp87ejht05" to remote "/root/.ansible/tmp/ansible-tmp-1727204750.9819956-53298-232654591789939/AnsiballZ_command.py" <<< 44071 1727204751.03836: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204750.9819956-53298-232654591789939/AnsiballZ_command.py" <<< 44071 1727204751.04502: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204751.04575: stderr chunk (state=3): >>><<< 44071 1727204751.04579: stdout chunk (state=3): >>><<< 44071 1727204751.04602: done transferring module to remote 44071 1727204751.04614: _low_level_execute_command(): starting 44071 1727204751.04619: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204750.9819956-53298-232654591789939/ /root/.ansible/tmp/ansible-tmp-1727204750.9819956-53298-232654591789939/AnsiballZ_command.py && sleep 0' 44071 1727204751.05112: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204751.05115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 44071 1727204751.05122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204751.05124: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204751.05126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204751.05172: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204751.05185: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204751.05271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204751.07105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204751.07159: stderr chunk (state=3): >>><<< 44071 1727204751.07164: stdout chunk (state=3): >>><<< 44071 1727204751.07180: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204751.07184: _low_level_execute_command(): starting 44071 1727204751.07190: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204750.9819956-53298-232654591789939/AnsiballZ_command.py && sleep 0' 44071 1727204751.07667: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204751.07672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204751.07700: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204751.07704: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204751.07706: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204751.07708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204751.07767: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204751.07771: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204751.07786: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204751.07861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204751.25952: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-24 15:05:51.240663", "end": "2024-09-24 15:05:51.257837", "delta": "0:00:00.017174", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44071 
1727204751.27392: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.47.73 closed. <<< 44071 1727204751.27499: stderr chunk (state=3): >>><<< 44071 1727204751.27504: stdout chunk (state=3): >>><<< 44071 1727204751.27673: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-24 15:05:51.240663", "end": "2024-09-24 15:05:51.257837", "delta": "0:00:00.017174", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.47.73 closed. 
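The command result above (rc=1, empty stdout and stderr, "non-zero return code") comes from the final grep in the pipeline exiting 1 because nothing matched; the module then marks the task failed purely on the return code. Outside Ansible the same check can be reproduced with a plain subprocess call shaped like the JSON above. This is only a sketch that mirrors the logged fields, not the command module's code, and it assumes nmcli is installed on the machine it runs on:

    # Re-run the logged pipeline by hand and build a result dict shaped like the
    # command module's JSON above. rc=1 simply means grep found no match.
    import subprocess
    from datetime import datetime

    cmd = "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc"
    start = datetime.now()
    proc = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    end = datetime.now()

    result = {
        "changed": True,                     # the command module reports changed even on failure
        "stdout": proc.stdout.rstrip("\n"),
        "stderr": proc.stderr.rstrip("\n"),
        "rc": proc.returncode,
        "cmd": cmd,
        "start": str(start),
        "end": str(end),
        "delta": str(end - start),
        "failed": proc.returncode != 0,
    }
    print(result)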
44071 1727204751.27679: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204750.9819956-53298-232654591789939/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204751.27686: _low_level_execute_command(): starting 44071 1727204751.27689: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204750.9819956-53298-232654591789939/ > /dev/null 2>&1 && sleep 0' 44071 1727204751.28373: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204751.28391: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204751.28478: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204751.28500: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204751.28518: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204751.28545: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204751.28659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204751.31474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204751.31479: stdout chunk (state=3): >>><<< 44071 1727204751.31482: stderr chunk (state=3): >>><<< 44071 1727204751.31485: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204751.31487: handler run complete 44071 1727204751.31490: Evaluated conditional (False): False 44071 1727204751.31492: attempt loop complete, returning result 44071 1727204751.31494: _execute() done 44071 1727204751.31496: dumping result to json 44071 1727204751.31499: done dumping result, returning 44071 1727204751.31502: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [127b8e07-fff9-c964-7471-00000000288b] 44071 1727204751.31505: sending task result for task 127b8e07-fff9-c964-7471-00000000288b fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.017174", "end": "2024-09-24 15:05:51.257837", "rc": 1, "start": "2024-09-24 15:05:51.240663" } MSG: non-zero return code ...ignoring 44071 1727204751.31701: no more pending results, returning what we have 44071 1727204751.31705: results queue empty 44071 1727204751.31706: checking for any_errors_fatal 44071 1727204751.31718: done checking for any_errors_fatal 44071 1727204751.31718: checking for max_fail_percentage 44071 1727204751.31721: done checking for max_fail_percentage 44071 1727204751.31722: checking to see if all hosts have failed and the running result is not ok 44071 1727204751.31723: done checking to see if all hosts have failed 44071 1727204751.31724: getting the remaining hosts for this loop 44071 1727204751.31725: done getting the remaining hosts for this loop 44071 1727204751.31731: getting the next task for host managed-node2 44071 1727204751.31742: done getting next task for host managed-node2 44071 1727204751.31746: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 44071 1727204751.31752: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204751.31756: getting variables 44071 1727204751.31758: in VariableManager get_vars() 44071 1727204751.31819: Calling all_inventory to load vars for managed-node2 44071 1727204751.31822: Calling groups_inventory to load vars for managed-node2 44071 1727204751.31826: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204751.31842: Calling all_plugins_play to load vars for managed-node2 44071 1727204751.31846: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204751.31850: Calling groups_plugins_play to load vars for managed-node2 44071 1727204751.32372: done sending task result for task 127b8e07-fff9-c964-7471-00000000288b 44071 1727204751.32378: WORKER PROCESS EXITING 44071 1727204751.34863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204751.37525: done with get_vars() 44071 1727204751.37559: done getting variables 44071 1727204751.37614: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 15:05:51 -0400 (0:00:00.439) 0:02:43.692 ***** 44071 1727204751.37643: entering _queue_task() for managed-node2/set_fact 44071 1727204751.37953: worker is 1 (out of 1 available) 44071 1727204751.37969: exiting _queue_task() for managed-node2/set_fact 44071 1727204751.37983: done queuing things up, now waiting for results queue to drain 44071 1727204751.37985: waiting for pending results... 
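The "fatal: ... FAILED! ... ...ignoring" output above is what a failed task looks like when its errors are being ignored: the play continues, but the registered result still carries rc=1, which is exactly why the next conditional, nm_profile_exists.rc == 0, evaluates to False just below. A compact sketch of that control flow, assuming (as the later log lines indicate) that the shell task registered its result as nm_profile_exists:

    # Sketch of the ignore-errors flow visible in the log: the result is marked
    # failed, the failure is ignored for play flow, and the registered variable
    # still drives the next task's `when` to False.
    nm_profile_exists = {"rc": 1, "failed": True, "msg": "non-zero return code"}
    ignore_errors = True

    if nm_profile_exists["failed"] and not ignore_errors:
        raise SystemExit("host would be marked failed here")
    print("...ignoring")  # play continues

    # Next task: "Set NM profile exist flag and ansible_managed flag true ..."
    when = nm_profile_exists["rc"] == 0
    print(f"Evaluated conditional (nm_profile_exists.rc == 0): {when}")
    if not when:
        print("when evaluation is False, skipping this task")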
44071 1727204751.38203: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 44071 1727204751.38316: in run() - task 127b8e07-fff9-c964-7471-00000000288c 44071 1727204751.38334: variable 'ansible_search_path' from source: unknown 44071 1727204751.38340: variable 'ansible_search_path' from source: unknown 44071 1727204751.38374: calling self._execute() 44071 1727204751.38462: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204751.38469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204751.38479: variable 'omit' from source: magic vars 44071 1727204751.38823: variable 'ansible_distribution_major_version' from source: facts 44071 1727204751.38837: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204751.38948: variable 'nm_profile_exists' from source: set_fact 44071 1727204751.38959: Evaluated conditional (nm_profile_exists.rc == 0): False 44071 1727204751.38962: when evaluation is False, skipping this task 44071 1727204751.38968: _execute() done 44071 1727204751.38971: dumping result to json 44071 1727204751.38973: done dumping result, returning 44071 1727204751.38982: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [127b8e07-fff9-c964-7471-00000000288c] 44071 1727204751.38989: sending task result for task 127b8e07-fff9-c964-7471-00000000288c 44071 1727204751.39097: done sending task result for task 127b8e07-fff9-c964-7471-00000000288c 44071 1727204751.39101: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 44071 1727204751.39152: no more pending results, returning what we have 44071 1727204751.39156: results queue empty 44071 1727204751.39157: checking for any_errors_fatal 44071 1727204751.39172: done checking for any_errors_fatal 44071 1727204751.39173: checking for max_fail_percentage 44071 1727204751.39174: done checking for max_fail_percentage 44071 1727204751.39175: checking to see if all hosts have failed and the running result is not ok 44071 1727204751.39176: done checking to see if all hosts have failed 44071 1727204751.39177: getting the remaining hosts for this loop 44071 1727204751.39178: done getting the remaining hosts for this loop 44071 1727204751.39183: getting the next task for host managed-node2 44071 1727204751.39195: done getting next task for host managed-node2 44071 1727204751.39198: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 44071 1727204751.39203: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204751.39209: getting variables 44071 1727204751.39210: in VariableManager get_vars() 44071 1727204751.39258: Calling all_inventory to load vars for managed-node2 44071 1727204751.39261: Calling groups_inventory to load vars for managed-node2 44071 1727204751.39271: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204751.39287: Calling all_plugins_play to load vars for managed-node2 44071 1727204751.39289: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204751.39292: Calling groups_plugins_play to load vars for managed-node2 44071 1727204751.41286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204751.42543: done with get_vars() 44071 1727204751.42579: done getting variables 44071 1727204751.42633: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204751.42738: variable 'profile' from source: play vars 44071 1727204751.42742: variable 'interface' from source: play vars 44071 1727204751.42791: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 15:05:51 -0400 (0:00:00.051) 0:02:43.744 ***** 44071 1727204751.42821: entering _queue_task() for managed-node2/command 44071 1727204751.43227: worker is 1 (out of 1 available) 44071 1727204751.43243: exiting _queue_task() for managed-node2/command 44071 1727204751.43258: done queuing things up, now waiting for results queue to drain 44071 1727204751.43260: waiting for pending results... 
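Between the raw task name reported earlier ("Get the ansible_managed comment in ifcfg-{{ profile }}") and the banner above ("... ifcfg-statebr"), the log shows the profile variable being resolved via the interface play variable. The rendering itself is ordinary Jinja2 (3.1.4 for this run, per the header); a standalone illustration with the values inferred from the log, so the indirection profile -> interface here is an assumption:

    # Render the templated task name outside Ansible with plain Jinja2.
    from jinja2 import Template

    play_vars = {"interface": "statebr"}
    play_vars["profile"] = play_vars["interface"]  # inferred from the variable lookups in the log

    name = Template("Get the ansible_managed comment in ifcfg-{{ profile }}").render(**play_vars)
    print(name)  # Get the ansible_managed comment in ifcfg-statebr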
44071 1727204751.43562: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-statebr 44071 1727204751.43738: in run() - task 127b8e07-fff9-c964-7471-00000000288e 44071 1727204751.43754: variable 'ansible_search_path' from source: unknown 44071 1727204751.43760: variable 'ansible_search_path' from source: unknown 44071 1727204751.43807: calling self._execute() 44071 1727204751.43946: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204751.43959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204751.43972: variable 'omit' from source: magic vars 44071 1727204751.44500: variable 'ansible_distribution_major_version' from source: facts 44071 1727204751.44505: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204751.44613: variable 'profile_stat' from source: set_fact 44071 1727204751.44632: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204751.44635: when evaluation is False, skipping this task 44071 1727204751.44639: _execute() done 44071 1727204751.44641: dumping result to json 44071 1727204751.44644: done dumping result, returning 44071 1727204751.44655: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-statebr [127b8e07-fff9-c964-7471-00000000288e] 44071 1727204751.44660: sending task result for task 127b8e07-fff9-c964-7471-00000000288e 44071 1727204751.44832: done sending task result for task 127b8e07-fff9-c964-7471-00000000288e 44071 1727204751.44835: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204751.44894: no more pending results, returning what we have 44071 1727204751.44898: results queue empty 44071 1727204751.44899: checking for any_errors_fatal 44071 1727204751.44909: done checking for any_errors_fatal 44071 1727204751.44909: checking for max_fail_percentage 44071 1727204751.44911: done checking for max_fail_percentage 44071 1727204751.44912: checking to see if all hosts have failed and the running result is not ok 44071 1727204751.44913: done checking to see if all hosts have failed 44071 1727204751.44914: getting the remaining hosts for this loop 44071 1727204751.44916: done getting the remaining hosts for this loop 44071 1727204751.44920: getting the next task for host managed-node2 44071 1727204751.44931: done getting next task for host managed-node2 44071 1727204751.44934: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 44071 1727204751.44940: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204751.44944: getting variables 44071 1727204751.44945: in VariableManager get_vars() 44071 1727204751.44993: Calling all_inventory to load vars for managed-node2 44071 1727204751.44996: Calling groups_inventory to load vars for managed-node2 44071 1727204751.45000: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204751.45014: Calling all_plugins_play to load vars for managed-node2 44071 1727204751.45017: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204751.45020: Calling groups_plugins_play to load vars for managed-node2 44071 1727204751.46684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204751.47925: done with get_vars() 44071 1727204751.47961: done getting variables 44071 1727204751.48016: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204751.48117: variable 'profile' from source: play vars 44071 1727204751.48120: variable 'interface' from source: play vars 44071 1727204751.48167: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 15:05:51 -0400 (0:00:00.053) 0:02:43.798 ***** 44071 1727204751.48195: entering _queue_task() for managed-node2/set_fact 44071 1727204751.48563: worker is 1 (out of 1 available) 44071 1727204751.48580: exiting _queue_task() for managed-node2/set_fact 44071 1727204751.48596: done queuing things up, now waiting for results queue to drain 44071 1727204751.48598: waiting for pending results... 
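The companion task at get_profile_stat.yml:56 uses set_fact and is skipped for the same reason. What it would set is not visible in the log; a hedged sketch, reusing the hypothetical register name from the previous reconstruction:

- name: Verify the ansible_managed comment in ifcfg-{{ profile }}
  # Sketch only: the set_fact action and the when-condition are confirmed by the
  # log; the fact name and the comparison are assumptions.
  set_fact:
    lsr_net_profile_ansible_managed: "{{ 'Ansible managed' in __ifcfg_comment.stdout }}"
  when: profile_stat.stat.exists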
44071 1727204751.48950: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-statebr 44071 1727204751.49122: in run() - task 127b8e07-fff9-c964-7471-00000000288f 44071 1727204751.49149: variable 'ansible_search_path' from source: unknown 44071 1727204751.49160: variable 'ansible_search_path' from source: unknown 44071 1727204751.49208: calling self._execute() 44071 1727204751.49377: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204751.49381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204751.49384: variable 'omit' from source: magic vars 44071 1727204751.49782: variable 'ansible_distribution_major_version' from source: facts 44071 1727204751.49794: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204751.49895: variable 'profile_stat' from source: set_fact 44071 1727204751.49905: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204751.49909: when evaluation is False, skipping this task 44071 1727204751.49914: _execute() done 44071 1727204751.49916: dumping result to json 44071 1727204751.49919: done dumping result, returning 44071 1727204751.49924: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [127b8e07-fff9-c964-7471-00000000288f] 44071 1727204751.49933: sending task result for task 127b8e07-fff9-c964-7471-00000000288f 44071 1727204751.50042: done sending task result for task 127b8e07-fff9-c964-7471-00000000288f 44071 1727204751.50045: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204751.50103: no more pending results, returning what we have 44071 1727204751.50108: results queue empty 44071 1727204751.50109: checking for any_errors_fatal 44071 1727204751.50119: done checking for any_errors_fatal 44071 1727204751.50120: checking for max_fail_percentage 44071 1727204751.50121: done checking for max_fail_percentage 44071 1727204751.50122: checking to see if all hosts have failed and the running result is not ok 44071 1727204751.50123: done checking to see if all hosts have failed 44071 1727204751.50124: getting the remaining hosts for this loop 44071 1727204751.50125: done getting the remaining hosts for this loop 44071 1727204751.50132: getting the next task for host managed-node2 44071 1727204751.50141: done getting next task for host managed-node2 44071 1727204751.50144: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 44071 1727204751.50149: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204751.50153: getting variables 44071 1727204751.50157: in VariableManager get_vars() 44071 1727204751.50205: Calling all_inventory to load vars for managed-node2 44071 1727204751.50208: Calling groups_inventory to load vars for managed-node2 44071 1727204751.50212: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204751.50226: Calling all_plugins_play to load vars for managed-node2 44071 1727204751.50231: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204751.50234: Calling groups_plugins_play to load vars for managed-node2 44071 1727204751.51482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204751.52724: done with get_vars() 44071 1727204751.52757: done getting variables 44071 1727204751.52814: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204751.52912: variable 'profile' from source: play vars 44071 1727204751.52916: variable 'interface' from source: play vars 44071 1727204751.52961: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 15:05:51 -0400 (0:00:00.047) 0:02:43.846 ***** 44071 1727204751.52992: entering _queue_task() for managed-node2/command 44071 1727204751.53306: worker is 1 (out of 1 available) 44071 1727204751.53322: exiting _queue_task() for managed-node2/command 44071 1727204751.53339: done queuing things up, now waiting for results queue to drain 44071 1727204751.53341: waiting for pending results... 
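The repeated entries "variable 'profile' from source: play vars" and "variable 'interface' from source: play vars" explain how {{ profile }} in the task names ends up as statebr. A minimal play-vars sketch consistent with that rendering (the aliasing of profile to interface is an assumption, only the resulting values are confirmed by the log):

vars:
  interface: statebr            # confirmed by the rendered names ifcfg-statebr
  profile: "{{ interface }}"    # assumption: profile simply mirrors the interface name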
44071 1727204751.53549: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-statebr 44071 1727204751.53661: in run() - task 127b8e07-fff9-c964-7471-000000002890 44071 1727204751.53678: variable 'ansible_search_path' from source: unknown 44071 1727204751.53683: variable 'ansible_search_path' from source: unknown 44071 1727204751.53716: calling self._execute() 44071 1727204751.53815: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204751.53821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204751.53833: variable 'omit' from source: magic vars 44071 1727204751.54160: variable 'ansible_distribution_major_version' from source: facts 44071 1727204751.54173: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204751.54275: variable 'profile_stat' from source: set_fact 44071 1727204751.54284: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204751.54287: when evaluation is False, skipping this task 44071 1727204751.54290: _execute() done 44071 1727204751.54293: dumping result to json 44071 1727204751.54297: done dumping result, returning 44071 1727204751.54304: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-statebr [127b8e07-fff9-c964-7471-000000002890] 44071 1727204751.54309: sending task result for task 127b8e07-fff9-c964-7471-000000002890 44071 1727204751.54414: done sending task result for task 127b8e07-fff9-c964-7471-000000002890 44071 1727204751.54417: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204751.54475: no more pending results, returning what we have 44071 1727204751.54479: results queue empty 44071 1727204751.54481: checking for any_errors_fatal 44071 1727204751.54491: done checking for any_errors_fatal 44071 1727204751.54491: checking for max_fail_percentage 44071 1727204751.54494: done checking for max_fail_percentage 44071 1727204751.54495: checking to see if all hosts have failed and the running result is not ok 44071 1727204751.54495: done checking to see if all hosts have failed 44071 1727204751.54496: getting the remaining hosts for this loop 44071 1727204751.54498: done getting the remaining hosts for this loop 44071 1727204751.54503: getting the next task for host managed-node2 44071 1727204751.54513: done getting next task for host managed-node2 44071 1727204751.54516: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 44071 1727204751.54522: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204751.54527: getting variables 44071 1727204751.54530: in VariableManager get_vars() 44071 1727204751.54591: Calling all_inventory to load vars for managed-node2 44071 1727204751.54594: Calling groups_inventory to load vars for managed-node2 44071 1727204751.54598: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204751.54612: Calling all_plugins_play to load vars for managed-node2 44071 1727204751.54615: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204751.54618: Calling groups_plugins_play to load vars for managed-node2 44071 1727204751.55842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204751.57061: done with get_vars() 44071 1727204751.57093: done getting variables 44071 1727204751.57146: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204751.57245: variable 'profile' from source: play vars 44071 1727204751.57248: variable 'interface' from source: play vars 44071 1727204751.57296: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 15:05:51 -0400 (0:00:00.043) 0:02:43.889 ***** 44071 1727204751.57322: entering _queue_task() for managed-node2/set_fact 44071 1727204751.57634: worker is 1 (out of 1 available) 44071 1727204751.57650: exiting _queue_task() for managed-node2/set_fact 44071 1727204751.57668: done queuing things up, now waiting for results queue to drain 44071 1727204751.57669: waiting for pending results... 
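All four get/verify tasks in this include are gated on profile_stat.stat.exists, and the log shows profile_stat coming from set_fact. The tasks that populate it run outside this excerpt; one plausible shape for them, entirely an assumption and shown only to make the conditional readable, is a stat of the ifcfg file exposed as a fact:

- name: Get stat of the profile file
  # Assumed upstream task, not visible in this excerpt.
  stat:
    path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: __profile_stat

- name: Expose the result as profile_stat
  # Matches the "from source: set_fact" origin the log reports for profile_stat.
  set_fact:
    profile_stat: "{{ __profile_stat }}"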
44071 1727204751.57883: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-statebr 44071 1727204751.57992: in run() - task 127b8e07-fff9-c964-7471-000000002891 44071 1727204751.58006: variable 'ansible_search_path' from source: unknown 44071 1727204751.58010: variable 'ansible_search_path' from source: unknown 44071 1727204751.58048: calling self._execute() 44071 1727204751.58143: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204751.58147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204751.58158: variable 'omit' from source: magic vars 44071 1727204751.58493: variable 'ansible_distribution_major_version' from source: facts 44071 1727204751.58505: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204751.58606: variable 'profile_stat' from source: set_fact 44071 1727204751.58616: Evaluated conditional (profile_stat.stat.exists): False 44071 1727204751.58619: when evaluation is False, skipping this task 44071 1727204751.58622: _execute() done 44071 1727204751.58624: dumping result to json 44071 1727204751.58627: done dumping result, returning 44071 1727204751.58638: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-statebr [127b8e07-fff9-c964-7471-000000002891] 44071 1727204751.58643: sending task result for task 127b8e07-fff9-c964-7471-000000002891 44071 1727204751.58748: done sending task result for task 127b8e07-fff9-c964-7471-000000002891 44071 1727204751.58751: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 44071 1727204751.58818: no more pending results, returning what we have 44071 1727204751.58823: results queue empty 44071 1727204751.58824: checking for any_errors_fatal 44071 1727204751.58833: done checking for any_errors_fatal 44071 1727204751.58834: checking for max_fail_percentage 44071 1727204751.58836: done checking for max_fail_percentage 44071 1727204751.58837: checking to see if all hosts have failed and the running result is not ok 44071 1727204751.58838: done checking to see if all hosts have failed 44071 1727204751.58838: getting the remaining hosts for this loop 44071 1727204751.58840: done getting the remaining hosts for this loop 44071 1727204751.58845: getting the next task for host managed-node2 44071 1727204751.58856: done getting next task for host managed-node2 44071 1727204751.58860: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 44071 1727204751.58864: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204751.58872: getting variables 44071 1727204751.58873: in VariableManager get_vars() 44071 1727204751.58921: Calling all_inventory to load vars for managed-node2 44071 1727204751.58924: Calling groups_inventory to load vars for managed-node2 44071 1727204751.58928: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204751.58942: Calling all_plugins_play to load vars for managed-node2 44071 1727204751.58944: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204751.58947: Calling groups_plugins_play to load vars for managed-node2 44071 1727204751.60016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204751.61241: done with get_vars() 44071 1727204751.61278: done getting variables 44071 1727204751.61328: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204751.61430: variable 'profile' from source: play vars 44071 1727204751.61434: variable 'interface' from source: play vars 44071 1727204751.61481: variable 'interface' from source: play vars TASK [Assert that the profile is absent - 'statebr'] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 15:05:51 -0400 (0:00:00.041) 0:02:43.931 ***** 44071 1727204751.61507: entering _queue_task() for managed-node2/assert 44071 1727204751.61815: worker is 1 (out of 1 available) 44071 1727204751.61831: exiting _queue_task() for managed-node2/assert 44071 1727204751.61846: done queuing things up, now waiting for results queue to drain 44071 1727204751.61848: waiting for pending results... 
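The assert queued above is the first task in this stretch that actually runs: the log below shows the conditional (not lsr_net_profile_exists) evaluating to True and "All assertions passed". A sketch consistent with that, with the failure message wording as an assumption:

- name: Assert that the profile is absent - '{{ profile }}'
  # The asserted expression is confirmed by the evaluated conditional in the log.
  assert:
    that:
      - not lsr_net_profile_exists
    msg: "Profile {{ profile }} is unexpectedly present"   # assumed wording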
44071 1727204751.62057: running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'statebr' 44071 1727204751.62149: in run() - task 127b8e07-fff9-c964-7471-000000002805 44071 1727204751.62163: variable 'ansible_search_path' from source: unknown 44071 1727204751.62168: variable 'ansible_search_path' from source: unknown 44071 1727204751.62203: calling self._execute() 44071 1727204751.62297: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204751.62304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204751.62314: variable 'omit' from source: magic vars 44071 1727204751.62647: variable 'ansible_distribution_major_version' from source: facts 44071 1727204751.62659: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204751.62666: variable 'omit' from source: magic vars 44071 1727204751.62711: variable 'omit' from source: magic vars 44071 1727204751.62797: variable 'profile' from source: play vars 44071 1727204751.62802: variable 'interface' from source: play vars 44071 1727204751.62857: variable 'interface' from source: play vars 44071 1727204751.62876: variable 'omit' from source: magic vars 44071 1727204751.62914: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204751.62950: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204751.62969: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204751.62984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204751.62995: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204751.63020: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204751.63023: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204751.63028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204751.63110: Set connection var ansible_connection to ssh 44071 1727204751.63116: Set connection var ansible_timeout to 10 44071 1727204751.63123: Set connection var ansible_pipelining to False 44071 1727204751.63128: Set connection var ansible_shell_type to sh 44071 1727204751.63136: Set connection var ansible_shell_executable to /bin/sh 44071 1727204751.63143: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204751.63168: variable 'ansible_shell_executable' from source: unknown 44071 1727204751.63171: variable 'ansible_connection' from source: unknown 44071 1727204751.63174: variable 'ansible_module_compression' from source: unknown 44071 1727204751.63176: variable 'ansible_shell_type' from source: unknown 44071 1727204751.63179: variable 'ansible_shell_executable' from source: unknown 44071 1727204751.63181: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204751.63186: variable 'ansible_pipelining' from source: unknown 44071 1727204751.63190: variable 'ansible_timeout' from source: unknown 44071 1727204751.63194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204751.63316: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204751.63326: variable 'omit' from source: magic vars 44071 1727204751.63334: starting attempt loop 44071 1727204751.63337: running the handler 44071 1727204751.63442: variable 'lsr_net_profile_exists' from source: set_fact 44071 1727204751.63446: Evaluated conditional (not lsr_net_profile_exists): True 44071 1727204751.63453: handler run complete 44071 1727204751.63468: attempt loop complete, returning result 44071 1727204751.63471: _execute() done 44071 1727204751.63474: dumping result to json 44071 1727204751.63477: done dumping result, returning 44071 1727204751.63484: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'statebr' [127b8e07-fff9-c964-7471-000000002805] 44071 1727204751.63488: sending task result for task 127b8e07-fff9-c964-7471-000000002805 44071 1727204751.63589: done sending task result for task 127b8e07-fff9-c964-7471-000000002805 44071 1727204751.63592: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 44071 1727204751.63657: no more pending results, returning what we have 44071 1727204751.63660: results queue empty 44071 1727204751.63661: checking for any_errors_fatal 44071 1727204751.63674: done checking for any_errors_fatal 44071 1727204751.63675: checking for max_fail_percentage 44071 1727204751.63677: done checking for max_fail_percentage 44071 1727204751.63678: checking to see if all hosts have failed and the running result is not ok 44071 1727204751.63679: done checking to see if all hosts have failed 44071 1727204751.63679: getting the remaining hosts for this loop 44071 1727204751.63681: done getting the remaining hosts for this loop 44071 1727204751.63686: getting the next task for host managed-node2 44071 1727204751.63698: done getting next task for host managed-node2 44071 1727204751.63702: ^ task is: TASK: Get NetworkManager RPM version 44071 1727204751.63707: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204751.63712: getting variables 44071 1727204751.63713: in VariableManager get_vars() 44071 1727204751.63759: Calling all_inventory to load vars for managed-node2 44071 1727204751.63762: Calling groups_inventory to load vars for managed-node2 44071 1727204751.63773: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204751.63786: Calling all_plugins_play to load vars for managed-node2 44071 1727204751.63788: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204751.63791: Calling groups_plugins_play to load vars for managed-node2 44071 1727204751.65034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204751.66229: done with get_vars() 44071 1727204751.66259: done getting variables 44071 1727204751.66312: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NetworkManager RPM version] ****************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml:7 Tuesday 24 September 2024 15:05:51 -0400 (0:00:00.048) 0:02:43.979 ***** 44071 1727204751.66343: entering _queue_task() for managed-node2/command 44071 1727204751.66647: worker is 1 (out of 1 available) 44071 1727204751.66663: exiting _queue_task() for managed-node2/command 44071 1727204751.66680: done queuing things up, now waiting for results queue to drain 44071 1727204751.66682: waiting for pending results... 
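The "Get NetworkManager RPM version" task (get_NetworkManager_NVR.yml:7) is the first one in this excerpt that reaches the remote host. Its exact command is confirmed by the module_args printed further below, and the register name matches the __rpm_q_networkmanager variable that the following set_fact task reads. The changed_when line is an assumption, offered because the raw module result below reports changed: true while the final task result is displayed as changed: false.

- name: Get NetworkManager RPM version
  command: rpm -qa --qf '%{name}-%{version}-%{release}\n' NetworkManager
  register: __rpm_q_networkmanager
  changed_when: false   # assumed; a pure query should not report a change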
44071 1727204751.66895: running TaskExecutor() for managed-node2/TASK: Get NetworkManager RPM version 44071 1727204751.66997: in run() - task 127b8e07-fff9-c964-7471-000000002809 44071 1727204751.67013: variable 'ansible_search_path' from source: unknown 44071 1727204751.67017: variable 'ansible_search_path' from source: unknown 44071 1727204751.67053: calling self._execute() 44071 1727204751.67145: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204751.67152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204751.67161: variable 'omit' from source: magic vars 44071 1727204751.67490: variable 'ansible_distribution_major_version' from source: facts 44071 1727204751.67503: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204751.67509: variable 'omit' from source: magic vars 44071 1727204751.67553: variable 'omit' from source: magic vars 44071 1727204751.67587: variable 'omit' from source: magic vars 44071 1727204751.67625: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204751.67659: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204751.67680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204751.67696: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204751.67707: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204751.67735: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204751.67738: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204751.67741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204751.67825: Set connection var ansible_connection to ssh 44071 1727204751.67828: Set connection var ansible_timeout to 10 44071 1727204751.67838: Set connection var ansible_pipelining to False 44071 1727204751.67843: Set connection var ansible_shell_type to sh 44071 1727204751.67849: Set connection var ansible_shell_executable to /bin/sh 44071 1727204751.67856: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204751.67877: variable 'ansible_shell_executable' from source: unknown 44071 1727204751.67880: variable 'ansible_connection' from source: unknown 44071 1727204751.67883: variable 'ansible_module_compression' from source: unknown 44071 1727204751.67886: variable 'ansible_shell_type' from source: unknown 44071 1727204751.67888: variable 'ansible_shell_executable' from source: unknown 44071 1727204751.67891: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204751.67895: variable 'ansible_pipelining' from source: unknown 44071 1727204751.67897: variable 'ansible_timeout' from source: unknown 44071 1727204751.67902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204751.68023: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204751.68032: variable 'omit' from source: magic vars 
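The "Set connection var ..." entries above show the effective per-host connection settings for this task: ssh connection, a 10 second timeout, pipelining off, sh shell with /bin/sh, and ZIP_DEFLATED module compression. These come from defaults plus inventory and host variables; an illustrative group_vars snippet (not the test's actual inventory) that would pin the same values:

ansible_connection: ssh
ansible_timeout: 10
ansible_ssh_pipelining: false        # surfaces as "Set connection var ansible_pipelining to False"
ansible_shell_type: sh
ansible_shell_executable: /bin/sh

With pipelining off, the entries that follow show the full file-based module delivery: a remote temp directory is created, AnsiballZ_command.py is uploaded over sftp, made executable, run with python3.12, and the directory is removed afterwards.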
44071 1727204751.68039: starting attempt loop 44071 1727204751.68042: running the handler 44071 1727204751.68057: _low_level_execute_command(): starting 44071 1727204751.68064: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204751.68634: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204751.68639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204751.68642: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204751.68645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204751.68694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204751.68698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204751.68786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204751.70554: stdout chunk (state=3): >>>/root <<< 44071 1727204751.70664: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204751.70735: stderr chunk (state=3): >>><<< 44071 1727204751.70739: stdout chunk (state=3): >>><<< 44071 1727204751.70762: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204751.70776: _low_level_execute_command(): starting 44071 1727204751.70782: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204751.70761-53327-134405322989684 `" && echo ansible-tmp-1727204751.70761-53327-134405322989684="` echo /root/.ansible/tmp/ansible-tmp-1727204751.70761-53327-134405322989684 `" ) && sleep 0' 44071 1727204751.71271: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204751.71301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204751.71304: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204751.71314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204751.71371: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204751.71374: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204751.71377: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204751.71451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204751.73440: stdout chunk (state=3): >>>ansible-tmp-1727204751.70761-53327-134405322989684=/root/.ansible/tmp/ansible-tmp-1727204751.70761-53327-134405322989684 <<< 44071 1727204751.73550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204751.73617: stderr chunk (state=3): >>><<< 44071 1727204751.73621: stdout chunk (state=3): >>><<< 44071 1727204751.73640: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204751.70761-53327-134405322989684=/root/.ansible/tmp/ansible-tmp-1727204751.70761-53327-134405322989684 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 
1727204751.73673: variable 'ansible_module_compression' from source: unknown 44071 1727204751.73717: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44071 1727204751.73752: variable 'ansible_facts' from source: unknown 44071 1727204751.73814: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204751.70761-53327-134405322989684/AnsiballZ_command.py 44071 1727204751.73927: Sending initial data 44071 1727204751.73933: Sent initial data (154 bytes) 44071 1727204751.74443: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204751.74447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204751.74449: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204751.74452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204751.74503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204751.74507: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204751.74514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204751.74582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204751.76176: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204751.76242: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204751.76312: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpwpwctfa3 /root/.ansible/tmp/ansible-tmp-1727204751.70761-53327-134405322989684/AnsiballZ_command.py <<< 44071 1727204751.76316: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204751.70761-53327-134405322989684/AnsiballZ_command.py" <<< 44071 1727204751.76380: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpwpwctfa3" to remote "/root/.ansible/tmp/ansible-tmp-1727204751.70761-53327-134405322989684/AnsiballZ_command.py" <<< 44071 1727204751.76389: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204751.70761-53327-134405322989684/AnsiballZ_command.py" <<< 44071 1727204751.77042: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204751.77120: stderr chunk (state=3): >>><<< 44071 1727204751.77123: stdout chunk (state=3): >>><<< 44071 1727204751.77150: done transferring module to remote 44071 1727204751.77161: _low_level_execute_command(): starting 44071 1727204751.77164: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204751.70761-53327-134405322989684/ /root/.ansible/tmp/ansible-tmp-1727204751.70761-53327-134405322989684/AnsiballZ_command.py && sleep 0' 44071 1727204751.77629: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204751.77663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204751.77670: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204751.77673: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204751.77676: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204751.77678: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204751.77735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204751.77740: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204751.77742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204751.77814: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204751.79655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204751.79714: stderr chunk (state=3): >>><<< 44071 1727204751.79717: stdout chunk (state=3): >>><<< 44071 1727204751.79738: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204751.79742: _low_level_execute_command(): starting 44071 1727204751.79744: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204751.70761-53327-134405322989684/AnsiballZ_command.py && sleep 0' 44071 1727204751.80245: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204751.80249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204751.80251: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204751.80253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204751.80256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204751.80314: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204751.80318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204751.80325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204751.80403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204752.28197: stdout chunk (state=3): >>> {"changed": true, "stdout": "NetworkManager-1.46.2-1.fc40", "stderr": "", "rc": 0, "cmd": ["rpm", "-qa", "--qf", "%{name}-%{version}-%{release}\\n", "NetworkManager"], "start": "2024-09-24 15:05:51.967502", "end": "2024-09-24 15:05:52.280356", "delta": "0:00:00.312854", "msg": "", "invocation": {"module_args": {"_raw_params": "rpm -qa --qf '%{name}-%{version}-%{release}\\n' NetworkManager", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": 
null, "stdin": null}}} <<< 44071 1727204752.29896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204752.29952: stderr chunk (state=3): >>><<< 44071 1727204752.29956: stdout chunk (state=3): >>><<< 44071 1727204752.29980: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "NetworkManager-1.46.2-1.fc40", "stderr": "", "rc": 0, "cmd": ["rpm", "-qa", "--qf", "%{name}-%{version}-%{release}\\n", "NetworkManager"], "start": "2024-09-24 15:05:51.967502", "end": "2024-09-24 15:05:52.280356", "delta": "0:00:00.312854", "msg": "", "invocation": {"module_args": {"_raw_params": "rpm -qa --qf '%{name}-%{version}-%{release}\\n' NetworkManager", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
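The JSON blob above is the raw return of the uploaded AnsiballZ_command.py, produced at the end of the mkdir, sftp put, chmod, execute, rm cycle traced by the surrounding _low_level_execute_command() entries. Once registered, its keys (stdout, rc, start, end, delta, cmd) are addressable from later tasks; a small illustrative consumer, using the register name assumed earlier:

- name: Show selected fields of the registered rpm query
  # Field names match the JSON result in the log; the task itself is illustrative.
  debug:
    msg: "NVR={{ __rpm_q_networkmanager.stdout }} rc={{ __rpm_q_networkmanager.rc }} delta={{ __rpm_q_networkmanager.delta }}"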
44071 1727204752.30017: done with _execute_module (ansible.legacy.command, {'_raw_params': "rpm -qa --qf '%{name}-%{version}-%{release}\\n' NetworkManager", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204751.70761-53327-134405322989684/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204752.30024: _low_level_execute_command(): starting 44071 1727204752.30032: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204751.70761-53327-134405322989684/ > /dev/null 2>&1 && sleep 0' 44071 1727204752.30537: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204752.30541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204752.30550: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204752.30552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204752.30601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204752.30604: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204752.30607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204752.30685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204752.32619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204752.32680: stderr chunk (state=3): >>><<< 44071 1727204752.32684: stdout chunk (state=3): >>><<< 44071 1727204752.32699: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204752.32707: handler run complete 44071 1727204752.32728: Evaluated conditional (False): False 44071 1727204752.32739: attempt loop complete, returning result 44071 1727204752.32742: _execute() done 44071 1727204752.32744: dumping result to json 44071 1727204752.32751: done dumping result, returning 44071 1727204752.32759: done running TaskExecutor() for managed-node2/TASK: Get NetworkManager RPM version [127b8e07-fff9-c964-7471-000000002809] 44071 1727204752.32763: sending task result for task 127b8e07-fff9-c964-7471-000000002809 44071 1727204752.32880: done sending task result for task 127b8e07-fff9-c964-7471-000000002809 44071 1727204752.32882: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "rpm", "-qa", "--qf", "%{name}-%{version}-%{release}\\n", "NetworkManager" ], "delta": "0:00:00.312854", "end": "2024-09-24 15:05:52.280356", "rc": 0, "start": "2024-09-24 15:05:51.967502" } STDOUT: NetworkManager-1.46.2-1.fc40 44071 1727204752.32965: no more pending results, returning what we have 44071 1727204752.32977: results queue empty 44071 1727204752.32978: checking for any_errors_fatal 44071 1727204752.32987: done checking for any_errors_fatal 44071 1727204752.32988: checking for max_fail_percentage 44071 1727204752.32989: done checking for max_fail_percentage 44071 1727204752.32990: checking to see if all hosts have failed and the running result is not ok 44071 1727204752.32991: done checking to see if all hosts have failed 44071 1727204752.32992: getting the remaining hosts for this loop 44071 1727204752.32993: done getting the remaining hosts for this loop 44071 1727204752.32998: getting the next task for host managed-node2 44071 1727204752.33007: done getting next task for host managed-node2 44071 1727204752.33010: ^ task is: TASK: Store NetworkManager version 44071 1727204752.33014: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204752.33018: getting variables 44071 1727204752.33020: in VariableManager get_vars() 44071 1727204752.33068: Calling all_inventory to load vars for managed-node2 44071 1727204752.33071: Calling groups_inventory to load vars for managed-node2 44071 1727204752.33075: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204752.33095: Calling all_plugins_play to load vars for managed-node2 44071 1727204752.33098: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204752.33101: Calling groups_plugins_play to load vars for managed-node2 44071 1727204752.34318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204752.35544: done with get_vars() 44071 1727204752.35581: done getting variables 44071 1727204752.35634: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Store NetworkManager version] ******************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml:14 Tuesday 24 September 2024 15:05:52 -0400 (0:00:00.693) 0:02:44.673 ***** 44071 1727204752.35661: entering _queue_task() for managed-node2/set_fact 44071 1727204752.35980: worker is 1 (out of 1 available) 44071 1727204752.35995: exiting _queue_task() for managed-node2/set_fact 44071 1727204752.36010: done queuing things up, now waiting for results queue to drain 44071 1727204752.36012: waiting for pending results... 
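For context, the "Get NetworkManager RPM version" command traced above (rpm -qa --qf '%{name}-%{version}-%{release}\n' NetworkManager, returning NetworkManager-1.46.2-1.fc40 with "changed": false) would typically come from a task along the following lines. This is a sketch reconstructed from the module arguments and result shown in the trace, not the actual contents of get_NetworkManager_NVR.yml; the register name __rpm_q_networkmanager is taken from the variable lookups later in the log, and changed_when: false is inferred from the reported result.

    - name: Get NetworkManager RPM version
      command: rpm -qa --qf '%{name}-%{version}-%{release}\n' NetworkManager
      register: __rpm_q_networkmanager   # name appears in later variable lookups in this log
      changed_when: false                # inferred: the trace reports "changed": false for this command task
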
44071 1727204752.36222: running TaskExecutor() for managed-node2/TASK: Store NetworkManager version 44071 1727204752.36320: in run() - task 127b8e07-fff9-c964-7471-00000000280a 44071 1727204752.36337: variable 'ansible_search_path' from source: unknown 44071 1727204752.36340: variable 'ansible_search_path' from source: unknown 44071 1727204752.36377: calling self._execute() 44071 1727204752.36475: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204752.36479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204752.36490: variable 'omit' from source: magic vars 44071 1727204752.36829: variable 'ansible_distribution_major_version' from source: facts 44071 1727204752.36843: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204752.36849: variable 'omit' from source: magic vars 44071 1727204752.36891: variable 'omit' from source: magic vars 44071 1727204752.36983: variable '__rpm_q_networkmanager' from source: set_fact 44071 1727204752.37022: variable 'omit' from source: magic vars 44071 1727204752.37118: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204752.37122: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204752.37127: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204752.37274: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204752.37278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204752.37281: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204752.37284: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204752.37288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204752.37406: Set connection var ansible_connection to ssh 44071 1727204752.37409: Set connection var ansible_timeout to 10 44071 1727204752.37412: Set connection var ansible_pipelining to False 44071 1727204752.37415: Set connection var ansible_shell_type to sh 44071 1727204752.37418: Set connection var ansible_shell_executable to /bin/sh 44071 1727204752.37420: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204752.37422: variable 'ansible_shell_executable' from source: unknown 44071 1727204752.37425: variable 'ansible_connection' from source: unknown 44071 1727204752.37427: variable 'ansible_module_compression' from source: unknown 44071 1727204752.37429: variable 'ansible_shell_type' from source: unknown 44071 1727204752.37431: variable 'ansible_shell_executable' from source: unknown 44071 1727204752.37433: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204752.37435: variable 'ansible_pipelining' from source: unknown 44071 1727204752.37437: variable 'ansible_timeout' from source: unknown 44071 1727204752.37439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204752.37620: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 44071 1727204752.37625: variable 'omit' from source: magic vars 44071 1727204752.37628: starting attempt loop 44071 1727204752.37631: running the handler 44071 1727204752.37633: handler run complete 44071 1727204752.37635: attempt loop complete, returning result 44071 1727204752.37637: _execute() done 44071 1727204752.37640: dumping result to json 44071 1727204752.37642: done dumping result, returning 44071 1727204752.37645: done running TaskExecutor() for managed-node2/TASK: Store NetworkManager version [127b8e07-fff9-c964-7471-00000000280a] 44071 1727204752.37647: sending task result for task 127b8e07-fff9-c964-7471-00000000280a ok: [managed-node2] => { "ansible_facts": { "networkmanager_nvr": "NetworkManager-1.46.2-1.fc40" }, "changed": false } 44071 1727204752.37897: no more pending results, returning what we have 44071 1727204752.37900: results queue empty 44071 1727204752.37901: checking for any_errors_fatal 44071 1727204752.37909: done checking for any_errors_fatal 44071 1727204752.37910: checking for max_fail_percentage 44071 1727204752.37911: done checking for max_fail_percentage 44071 1727204752.37912: checking to see if all hosts have failed and the running result is not ok 44071 1727204752.37913: done checking to see if all hosts have failed 44071 1727204752.37914: getting the remaining hosts for this loop 44071 1727204752.37915: done getting the remaining hosts for this loop 44071 1727204752.37919: getting the next task for host managed-node2 44071 1727204752.37927: done getting next task for host managed-node2 44071 1727204752.37932: ^ task is: TASK: Show NetworkManager version 44071 1727204752.37936: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204752.37940: getting variables 44071 1727204752.37990: in VariableManager get_vars() 44071 1727204752.38032: Calling all_inventory to load vars for managed-node2 44071 1727204752.38035: Calling groups_inventory to load vars for managed-node2 44071 1727204752.38038: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204752.38046: done sending task result for task 127b8e07-fff9-c964-7471-00000000280a 44071 1727204752.38054: WORKER PROCESS EXITING 44071 1727204752.38070: Calling all_plugins_play to load vars for managed-node2 44071 1727204752.38074: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204752.38079: Calling groups_plugins_play to load vars for managed-node2 44071 1727204752.40057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204752.42339: done with get_vars() 44071 1727204752.42381: done getting variables 44071 1727204752.42433: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show NetworkManager version] ********************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml:18 Tuesday 24 September 2024 15:05:52 -0400 (0:00:00.067) 0:02:44.741 ***** 44071 1727204752.42463: entering _queue_task() for managed-node2/debug 44071 1727204752.42771: worker is 1 (out of 1 available) 44071 1727204752.42785: exiting _queue_task() for managed-node2/debug 44071 1727204752.42799: done queuing things up, now waiting for results queue to drain 44071 1727204752.42801: waiting for pending results... 
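The "Store NetworkManager version" result above (networkmanager_nvr: "NetworkManager-1.46.2-1.fc40") and the queued "Show NetworkManager version" debug task plausibly correspond to tasks like the sketch below. Reading the value from __rpm_q_networkmanager.stdout is an assumption; only the resulting fact and the rendered debug output are visible in the trace.

    - name: Store NetworkManager version
      set_fact:
        networkmanager_nvr: "{{ __rpm_q_networkmanager.stdout }}"   # assumed source; only the final value appears in the log

    - name: Show NetworkManager version
      debug:
        var: networkmanager_nvr
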
44071 1727204752.43017: running TaskExecutor() for managed-node2/TASK: Show NetworkManager version 44071 1727204752.43126: in run() - task 127b8e07-fff9-c964-7471-00000000280b 44071 1727204752.43143: variable 'ansible_search_path' from source: unknown 44071 1727204752.43147: variable 'ansible_search_path' from source: unknown 44071 1727204752.43181: calling self._execute() 44071 1727204752.43275: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204752.43281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204752.43290: variable 'omit' from source: magic vars 44071 1727204752.43606: variable 'ansible_distribution_major_version' from source: facts 44071 1727204752.43620: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204752.43626: variable 'omit' from source: magic vars 44071 1727204752.43672: variable 'omit' from source: magic vars 44071 1727204752.43703: variable 'omit' from source: magic vars 44071 1727204752.43742: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204752.43774: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204752.43794: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204752.43810: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204752.43821: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204752.43847: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204752.43851: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204752.43853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204752.43935: Set connection var ansible_connection to ssh 44071 1727204752.43941: Set connection var ansible_timeout to 10 44071 1727204752.43947: Set connection var ansible_pipelining to False 44071 1727204752.43952: Set connection var ansible_shell_type to sh 44071 1727204752.43958: Set connection var ansible_shell_executable to /bin/sh 44071 1727204752.43967: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204752.43986: variable 'ansible_shell_executable' from source: unknown 44071 1727204752.43989: variable 'ansible_connection' from source: unknown 44071 1727204752.43993: variable 'ansible_module_compression' from source: unknown 44071 1727204752.43995: variable 'ansible_shell_type' from source: unknown 44071 1727204752.43998: variable 'ansible_shell_executable' from source: unknown 44071 1727204752.44003: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204752.44005: variable 'ansible_pipelining' from source: unknown 44071 1727204752.44008: variable 'ansible_timeout' from source: unknown 44071 1727204752.44011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204752.44135: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204752.44139: variable 'omit' from source: magic vars 44071 
1727204752.44142: starting attempt loop 44071 1727204752.44144: running the handler 44071 1727204752.44187: variable 'networkmanager_nvr' from source: set_fact 44071 1727204752.44257: variable 'networkmanager_nvr' from source: set_fact 44071 1727204752.44268: handler run complete 44071 1727204752.44284: attempt loop complete, returning result 44071 1727204752.44287: _execute() done 44071 1727204752.44290: dumping result to json 44071 1727204752.44292: done dumping result, returning 44071 1727204752.44299: done running TaskExecutor() for managed-node2/TASK: Show NetworkManager version [127b8e07-fff9-c964-7471-00000000280b] 44071 1727204752.44305: sending task result for task 127b8e07-fff9-c964-7471-00000000280b 44071 1727204752.44406: done sending task result for task 127b8e07-fff9-c964-7471-00000000280b 44071 1727204752.44408: WORKER PROCESS EXITING ok: [managed-node2] => { "networkmanager_nvr": "NetworkManager-1.46.2-1.fc40" } 44071 1727204752.44468: no more pending results, returning what we have 44071 1727204752.44472: results queue empty 44071 1727204752.44473: checking for any_errors_fatal 44071 1727204752.44483: done checking for any_errors_fatal 44071 1727204752.44484: checking for max_fail_percentage 44071 1727204752.44486: done checking for max_fail_percentage 44071 1727204752.44487: checking to see if all hosts have failed and the running result is not ok 44071 1727204752.44488: done checking to see if all hosts have failed 44071 1727204752.44488: getting the remaining hosts for this loop 44071 1727204752.44490: done getting the remaining hosts for this loop 44071 1727204752.44495: getting the next task for host managed-node2 44071 1727204752.44506: done getting next task for host managed-node2 44071 1727204752.44510: ^ task is: TASK: Conditional asserts 44071 1727204752.44513: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204752.44517: getting variables 44071 1727204752.44526: in VariableManager get_vars() 44071 1727204752.44579: Calling all_inventory to load vars for managed-node2 44071 1727204752.44582: Calling groups_inventory to load vars for managed-node2 44071 1727204752.44585: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204752.44597: Calling all_plugins_play to load vars for managed-node2 44071 1727204752.44600: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204752.44603: Calling groups_plugins_play to load vars for managed-node2 44071 1727204752.45806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204752.47069: done with get_vars() 44071 1727204752.47099: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Tuesday 24 September 2024 15:05:52 -0400 (0:00:00.047) 0:02:44.788 ***** 44071 1727204752.47186: entering _queue_task() for managed-node2/include_tasks 44071 1727204752.47504: worker is 1 (out of 1 available) 44071 1727204752.47520: exiting _queue_task() for managed-node2/include_tasks 44071 1727204752.47537: done queuing things up, now waiting for results queue to drain 44071 1727204752.47539: waiting for pending results... 44071 1727204752.47751: running TaskExecutor() for managed-node2/TASK: Conditional asserts 44071 1727204752.47850: in run() - task 127b8e07-fff9-c964-7471-0000000020b3 44071 1727204752.47863: variable 'ansible_search_path' from source: unknown 44071 1727204752.47868: variable 'ansible_search_path' from source: unknown 44071 1727204752.48119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 44071 1727204752.50444: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 44071 1727204752.50507: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 44071 1727204752.50546: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 44071 1727204752.50583: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 44071 1727204752.50608: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 44071 1727204752.50692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 44071 1727204752.50716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 44071 1727204752.50738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 44071 1727204752.50768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 44071 1727204752.50779: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 44071 1727204752.50870: variable 'lsr_assert_when' from source: include params 44071 1727204752.50964: variable 'network_provider' from source: set_fact 44071 1727204752.51025: variable 'omit' from source: magic vars 44071 1727204752.51123: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204752.51132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204752.51146: variable 'omit' from source: magic vars 44071 1727204752.51299: variable 'ansible_distribution_major_version' from source: facts 44071 1727204752.51308: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204752.51396: variable 'item' from source: unknown 44071 1727204752.51403: Evaluated conditional (item['condition']): True 44071 1727204752.51464: variable 'item' from source: unknown 44071 1727204752.51496: variable 'item' from source: unknown 44071 1727204752.51549: variable 'item' from source: unknown 44071 1727204752.51715: dumping result to json 44071 1727204752.51718: done dumping result, returning 44071 1727204752.51720: done running TaskExecutor() for managed-node2/TASK: Conditional asserts [127b8e07-fff9-c964-7471-0000000020b3] 44071 1727204752.51722: sending task result for task 127b8e07-fff9-c964-7471-0000000020b3 44071 1727204752.51773: done sending task result for task 127b8e07-fff9-c964-7471-0000000020b3 44071 1727204752.51800: no more pending results, returning what we have 44071 1727204752.51806: in VariableManager get_vars() 44071 1727204752.51858: Calling all_inventory to load vars for managed-node2 44071 1727204752.51861: Calling groups_inventory to load vars for managed-node2 44071 1727204752.51867: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204752.51887: Calling all_plugins_play to load vars for managed-node2 44071 1727204752.51890: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204752.51895: WORKER PROCESS EXITING 44071 1727204752.51899: Calling groups_plugins_play to load vars for managed-node2 44071 1727204752.53808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204752.56002: done with get_vars() 44071 1727204752.56042: variable 'ansible_search_path' from source: unknown 44071 1727204752.56044: variable 'ansible_search_path' from source: unknown 44071 1727204752.56094: we have included files to process 44071 1727204752.56095: generating all_blocks data 44071 1727204752.56097: done generating all_blocks data 44071 1727204752.56104: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 44071 1727204752.56106: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 44071 1727204752.56108: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 44071 1727204752.56229: in VariableManager get_vars() 44071 1727204752.56255: done with get_vars() 44071 1727204752.56378: done processing included file 44071 1727204752.56381: iterating over new_blocks loaded from include file 44071 1727204752.56382: in VariableManager get_vars() 44071 1727204752.56407: done 
with get_vars() 44071 1727204752.56409: filtering new block on tags 44071 1727204752.56449: done filtering new block on tags 44071 1727204752.56452: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node2 => (item={'what': 'tasks/assert_device_absent.yml', 'condition': True}) 44071 1727204752.56458: extending task lists for all hosts with included blocks 44071 1727204752.57989: done extending task lists 44071 1727204752.57992: done processing included files 44071 1727204752.57993: results queue empty 44071 1727204752.57994: checking for any_errors_fatal 44071 1727204752.57998: done checking for any_errors_fatal 44071 1727204752.57999: checking for max_fail_percentage 44071 1727204752.58000: done checking for max_fail_percentage 44071 1727204752.58001: checking to see if all hosts have failed and the running result is not ok 44071 1727204752.58002: done checking to see if all hosts have failed 44071 1727204752.58003: getting the remaining hosts for this loop 44071 1727204752.58004: done getting the remaining hosts for this loop 44071 1727204752.58007: getting the next task for host managed-node2 44071 1727204752.58013: done getting next task for host managed-node2 44071 1727204752.58015: ^ task is: TASK: Include the task 'get_interface_stat.yml' 44071 1727204752.58018: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204752.58027: getting variables 44071 1727204752.58028: in VariableManager get_vars() 44071 1727204752.58142: Calling all_inventory to load vars for managed-node2 44071 1727204752.58146: Calling groups_inventory to load vars for managed-node2 44071 1727204752.58149: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204752.58158: Calling all_plugins_play to load vars for managed-node2 44071 1727204752.58160: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204752.58163: Calling groups_plugins_play to load vars for managed-node2 44071 1727204752.59652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204752.61898: done with get_vars() 44071 1727204752.61929: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 15:05:52 -0400 (0:00:00.148) 0:02:44.936 ***** 44071 1727204752.62022: entering _queue_task() for managed-node2/include_tasks 44071 1727204752.62436: worker is 1 (out of 1 available) 44071 1727204752.62452: exiting _queue_task() for managed-node2/include_tasks 44071 1727204752.62471: done queuing things up, now waiting for results queue to drain 44071 1727204752.62473: waiting for pending results... 44071 1727204752.62763: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 44071 1727204752.62919: in run() - task 127b8e07-fff9-c964-7471-0000000028d3 44071 1727204752.62940: variable 'ansible_search_path' from source: unknown 44071 1727204752.62948: variable 'ansible_search_path' from source: unknown 44071 1727204752.63000: calling self._execute() 44071 1727204752.63219: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204752.63222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204752.63225: variable 'omit' from source: magic vars 44071 1727204752.63579: variable 'ansible_distribution_major_version' from source: facts 44071 1727204752.63599: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204752.63610: _execute() done 44071 1727204752.63620: dumping result to json 44071 1727204752.63627: done dumping result, returning 44071 1727204752.63637: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [127b8e07-fff9-c964-7471-0000000028d3] 44071 1727204752.63648: sending task result for task 127b8e07-fff9-c964-7471-0000000028d3 44071 1727204752.63799: no more pending results, returning what we have 44071 1727204752.63805: in VariableManager get_vars() 44071 1727204752.63872: Calling all_inventory to load vars for managed-node2 44071 1727204752.63876: Calling groups_inventory to load vars for managed-node2 44071 1727204752.63880: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204752.63898: Calling all_plugins_play to load vars for managed-node2 44071 1727204752.63901: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204752.63904: Calling groups_plugins_play to load vars for managed-node2 44071 1727204752.64786: done sending task result for task 127b8e07-fff9-c964-7471-0000000028d3 44071 1727204752.64791: WORKER PROCESS EXITING 44071 1727204752.65961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 44071 1727204752.68062: done with get_vars() 44071 1727204752.68099: variable 'ansible_search_path' from source: unknown 44071 1727204752.68100: variable 'ansible_search_path' from source: unknown 44071 1727204752.68260: variable 'item' from source: include params 44071 1727204752.68303: we have included files to process 44071 1727204752.68304: generating all_blocks data 44071 1727204752.68306: done generating all_blocks data 44071 1727204752.68308: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44071 1727204752.68309: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44071 1727204752.68312: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 44071 1727204752.68519: done processing included file 44071 1727204752.68521: iterating over new_blocks loaded from include file 44071 1727204752.68523: in VariableManager get_vars() 44071 1727204752.68545: done with get_vars() 44071 1727204752.68548: filtering new block on tags 44071 1727204752.68579: done filtering new block on tags 44071 1727204752.68582: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 44071 1727204752.68588: extending task lists for all hosts with included blocks 44071 1727204752.68767: done extending task lists 44071 1727204752.68768: done processing included files 44071 1727204752.68769: results queue empty 44071 1727204752.68770: checking for any_errors_fatal 44071 1727204752.68775: done checking for any_errors_fatal 44071 1727204752.68776: checking for max_fail_percentage 44071 1727204752.68777: done checking for max_fail_percentage 44071 1727204752.68778: checking to see if all hosts have failed and the running result is not ok 44071 1727204752.68779: done checking to see if all hosts have failed 44071 1727204752.68779: getting the remaining hosts for this loop 44071 1727204752.68781: done getting the remaining hosts for this loop 44071 1727204752.68784: getting the next task for host managed-node2 44071 1727204752.68788: done getting next task for host managed-node2 44071 1727204752.68791: ^ task is: TASK: Get stat for interface {{ interface }} 44071 1727204752.68794: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204752.68796: getting variables 44071 1727204752.68797: in VariableManager get_vars() 44071 1727204752.68812: Calling all_inventory to load vars for managed-node2 44071 1727204752.68815: Calling groups_inventory to load vars for managed-node2 44071 1727204752.68817: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204752.68824: Calling all_plugins_play to load vars for managed-node2 44071 1727204752.68826: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204752.68829: Calling groups_plugins_play to load vars for managed-node2 44071 1727204752.78396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204752.80636: done with get_vars() 44071 1727204752.80679: done getting variables 44071 1727204752.80831: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:05:52 -0400 (0:00:00.188) 0:02:45.125 ***** 44071 1727204752.80863: entering _queue_task() for managed-node2/stat 44071 1727204752.81286: worker is 1 (out of 1 available) 44071 1727204752.81303: exiting _queue_task() for managed-node2/stat 44071 1727204752.81319: done queuing things up, now waiting for results queue to drain 44071 1727204752.81321: waiting for pending results... 44071 1727204752.81662: running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr 44071 1727204752.81863: in run() - task 127b8e07-fff9-c964-7471-000000002979 44071 1727204752.81890: variable 'ansible_search_path' from source: unknown 44071 1727204752.81904: variable 'ansible_search_path' from source: unknown 44071 1727204752.81952: calling self._execute() 44071 1727204752.82081: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204752.82095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204752.82168: variable 'omit' from source: magic vars 44071 1727204752.82573: variable 'ansible_distribution_major_version' from source: facts 44071 1727204752.82599: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204752.82616: variable 'omit' from source: magic vars 44071 1727204752.82686: variable 'omit' from source: magic vars 44071 1727204752.82817: variable 'interface' from source: play vars 44071 1727204752.82847: variable 'omit' from source: magic vars 44071 1727204752.82937: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204752.82954: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204752.82983: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204752.83008: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204752.83026: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204752.83151: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204752.83156: variable 'ansible_host' from 
source: host vars for 'managed-node2' 44071 1727204752.83159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204752.83206: Set connection var ansible_connection to ssh 44071 1727204752.83222: Set connection var ansible_timeout to 10 44071 1727204752.83234: Set connection var ansible_pipelining to False 44071 1727204752.83244: Set connection var ansible_shell_type to sh 44071 1727204752.83254: Set connection var ansible_shell_executable to /bin/sh 44071 1727204752.83268: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204752.83371: variable 'ansible_shell_executable' from source: unknown 44071 1727204752.83375: variable 'ansible_connection' from source: unknown 44071 1727204752.83380: variable 'ansible_module_compression' from source: unknown 44071 1727204752.83382: variable 'ansible_shell_type' from source: unknown 44071 1727204752.83385: variable 'ansible_shell_executable' from source: unknown 44071 1727204752.83387: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204752.83389: variable 'ansible_pipelining' from source: unknown 44071 1727204752.83391: variable 'ansible_timeout' from source: unknown 44071 1727204752.83393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204752.83581: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 44071 1727204752.83600: variable 'omit' from source: magic vars 44071 1727204752.83616: starting attempt loop 44071 1727204752.83726: running the handler 44071 1727204752.83729: _low_level_execute_command(): starting 44071 1727204752.83732: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204752.84498: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204752.84546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204752.84572: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204752.84588: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204752.84703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204752.86445: stdout chunk (state=3): >>>/root <<< 44071 1727204752.86585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204752.86633: stderr chunk (state=3): >>><<< 44071 1727204752.86635: stdout 
chunk (state=3): >>><<< 44071 1727204752.86651: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204752.86735: _low_level_execute_command(): starting 44071 1727204752.86739: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204752.8665667-53358-224868811283895 `" && echo ansible-tmp-1727204752.8665667-53358-224868811283895="` echo /root/.ansible/tmp/ansible-tmp-1727204752.8665667-53358-224868811283895 `" ) && sleep 0' 44071 1727204752.87165: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204752.87172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204752.87176: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204752.87187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204752.87235: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204752.87242: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204752.87244: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204752.87311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204752.89299: stdout chunk (state=3): >>>ansible-tmp-1727204752.8665667-53358-224868811283895=/root/.ansible/tmp/ansible-tmp-1727204752.8665667-53358-224868811283895 <<< 44071 1727204752.89407: stderr chunk (state=3): >>>debug2: Received exit status from master 
0 <<< 44071 1727204752.89471: stderr chunk (state=3): >>><<< 44071 1727204752.89475: stdout chunk (state=3): >>><<< 44071 1727204752.89494: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204752.8665667-53358-224868811283895=/root/.ansible/tmp/ansible-tmp-1727204752.8665667-53358-224868811283895 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204752.89542: variable 'ansible_module_compression' from source: unknown 44071 1727204752.89597: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 44071 1727204752.89634: variable 'ansible_facts' from source: unknown 44071 1727204752.89700: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204752.8665667-53358-224868811283895/AnsiballZ_stat.py 44071 1727204752.89814: Sending initial data 44071 1727204752.89817: Sent initial data (153 bytes) 44071 1727204752.90320: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204752.90324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204752.90327: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204752.90332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204752.90335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204752.90389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204752.90392: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204752.90473: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 44071 1727204752.92080: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204752.92187: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204752.92271: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp92jkuw91 /root/.ansible/tmp/ansible-tmp-1727204752.8665667-53358-224868811283895/AnsiballZ_stat.py <<< 44071 1727204752.92275: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204752.8665667-53358-224868811283895/AnsiballZ_stat.py" <<< 44071 1727204752.92334: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmp92jkuw91" to remote "/root/.ansible/tmp/ansible-tmp-1727204752.8665667-53358-224868811283895/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204752.8665667-53358-224868811283895/AnsiballZ_stat.py" <<< 44071 1727204752.93226: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204752.93422: stderr chunk (state=3): >>><<< 44071 1727204752.93425: stdout chunk (state=3): >>><<< 44071 1727204752.93427: done transferring module to remote 44071 1727204752.93432: _low_level_execute_command(): starting 44071 1727204752.93434: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204752.8665667-53358-224868811283895/ /root/.ansible/tmp/ansible-tmp-1727204752.8665667-53358-224868811283895/AnsiballZ_stat.py && sleep 0' 44071 1727204752.94108: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204752.94126: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204752.94193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204752.94261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' <<< 44071 1727204752.94304: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204752.94344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204752.94428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204752.96474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204752.96479: stdout chunk (state=3): >>><<< 44071 1727204752.96481: stderr chunk (state=3): >>><<< 44071 1727204752.96484: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204752.96487: _low_level_execute_command(): starting 44071 1727204752.96489: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204752.8665667-53358-224868811283895/AnsiballZ_stat.py && sleep 0' 44071 1727204752.97093: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204752.97122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204752.97135: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204752.97157: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204752.97261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204753.13741: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, 
"invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 44071 1727204753.15087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204753.15147: stderr chunk (state=3): >>><<< 44071 1727204753.15151: stdout chunk (state=3): >>><<< 44071 1727204753.15170: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
44071 1727204753.15196: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204752.8665667-53358-224868811283895/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204753.15208: _low_level_execute_command(): starting 44071 1727204753.15213: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204752.8665667-53358-224868811283895/ > /dev/null 2>&1 && sleep 0' 44071 1727204753.15977: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204753.15981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204753.15983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204753.15985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204753.15988: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204753.15990: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204753.15992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204753.15994: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204753.15996: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204753.15998: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44071 1727204753.16000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204753.16002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204753.16004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204753.16006: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204753.16008: stderr chunk (state=3): >>>debug2: match found <<< 44071 1727204753.16010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204753.16018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204753.16043: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204753.16050: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204753.16151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204753.18060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204753.18119: stderr chunk (state=3): >>><<< 44071 1727204753.18123: stdout chunk (state=3): >>><<< 44071 1727204753.18143: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204753.18149: handler run complete 44071 1727204753.18169: attempt loop complete, returning result 44071 1727204753.18172: _execute() done 44071 1727204753.18174: dumping result to json 44071 1727204753.18179: done dumping result, returning 44071 1727204753.18187: done running TaskExecutor() for managed-node2/TASK: Get stat for interface statebr [127b8e07-fff9-c964-7471-000000002979] 44071 1727204753.18192: sending task result for task 127b8e07-fff9-c964-7471-000000002979 44071 1727204753.18299: done sending task result for task 127b8e07-fff9-c964-7471-000000002979 44071 1727204753.18302: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 44071 1727204753.18371: no more pending results, returning what we have 44071 1727204753.18376: results queue empty 44071 1727204753.18377: checking for any_errors_fatal 44071 1727204753.18379: done checking for any_errors_fatal 44071 1727204753.18379: checking for max_fail_percentage 44071 1727204753.18381: done checking for max_fail_percentage 44071 1727204753.18382: checking to see if all hosts have failed and the running result is not ok 44071 1727204753.18383: done checking to see if all hosts have failed 44071 1727204753.18383: getting the remaining hosts for this loop 44071 1727204753.18385: done getting the remaining hosts for this loop 44071 1727204753.18390: getting the next task for host managed-node2 44071 1727204753.18402: done getting next task for host managed-node2 44071 1727204753.18406: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 44071 1727204753.18413: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204753.18420: getting variables 44071 1727204753.18422: in VariableManager get_vars() 44071 1727204753.18480: Calling all_inventory to load vars for managed-node2 44071 1727204753.18483: Calling groups_inventory to load vars for managed-node2 44071 1727204753.18486: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204753.18500: Calling all_plugins_play to load vars for managed-node2 44071 1727204753.18502: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204753.18505: Calling groups_plugins_play to load vars for managed-node2 44071 1727204753.19590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204753.20974: done with get_vars() 44071 1727204753.21001: done getting variables 44071 1727204753.21052: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204753.21158: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 15:05:53 -0400 (0:00:00.403) 0:02:45.528 ***** 44071 1727204753.21186: entering _queue_task() for managed-node2/assert 44071 1727204753.21500: worker is 1 (out of 1 available) 44071 1727204753.21517: exiting _queue_task() for managed-node2/assert 44071 1727204753.21531: done queuing things up, now waiting for results queue to drain 44071 1727204753.21533: waiting for pending results... 
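
The assert task queued above (tasks/assert_device_absent.yml:5) evaluates a single condition; the "Evaluated conditional (not interface_stat.stat.exists): True" line that follows pins down what the assertion must look like. A minimal sketch, assuming no msg or quiet options beyond what the log shows:

- name: Assert that the interface is absent - '{{ interface }}'
  assert:
    that:
      - not interface_stat.stat.exists            # condition exactly as evaluated in the log
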
44071 1727204753.21746: running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'statebr' 44071 1727204753.21859: in run() - task 127b8e07-fff9-c964-7471-0000000028d4 44071 1727204753.21876: variable 'ansible_search_path' from source: unknown 44071 1727204753.21880: variable 'ansible_search_path' from source: unknown 44071 1727204753.21915: calling self._execute() 44071 1727204753.22013: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204753.22020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204753.22030: variable 'omit' from source: magic vars 44071 1727204753.22375: variable 'ansible_distribution_major_version' from source: facts 44071 1727204753.22389: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204753.22394: variable 'omit' from source: magic vars 44071 1727204753.22441: variable 'omit' from source: magic vars 44071 1727204753.22530: variable 'interface' from source: play vars 44071 1727204753.22547: variable 'omit' from source: magic vars 44071 1727204753.22587: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204753.22617: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204753.22642: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204753.22658: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204753.22671: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204753.22696: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204753.22700: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204753.22703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204753.22788: Set connection var ansible_connection to ssh 44071 1727204753.22794: Set connection var ansible_timeout to 10 44071 1727204753.22800: Set connection var ansible_pipelining to False 44071 1727204753.22806: Set connection var ansible_shell_type to sh 44071 1727204753.22812: Set connection var ansible_shell_executable to /bin/sh 44071 1727204753.22819: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204753.22841: variable 'ansible_shell_executable' from source: unknown 44071 1727204753.22844: variable 'ansible_connection' from source: unknown 44071 1727204753.22847: variable 'ansible_module_compression' from source: unknown 44071 1727204753.22852: variable 'ansible_shell_type' from source: unknown 44071 1727204753.22855: variable 'ansible_shell_executable' from source: unknown 44071 1727204753.22857: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204753.22860: variable 'ansible_pipelining' from source: unknown 44071 1727204753.22863: variable 'ansible_timeout' from source: unknown 44071 1727204753.22865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204753.22989: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 44071 1727204753.22992: variable 'omit' from source: magic vars 44071 1727204753.22998: starting attempt loop 44071 1727204753.23001: running the handler 44071 1727204753.23125: variable 'interface_stat' from source: set_fact 44071 1727204753.23136: Evaluated conditional (not interface_stat.stat.exists): True 44071 1727204753.23143: handler run complete 44071 1727204753.23156: attempt loop complete, returning result 44071 1727204753.23159: _execute() done 44071 1727204753.23161: dumping result to json 44071 1727204753.23164: done dumping result, returning 44071 1727204753.23174: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'statebr' [127b8e07-fff9-c964-7471-0000000028d4] 44071 1727204753.23179: sending task result for task 127b8e07-fff9-c964-7471-0000000028d4 44071 1727204753.23277: done sending task result for task 127b8e07-fff9-c964-7471-0000000028d4 44071 1727204753.23280: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 44071 1727204753.23357: no more pending results, returning what we have 44071 1727204753.23361: results queue empty 44071 1727204753.23361: checking for any_errors_fatal 44071 1727204753.23378: done checking for any_errors_fatal 44071 1727204753.23379: checking for max_fail_percentage 44071 1727204753.23381: done checking for max_fail_percentage 44071 1727204753.23382: checking to see if all hosts have failed and the running result is not ok 44071 1727204753.23383: done checking to see if all hosts have failed 44071 1727204753.23384: getting the remaining hosts for this loop 44071 1727204753.23386: done getting the remaining hosts for this loop 44071 1727204753.23390: getting the next task for host managed-node2 44071 1727204753.23400: done getting next task for host managed-node2 44071 1727204753.23404: ^ task is: TASK: Success in test '{{ lsr_description }}' 44071 1727204753.23407: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204753.23410: getting variables 44071 1727204753.23412: in VariableManager get_vars() 44071 1727204753.23457: Calling all_inventory to load vars for managed-node2 44071 1727204753.23460: Calling groups_inventory to load vars for managed-node2 44071 1727204753.23464: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204753.23483: Calling all_plugins_play to load vars for managed-node2 44071 1727204753.23487: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204753.23489: Calling groups_plugins_play to load vars for managed-node2 44071 1727204753.24545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204753.25773: done with get_vars() 44071 1727204753.25804: done getting variables 44071 1727204753.25854: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 44071 1727204753.25955: variable 'lsr_description' from source: include params TASK [Success in test 'I will not get an error when I try to remove an absent profile'] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Tuesday 24 September 2024 15:05:53 -0400 (0:00:00.047) 0:02:45.576 ***** 44071 1727204753.25984: entering _queue_task() for managed-node2/debug 44071 1727204753.26289: worker is 1 (out of 1 available) 44071 1727204753.26304: exiting _queue_task() for managed-node2/debug 44071 1727204753.26319: done queuing things up, now waiting for results queue to drain 44071 1727204753.26321: waiting for pending results... 
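
The debug task queued above (tasks/run_test.yml:47) reports the test outcome; the rendered message "+++++ Success in test ... +++++" appears verbatim in the task result that follows, so the task body can be reconstructed fairly directly. A sketch, with the exact quoting treated as an assumption:

- name: Success in test '{{ lsr_description }}'
  debug:
    msg: "+++++ Success in test '{{ lsr_description }}' +++++"   # matches the MSG printed below
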
44071 1727204753.26537: running TaskExecutor() for managed-node2/TASK: Success in test 'I will not get an error when I try to remove an absent profile' 44071 1727204753.26629: in run() - task 127b8e07-fff9-c964-7471-0000000020b4 44071 1727204753.26646: variable 'ansible_search_path' from source: unknown 44071 1727204753.26650: variable 'ansible_search_path' from source: unknown 44071 1727204753.26689: calling self._execute() 44071 1727204753.26784: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204753.26790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204753.26800: variable 'omit' from source: magic vars 44071 1727204753.27149: variable 'ansible_distribution_major_version' from source: facts 44071 1727204753.27161: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204753.27169: variable 'omit' from source: magic vars 44071 1727204753.27208: variable 'omit' from source: magic vars 44071 1727204753.27298: variable 'lsr_description' from source: include params 44071 1727204753.27315: variable 'omit' from source: magic vars 44071 1727204753.27358: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204753.27391: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204753.27408: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204753.27425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204753.27440: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204753.27468: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204753.27471: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204753.27474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204753.27555: Set connection var ansible_connection to ssh 44071 1727204753.27561: Set connection var ansible_timeout to 10 44071 1727204753.27570: Set connection var ansible_pipelining to False 44071 1727204753.27575: Set connection var ansible_shell_type to sh 44071 1727204753.27580: Set connection var ansible_shell_executable to /bin/sh 44071 1727204753.27587: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204753.27607: variable 'ansible_shell_executable' from source: unknown 44071 1727204753.27611: variable 'ansible_connection' from source: unknown 44071 1727204753.27614: variable 'ansible_module_compression' from source: unknown 44071 1727204753.27617: variable 'ansible_shell_type' from source: unknown 44071 1727204753.27620: variable 'ansible_shell_executable' from source: unknown 44071 1727204753.27623: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204753.27625: variable 'ansible_pipelining' from source: unknown 44071 1727204753.27629: variable 'ansible_timeout' from source: unknown 44071 1727204753.27636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204753.27754: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204753.27767: variable 'omit' from source: magic vars 44071 1727204753.27770: starting attempt loop 44071 1727204753.27773: running the handler 44071 1727204753.27812: handler run complete 44071 1727204753.27824: attempt loop complete, returning result 44071 1727204753.27828: _execute() done 44071 1727204753.27831: dumping result to json 44071 1727204753.27836: done dumping result, returning 44071 1727204753.27844: done running TaskExecutor() for managed-node2/TASK: Success in test 'I will not get an error when I try to remove an absent profile' [127b8e07-fff9-c964-7471-0000000020b4] 44071 1727204753.27852: sending task result for task 127b8e07-fff9-c964-7471-0000000020b4 44071 1727204753.27948: done sending task result for task 127b8e07-fff9-c964-7471-0000000020b4 44071 1727204753.27951: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: +++++ Success in test 'I will not get an error when I try to remove an absent profile' +++++ 44071 1727204753.28010: no more pending results, returning what we have 44071 1727204753.28014: results queue empty 44071 1727204753.28014: checking for any_errors_fatal 44071 1727204753.28023: done checking for any_errors_fatal 44071 1727204753.28024: checking for max_fail_percentage 44071 1727204753.28026: done checking for max_fail_percentage 44071 1727204753.28027: checking to see if all hosts have failed and the running result is not ok 44071 1727204753.28028: done checking to see if all hosts have failed 44071 1727204753.28028: getting the remaining hosts for this loop 44071 1727204753.28030: done getting the remaining hosts for this loop 44071 1727204753.28035: getting the next task for host managed-node2 44071 1727204753.28046: done getting next task for host managed-node2 44071 1727204753.28049: ^ task is: TASK: Cleanup 44071 1727204753.28052: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204753.28058: getting variables 44071 1727204753.28059: in VariableManager get_vars() 44071 1727204753.28118: Calling all_inventory to load vars for managed-node2 44071 1727204753.28121: Calling groups_inventory to load vars for managed-node2 44071 1727204753.28125: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204753.28137: Calling all_plugins_play to load vars for managed-node2 44071 1727204753.28140: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204753.28143: Calling groups_plugins_play to load vars for managed-node2 44071 1727204753.29392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204753.30612: done with get_vars() 44071 1727204753.30644: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Tuesday 24 September 2024 15:05:53 -0400 (0:00:00.047) 0:02:45.623 ***** 44071 1727204753.30727: entering _queue_task() for managed-node2/include_tasks 44071 1727204753.31043: worker is 1 (out of 1 available) 44071 1727204753.31061: exiting _queue_task() for managed-node2/include_tasks 44071 1727204753.31078: done queuing things up, now waiting for results queue to drain 44071 1727204753.31080: waiting for pending results... 44071 1727204753.31305: running TaskExecutor() for managed-node2/TASK: Cleanup 44071 1727204753.31403: in run() - task 127b8e07-fff9-c964-7471-0000000020b8 44071 1727204753.31422: variable 'ansible_search_path' from source: unknown 44071 1727204753.31429: variable 'ansible_search_path' from source: unknown 44071 1727204753.31470: variable 'lsr_cleanup' from source: include params 44071 1727204753.31676: variable 'lsr_cleanup' from source: include params 44071 1727204753.31740: variable 'omit' from source: magic vars 44071 1727204753.31874: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204753.31882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204753.31894: variable 'omit' from source: magic vars 44071 1727204753.32114: variable 'ansible_distribution_major_version' from source: facts 44071 1727204753.32123: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204753.32129: variable 'item' from source: unknown 44071 1727204753.32190: variable 'item' from source: unknown 44071 1727204753.32219: variable 'item' from source: unknown 44071 1727204753.32269: variable 'item' from source: unknown 44071 1727204753.32421: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204753.32425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204753.32428: variable 'omit' from source: magic vars 44071 1727204753.32519: variable 'ansible_distribution_major_version' from source: facts 44071 1727204753.32523: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204753.32530: variable 'item' from source: unknown 44071 1727204753.32582: variable 'item' from source: unknown 44071 1727204753.32606: variable 'item' from source: unknown 44071 1727204753.32656: variable 'item' from source: unknown 44071 1727204753.32730: dumping result to json 44071 1727204753.32733: done dumping result, returning 44071 1727204753.32735: done running TaskExecutor() for managed-node2/TASK: Cleanup 
[127b8e07-fff9-c964-7471-0000000020b8] 44071 1727204753.32738: sending task result for task 127b8e07-fff9-c964-7471-0000000020b8 44071 1727204753.32781: done sending task result for task 127b8e07-fff9-c964-7471-0000000020b8 44071 1727204753.32784: WORKER PROCESS EXITING 44071 1727204753.32813: no more pending results, returning what we have 44071 1727204753.32818: in VariableManager get_vars() 44071 1727204753.32879: Calling all_inventory to load vars for managed-node2 44071 1727204753.32882: Calling groups_inventory to load vars for managed-node2 44071 1727204753.32885: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204753.32903: Calling all_plugins_play to load vars for managed-node2 44071 1727204753.32906: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204753.32909: Calling groups_plugins_play to load vars for managed-node2 44071 1727204753.34012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204753.35408: done with get_vars() 44071 1727204753.35433: variable 'ansible_search_path' from source: unknown 44071 1727204753.35434: variable 'ansible_search_path' from source: unknown 44071 1727204753.35473: variable 'ansible_search_path' from source: unknown 44071 1727204753.35474: variable 'ansible_search_path' from source: unknown 44071 1727204753.35496: we have included files to process 44071 1727204753.35497: generating all_blocks data 44071 1727204753.35499: done generating all_blocks data 44071 1727204753.35503: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 44071 1727204753.35504: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 44071 1727204753.35506: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 44071 1727204753.35663: done processing included file 44071 1727204753.35664: iterating over new_blocks loaded from include file 44071 1727204753.35667: in VariableManager get_vars() 44071 1727204753.35682: done with get_vars() 44071 1727204753.35683: filtering new block on tags 44071 1727204753.35705: done filtering new block on tags 44071 1727204753.35707: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed-node2 => (item=tasks/cleanup_profile+device.yml) 44071 1727204753.35711: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 44071 1727204753.35711: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 44071 1727204753.35714: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 44071 1727204753.35973: done processing included file 44071 1727204753.35975: iterating over new_blocks loaded from include file 44071 1727204753.35976: in VariableManager get_vars() 44071 1727204753.35989: done with get_vars() 44071 1727204753.35990: filtering new block on tags 44071 1727204753.36012: done filtering new block on tags 44071 1727204753.36014: done iterating over new_blocks loaded 
from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node2 => (item=tasks/check_network_dns.yml) 44071 1727204753.36018: extending task lists for all hosts with included blocks 44071 1727204753.37135: done extending task lists 44071 1727204753.37136: done processing included files 44071 1727204753.37137: results queue empty 44071 1727204753.37137: checking for any_errors_fatal 44071 1727204753.37141: done checking for any_errors_fatal 44071 1727204753.37142: checking for max_fail_percentage 44071 1727204753.37142: done checking for max_fail_percentage 44071 1727204753.37143: checking to see if all hosts have failed and the running result is not ok 44071 1727204753.37143: done checking to see if all hosts have failed 44071 1727204753.37144: getting the remaining hosts for this loop 44071 1727204753.37145: done getting the remaining hosts for this loop 44071 1727204753.37147: getting the next task for host managed-node2 44071 1727204753.37150: done getting next task for host managed-node2 44071 1727204753.37152: ^ task is: TASK: Cleanup profile and device 44071 1727204753.37155: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204753.37157: getting variables 44071 1727204753.37157: in VariableManager get_vars() 44071 1727204753.37172: Calling all_inventory to load vars for managed-node2 44071 1727204753.37179: Calling groups_inventory to load vars for managed-node2 44071 1727204753.37181: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204753.37187: Calling all_plugins_play to load vars for managed-node2 44071 1727204753.37189: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204753.37191: Calling groups_plugins_play to load vars for managed-node2 44071 1727204753.38093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204753.39322: done with get_vars() 44071 1727204753.39355: done getting variables 44071 1727204753.39403: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Tuesday 24 September 2024 15:05:53 -0400 (0:00:00.086) 0:02:45.710 ***** 44071 1727204753.39431: entering _queue_task() for managed-node2/shell 44071 1727204753.39754: worker is 1 (out of 1 available) 44071 1727204753.39772: exiting _queue_task() for managed-node2/shell 44071 1727204753.39787: done queuing things up, now waiting for results queue to drain 44071 1727204753.39788: waiting for pending results... 44071 1727204753.39995: running TaskExecutor() for managed-node2/TASK: Cleanup profile and device 44071 1727204753.40092: in run() - task 127b8e07-fff9-c964-7471-00000000299e 44071 1727204753.40106: variable 'ansible_search_path' from source: unknown 44071 1727204753.40111: variable 'ansible_search_path' from source: unknown 44071 1727204753.40147: calling self._execute() 44071 1727204753.40239: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204753.40250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204753.40260: variable 'omit' from source: magic vars 44071 1727204753.40604: variable 'ansible_distribution_major_version' from source: facts 44071 1727204753.40615: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204753.40622: variable 'omit' from source: magic vars 44071 1727204753.40663: variable 'omit' from source: magic vars 44071 1727204753.40785: variable 'interface' from source: play vars 44071 1727204753.40808: variable 'omit' from source: magic vars 44071 1727204753.40847: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204753.40880: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204753.40903: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204753.40918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204753.40930: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 
1727204753.40960: variable 'inventory_hostname' from source: host vars for 'managed-node2' 44071 1727204753.40963: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204753.40968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204753.41052: Set connection var ansible_connection to ssh 44071 1727204753.41057: Set connection var ansible_timeout to 10 44071 1727204753.41063: Set connection var ansible_pipelining to False 44071 1727204753.41070: Set connection var ansible_shell_type to sh 44071 1727204753.41076: Set connection var ansible_shell_executable to /bin/sh 44071 1727204753.41083: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204753.41103: variable 'ansible_shell_executable' from source: unknown 44071 1727204753.41107: variable 'ansible_connection' from source: unknown 44071 1727204753.41110: variable 'ansible_module_compression' from source: unknown 44071 1727204753.41115: variable 'ansible_shell_type' from source: unknown 44071 1727204753.41118: variable 'ansible_shell_executable' from source: unknown 44071 1727204753.41121: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204753.41123: variable 'ansible_pipelining' from source: unknown 44071 1727204753.41126: variable 'ansible_timeout' from source: unknown 44071 1727204753.41128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204753.41251: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204753.41261: variable 'omit' from source: magic vars 44071 1727204753.41268: starting attempt loop 44071 1727204753.41271: running the handler 44071 1727204753.41281: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204753.41297: _low_level_execute_command(): starting 44071 1727204753.41304: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204753.41872: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204753.41877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 44071 1727204753.41883: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204753.41895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
44071 1727204753.41937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204753.41941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204753.41953: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204753.42035: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204753.43768: stdout chunk (state=3): >>>/root <<< 44071 1727204753.44116: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204753.44121: stdout chunk (state=3): >>><<< 44071 1727204753.44124: stderr chunk (state=3): >>><<< 44071 1727204753.44133: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204753.44136: _low_level_execute_command(): starting 44071 1727204753.44140: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204753.4400208-53375-156072576508940 `" && echo ansible-tmp-1727204753.4400208-53375-156072576508940="` echo /root/.ansible/tmp/ansible-tmp-1727204753.4400208-53375-156072576508940 `" ) && sleep 0' 44071 1727204753.44799: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204753.44826: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204753.44855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204753.44879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204753.44898: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204753.44920: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204753.45034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204753.45062: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204753.45179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204753.55398: stdout chunk (state=3): >>>ansible-tmp-1727204753.4400208-53375-156072576508940=/root/.ansible/tmp/ansible-tmp-1727204753.4400208-53375-156072576508940 <<< 44071 1727204753.55619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204753.55623: stdout chunk (state=3): >>><<< 44071 1727204753.55626: stderr chunk (state=3): >>><<< 44071 1727204753.55631: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204753.4400208-53375-156072576508940=/root/.ansible/tmp/ansible-tmp-1727204753.4400208-53375-156072576508940 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204753.55634: variable 'ansible_module_compression' from source: unknown 44071 1727204753.55738: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44071 1727204753.55804: variable 'ansible_facts' from source: unknown 44071 1727204753.55917: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204753.4400208-53375-156072576508940/AnsiballZ_command.py 44071 1727204753.56130: Sending initial data 44071 1727204753.56134: Sent initial data (156 bytes) 44071 1727204753.57245: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204753.57273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204753.57294: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204753.57326: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204753.57425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204753.59040: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204753.59138: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 44071 1727204753.59216: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpvp93mo97 /root/.ansible/tmp/ansible-tmp-1727204753.4400208-53375-156072576508940/AnsiballZ_command.py <<< 44071 1727204753.59220: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204753.4400208-53375-156072576508940/AnsiballZ_command.py" <<< 44071 1727204753.59291: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpvp93mo97" to remote "/root/.ansible/tmp/ansible-tmp-1727204753.4400208-53375-156072576508940/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204753.4400208-53375-156072576508940/AnsiballZ_command.py" <<< 44071 1727204753.60202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204753.60255: stderr chunk (state=3): >>><<< 44071 1727204753.60274: stdout chunk (state=3): >>><<< 44071 1727204753.60316: done transferring module to remote 44071 1727204753.60337: _low_level_execute_command(): starting 44071 1727204753.60348: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204753.4400208-53375-156072576508940/ /root/.ansible/tmp/ansible-tmp-1727204753.4400208-53375-156072576508940/AnsiballZ_command.py && sleep 0' 44071 1727204753.61092: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204753.61191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204753.61224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204753.61244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204753.61276: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204753.61381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204753.63303: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204753.63307: stdout chunk (state=3): >>><<< 44071 1727204753.63310: stderr chunk (state=3): >>><<< 44071 1727204753.63327: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204753.63424: _low_level_execute_command(): starting 44071 1727204753.63428: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204753.4400208-53375-156072576508940/AnsiballZ_command.py && sleep 0' 44071 1727204753.64016: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204753.64033: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204753.64048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204753.64070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204753.64087: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204753.64097: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204753.64110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204753.64127: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204753.64139: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204753.64186: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204753.64248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204753.64269: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204753.64292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204753.64400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204753.84656: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCould not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-24 15:05:53.808089", "end": "2024-09-24 15:05:53.844794", "delta": "0:00:00.036705", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44071 1727204753.86479: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.47.73 closed. 
<<< 44071 1727204753.86485: stdout chunk (state=3): >>><<< 44071 1727204753.86487: stderr chunk (state=3): >>><<< 44071 1727204753.86517: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCould not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-24 15:05:53.808089", "end": "2024-09-24 15:05:53.844794", "delta": "0:00:00.036705", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.47.73 closed. 
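
The rc=1 result above belongs to the cleanup step from tasks/cleanup_profile+device.yml:3; the logged _raw_params give the full command list, and the "...ignoring" marker that follows in the callback output implies errors are tolerated. A hedged sketch of that task, with the templated interface name and ignore_errors treated as inferences from the log:

- name: Cleanup profile and device
  shell: |
    nmcli con delete {{ interface }}
    nmcli con load /etc/sysconfig/network-scripts/ifcfg-{{ interface }}
    rm -f /etc/sysconfig/network-scripts/ifcfg-{{ interface }}
    ip link del {{ interface }}
  ignore_errors: true    # inferred from the "...ignoring" marker after the fatal result below

Since the profile and device were already absent, every command fails, which is exactly why the errors are ignored here.
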
44071 1727204753.86679: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204753.4400208-53375-156072576508940/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204753.86688: _low_level_execute_command(): starting 44071 1727204753.86691: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204753.4400208-53375-156072576508940/ > /dev/null 2>&1 && sleep 0' 44071 1727204753.87270: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204753.87375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204753.87400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204753.87508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204753.89875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204753.89880: stdout chunk (state=3): >>><<< 44071 1727204753.89883: stderr chunk (state=3): >>><<< 44071 1727204753.89885: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204753.89888: handler run complete 44071 1727204753.89890: Evaluated conditional (False): False 44071 1727204753.89892: attempt loop complete, returning result 44071 1727204753.89894: _execute() done 44071 1727204753.89896: dumping result to json 44071 1727204753.89898: done dumping result, returning 44071 1727204753.89900: done running TaskExecutor() for managed-node2/TASK: Cleanup profile and device [127b8e07-fff9-c964-7471-00000000299e] 44071 1727204753.89902: sending task result for task 127b8e07-fff9-c964-7471-00000000299e 44071 1727204753.89986: done sending task result for task 127b8e07-fff9-c964-7471-00000000299e 44071 1727204753.89990: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.036705", "end": "2024-09-24 15:05:53.844794", "rc": 1, "start": "2024-09-24 15:05:53.808089" } STDERR: Error: unknown connection 'statebr'. Error: cannot delete unknown connection(s): 'statebr'. Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' Cannot find device "statebr" MSG: non-zero return code ...ignoring 44071 1727204753.90080: no more pending results, returning what we have 44071 1727204753.90085: results queue empty 44071 1727204753.90086: checking for any_errors_fatal 44071 1727204753.90088: done checking for any_errors_fatal 44071 1727204753.90089: checking for max_fail_percentage 44071 1727204753.90091: done checking for max_fail_percentage 44071 1727204753.90093: checking to see if all hosts have failed and the running result is not ok 44071 1727204753.90094: done checking to see if all hosts have failed 44071 1727204753.90094: getting the remaining hosts for this loop 44071 1727204753.90096: done getting the remaining hosts for this loop 44071 1727204753.90108: getting the next task for host managed-node2 44071 1727204753.90122: done getting next task for host managed-node2 44071 1727204753.90125: ^ task is: TASK: Check routes and DNS 44071 1727204753.90130: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204753.90136: getting variables 44071 1727204753.90138: in VariableManager get_vars() 44071 1727204753.90199: Calling all_inventory to load vars for managed-node2 44071 1727204753.90202: Calling groups_inventory to load vars for managed-node2 44071 1727204753.90206: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204753.90285: Calling all_plugins_play to load vars for managed-node2 44071 1727204753.90290: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204753.90294: Calling groups_plugins_play to load vars for managed-node2 44071 1727204753.92737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204753.94636: done with get_vars() 44071 1727204753.94670: done getting variables 44071 1727204753.94723: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 15:05:53 -0400 (0:00:00.553) 0:02:46.264 ***** 44071 1727204753.94753: entering _queue_task() for managed-node2/shell 44071 1727204753.95070: worker is 1 (out of 1 available) 44071 1727204753.95086: exiting _queue_task() for managed-node2/shell 44071 1727204753.95102: done queuing things up, now waiting for results queue to drain 44071 1727204753.95104: waiting for pending results... 44071 1727204753.95320: running TaskExecutor() for managed-node2/TASK: Check routes and DNS 44071 1727204753.95412: in run() - task 127b8e07-fff9-c964-7471-0000000029a2 44071 1727204753.95425: variable 'ansible_search_path' from source: unknown 44071 1727204753.95432: variable 'ansible_search_path' from source: unknown 44071 1727204753.95468: calling self._execute() 44071 1727204753.95559: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204753.95563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204753.95577: variable 'omit' from source: magic vars 44071 1727204753.95975: variable 'ansible_distribution_major_version' from source: facts 44071 1727204753.95992: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204753.95995: variable 'omit' from source: magic vars 44071 1727204753.96061: variable 'omit' from source: magic vars 44071 1727204753.96372: variable 'omit' from source: magic vars 44071 1727204753.96375: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 44071 1727204753.96378: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 44071 1727204753.96381: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 44071 1727204753.96384: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204753.96386: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 44071 1727204753.96388: variable 'inventory_hostname' from source: host vars for 'managed-node2' 
44071 1727204753.96390: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204753.96393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204753.96395: Set connection var ansible_connection to ssh 44071 1727204753.96398: Set connection var ansible_timeout to 10 44071 1727204753.96408: Set connection var ansible_pipelining to False 44071 1727204753.96412: Set connection var ansible_shell_type to sh 44071 1727204753.96418: Set connection var ansible_shell_executable to /bin/sh 44071 1727204753.96426: Set connection var ansible_module_compression to ZIP_DEFLATED 44071 1727204753.96460: variable 'ansible_shell_executable' from source: unknown 44071 1727204753.96464: variable 'ansible_connection' from source: unknown 44071 1727204753.96469: variable 'ansible_module_compression' from source: unknown 44071 1727204753.96475: variable 'ansible_shell_type' from source: unknown 44071 1727204753.96478: variable 'ansible_shell_executable' from source: unknown 44071 1727204753.96480: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204753.96486: variable 'ansible_pipelining' from source: unknown 44071 1727204753.96488: variable 'ansible_timeout' from source: unknown 44071 1727204753.96494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204753.96669: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204753.96687: variable 'omit' from source: magic vars 44071 1727204753.96693: starting attempt loop 44071 1727204753.96697: running the handler 44071 1727204753.96708: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 44071 1727204753.96734: _low_level_execute_command(): starting 44071 1727204753.96740: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 44071 1727204753.97510: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204753.97526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204753.97572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204753.97577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204753.97580: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 44071 1727204753.97583: stderr chunk (state=3): >>>debug2: match not found <<< 44071 1727204753.97586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204753.97600: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 44071 1727204753.97607: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 44071 1727204753.97616: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 44071 1727204753.97624: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 
1727204753.97637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204753.97650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204753.97682: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204753.97741: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204753.97763: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204753.97784: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204753.97893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204753.99721: stdout chunk (state=3): >>>/root <<< 44071 1727204753.99885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204753.99895: stderr chunk (state=3): >>><<< 44071 1727204753.99898: stdout chunk (state=3): >>><<< 44071 1727204753.99919: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204753.99938: _low_level_execute_command(): starting 44071 1727204753.99950: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204753.9991994-53400-176467362934578 `" && echo ansible-tmp-1727204753.9991994-53400-176467362934578="` echo /root/.ansible/tmp/ansible-tmp-1727204753.9991994-53400-176467362934578 `" ) && sleep 0' 44071 1727204754.00775: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204754.00780: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204754.00881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204754.02999: stdout chunk (state=3): >>>ansible-tmp-1727204753.9991994-53400-176467362934578=/root/.ansible/tmp/ansible-tmp-1727204753.9991994-53400-176467362934578 <<< 44071 1727204754.03105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204754.03173: stderr chunk (state=3): >>><<< 44071 1727204754.03176: stdout chunk (state=3): >>><<< 44071 1727204754.03196: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204753.9991994-53400-176467362934578=/root/.ansible/tmp/ansible-tmp-1727204753.9991994-53400-176467362934578 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204754.03233: variable 'ansible_module_compression' from source: unknown 44071 1727204754.03277: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-44071yrmxgf_o/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 44071 1727204754.03311: variable 'ansible_facts' from source: unknown 44071 1727204754.03373: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204753.9991994-53400-176467362934578/AnsiballZ_command.py 44071 1727204754.03492: Sending initial data 44071 1727204754.03496: Sent initial data (156 bytes) 44071 1727204754.04007: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204754.04012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204754.04015: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 44071 1727204754.04018: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204754.04069: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204754.04073: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204754.04075: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204754.04154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204754.05895: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 44071 1727204754.05959: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 44071 1727204754.06035: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpehx79x3m /root/.ansible/tmp/ansible-tmp-1727204753.9991994-53400-176467362934578/AnsiballZ_command.py <<< 44071 1727204754.06039: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204753.9991994-53400-176467362934578/AnsiballZ_command.py" <<< 44071 1727204754.06103: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-44071yrmxgf_o/tmpehx79x3m" to remote "/root/.ansible/tmp/ansible-tmp-1727204753.9991994-53400-176467362934578/AnsiballZ_command.py" <<< 44071 1727204754.06106: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204753.9991994-53400-176467362934578/AnsiballZ_command.py" <<< 44071 1727204754.06774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204754.06857: stderr chunk (state=3): >>><<< 44071 1727204754.06861: stdout chunk (state=3): >>><<< 44071 1727204754.06883: done transferring module to remote 44071 1727204754.06894: _low_level_execute_command(): starting 44071 1727204754.06899: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204753.9991994-53400-176467362934578/ /root/.ansible/tmp/ansible-tmp-1727204753.9991994-53400-176467362934578/AnsiballZ_command.py && sleep 0' 44071 1727204754.07408: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204754.07416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204754.07419: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204754.07421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204754.07475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204754.07490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204754.07556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204754.09462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204754.09523: stderr chunk (state=3): >>><<< 44071 1727204754.09526: stdout chunk (state=3): >>><<< 44071 1727204754.09545: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204754.09550: _low_level_execute_command(): starting 44071 1727204754.09553: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204753.9991994-53400-176467362934578/AnsiballZ_command.py && sleep 0' 44071 1727204754.10053: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 44071 1727204754.10059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204754.10062: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 44071 1727204754.10065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204754.10118: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204754.10122: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204754.10129: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204754.10207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204754.27785: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:f7:13:22:8f:c1 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.47.73/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 2761sec preferred_lft 2761sec\n inet6 fe80::f7:13ff:fe22:8fc1/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.47.73 metric 100 \n10.31.44.0/22 dev eth0 proto 
kernel scope link src 10.31.47.73 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:05:54.266137", "end": "2024-09-24 15:05:54.275939", "delta": "0:00:00.009802", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 44071 1727204754.29554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 44071 1727204754.29558: stdout chunk (state=3): >>><<< 44071 1727204754.29772: stderr chunk (state=3): >>><<< 44071 1727204754.29778: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:f7:13:22:8f:c1 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.47.73/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 2761sec preferred_lft 2761sec\n inet6 fe80::f7:13ff:fe22:8fc1/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.47.73 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.47.73 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. 
If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:05:54.266137", "end": "2024-09-24 15:05:54.275939", "delta": "0:00:00.009802", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
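The rc=0 result above is the "Check routes and DNS" diagnostic from check_network_dns.yml:6, which dumps interface, route and resolver state in a single pass. Pulling the shell payload out of the module arguments for readability (a sketch of the invocation recorded in this log, not necessarily the exact source file):

  - name: Check routes and DNS
    ansible.builtin.shell: |
      set -euo pipefail
      echo IP
      ip a
      echo IP ROUTE
      ip route
      echo IP -6 ROUTE
      ip -6 route
      echo RESOLV
      if [ -f /etc/resolv.conf ]; then
        cat /etc/resolv.conf
      else
        echo NO /etc/resolv.conf
        ls -alrtF /etc/resolv.* || :
      fi

Because of the set -euo pipefail guard, any failing ip or cat command would fail the task, so the ok: [managed-node2] result printed below confirms that addressing, routing and /etc/resolv.conf are all readable on the managed node.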
44071 1727204754.29787: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204753.9991994-53400-176467362934578/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 44071 1727204754.29790: _low_level_execute_command(): starting 44071 1727204754.29793: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204753.9991994-53400-176467362934578/ > /dev/null 2>&1 && sleep 0' 44071 1727204754.30448: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 44071 1727204754.30452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 44071 1727204754.30507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204754.30619: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 44071 1727204754.30661: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 44071 1727204754.30686: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 44071 1727204754.30705: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 44071 1727204754.30812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 44071 1727204754.32839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 44071 1727204754.32843: stdout chunk (state=3): >>><<< 44071 1727204754.32846: stderr chunk (state=3): >>><<< 44071 1727204754.32868: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 44071 1727204754.32990: handler run complete 44071 1727204754.32995: Evaluated conditional (False): False 44071 1727204754.32998: attempt loop complete, returning result 44071 1727204754.33001: _execute() done 44071 1727204754.33004: dumping result to json 44071 1727204754.33007: done dumping result, returning 44071 1727204754.33010: done running TaskExecutor() for managed-node2/TASK: Check routes and DNS [127b8e07-fff9-c964-7471-0000000029a2] 44071 1727204754.33012: sending task result for task 127b8e07-fff9-c964-7471-0000000029a2 44071 1727204754.33287: done sending task result for task 127b8e07-fff9-c964-7471-0000000029a2 44071 1727204754.33292: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009802", "end": "2024-09-24 15:05:54.275939", "rc": 0, "start": "2024-09-24 15:05:54.266137" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 02:f7:13:22:8f:c1 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.47.73/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0 valid_lft 2761sec preferred_lft 2761sec inet6 fe80::f7:13ff:fe22:8fc1/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.44.1 dev eth0 proto dhcp src 10.31.47.73 metric 100 10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.47.73 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8). # Do not edit. # # This file might be symlinked as /etc/resolv.conf. If you're looking at # /etc/resolv.conf and seeing this text, you have followed the symlink. # # This is a dynamic resolv.conf file for connecting local clients to the # internal DNS stub resolver of systemd-resolved. This file lists all # configured search domains. # # Run "resolvectl status" to see details about the uplink DNS servers # currently in use. # # Third party programs should typically not access this file directly, but only # through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a # different way, replace this symlink by a static file or a different symlink. # # See man:systemd-resolved.service(8) for details about the supported modes of # operation for /etc/resolv.conf. 
nameserver 127.0.0.53 options edns0 trust-ad search us-east-1.aws.redhat.com 44071 1727204754.33377: no more pending results, returning what we have 44071 1727204754.33381: results queue empty 44071 1727204754.33382: checking for any_errors_fatal 44071 1727204754.33389: done checking for any_errors_fatal 44071 1727204754.33390: checking for max_fail_percentage 44071 1727204754.33392: done checking for max_fail_percentage 44071 1727204754.33393: checking to see if all hosts have failed and the running result is not ok 44071 1727204754.33394: done checking to see if all hosts have failed 44071 1727204754.33394: getting the remaining hosts for this loop 44071 1727204754.33396: done getting the remaining hosts for this loop 44071 1727204754.33401: getting the next task for host managed-node2 44071 1727204754.33409: done getting next task for host managed-node2 44071 1727204754.33411: ^ task is: TASK: Verify DNS and network connectivity 44071 1727204754.33419: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204754.33423: getting variables 44071 1727204754.33425: in VariableManager get_vars() 44071 1727204754.33572: Calling all_inventory to load vars for managed-node2 44071 1727204754.33576: Calling groups_inventory to load vars for managed-node2 44071 1727204754.33579: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204754.33591: Calling all_plugins_play to load vars for managed-node2 44071 1727204754.33594: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204754.33596: Calling groups_plugins_play to load vars for managed-node2 44071 1727204754.34967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204754.36259: done with get_vars() 44071 1727204754.36301: done getting variables 44071 1727204754.36369: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 15:05:54 -0400 (0:00:00.416) 0:02:46.680 ***** 44071 1727204754.36407: entering _queue_task() for managed-node2/shell 44071 1727204754.36856: worker is 1 (out of 1 available) 44071 1727204754.37075: exiting _queue_task() for managed-node2/shell 44071 1727204754.37089: done queuing things up, now waiting for results queue to drain 44071 1727204754.37092: waiting for pending results... 44071 1727204754.37288: running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity 44071 1727204754.37432: in run() - task 127b8e07-fff9-c964-7471-0000000029a3 44071 1727204754.37472: variable 'ansible_search_path' from source: unknown 44071 1727204754.37476: variable 'ansible_search_path' from source: unknown 44071 1727204754.37499: calling self._execute() 44071 1727204754.37622: variable 'ansible_host' from source: host vars for 'managed-node2' 44071 1727204754.37646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 44071 1727204754.37756: variable 'omit' from source: magic vars 44071 1727204754.38087: variable 'ansible_distribution_major_version' from source: facts 44071 1727204754.38099: Evaluated conditional (ansible_distribution_major_version != '6'): True 44071 1727204754.38225: variable 'ansible_facts' from source: unknown 44071 1727204754.38859: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False 44071 1727204754.38864: when evaluation is False, skipping this task 44071 1727204754.38869: _execute() done 44071 1727204754.38872: dumping result to json 44071 1727204754.38874: done dumping result, returning 44071 1727204754.38877: done running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity [127b8e07-fff9-c964-7471-0000000029a3] 44071 1727204754.38883: sending task result for task 127b8e07-fff9-c964-7471-0000000029a3 44071 1727204754.38985: done sending task result for task 127b8e07-fff9-c964-7471-0000000029a3 44071 1727204754.38988: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"", "skip_reason": "Conditional result was False" } 44071 1727204754.39046: no more 
pending results, returning what we have 44071 1727204754.39050: results queue empty 44071 1727204754.39051: checking for any_errors_fatal 44071 1727204754.39063: done checking for any_errors_fatal 44071 1727204754.39063: checking for max_fail_percentage 44071 1727204754.39067: done checking for max_fail_percentage 44071 1727204754.39068: checking to see if all hosts have failed and the running result is not ok 44071 1727204754.39068: done checking to see if all hosts have failed 44071 1727204754.39069: getting the remaining hosts for this loop 44071 1727204754.39071: done getting the remaining hosts for this loop 44071 1727204754.39076: getting the next task for host managed-node2 44071 1727204754.39089: done getting next task for host managed-node2 44071 1727204754.39092: ^ task is: TASK: meta (flush_handlers) 44071 1727204754.39094: ^ state is: HOST STATE: block=9, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204754.39106: getting variables 44071 1727204754.39107: in VariableManager get_vars() 44071 1727204754.39159: Calling all_inventory to load vars for managed-node2 44071 1727204754.39162: Calling groups_inventory to load vars for managed-node2 44071 1727204754.39168: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204754.39183: Calling all_plugins_play to load vars for managed-node2 44071 1727204754.39186: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204754.39188: Calling groups_plugins_play to load vars for managed-node2 44071 1727204754.40698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204754.42847: done with get_vars() 44071 1727204754.42881: done getting variables 44071 1727204754.42949: in VariableManager get_vars() 44071 1727204754.42962: Calling all_inventory to load vars for managed-node2 44071 1727204754.42964: Calling groups_inventory to load vars for managed-node2 44071 1727204754.42968: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204754.42972: Calling all_plugins_play to load vars for managed-node2 44071 1727204754.42974: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204754.42977: Calling groups_plugins_play to load vars for managed-node2 44071 1727204754.43869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204754.45497: done with get_vars() 44071 1727204754.45557: done queuing things up, now waiting for results queue to drain 44071 1727204754.45560: results queue empty 44071 1727204754.45561: checking for any_errors_fatal 44071 1727204754.45564: done checking for any_errors_fatal 44071 1727204754.45566: checking for max_fail_percentage 44071 1727204754.45568: done checking for max_fail_percentage 44071 1727204754.45569: checking to see if all hosts have failed and the running result is not ok 44071 1727204754.45569: done checking to see if all hosts have failed 44071 1727204754.45570: getting the remaining hosts for this loop 44071 1727204754.45571: done getting the remaining hosts for this loop 44071 1727204754.45575: getting the next task for host managed-node2 44071 1727204754.45579: done getting next task for host managed-node2 44071 1727204754.45580: ^ task is: TASK: meta (flush_handlers) 44071 
1727204754.45597: ^ state is: HOST STATE: block=10, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 44071 1727204754.45609: getting variables 44071 1727204754.45610: in VariableManager get_vars() 44071 1727204754.45644: Calling all_inventory to load vars for managed-node2 44071 1727204754.45646: Calling groups_inventory to load vars for managed-node2 44071 1727204754.45648: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204754.45653: Calling all_plugins_play to load vars for managed-node2 44071 1727204754.45655: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204754.45657: Calling groups_plugins_play to load vars for managed-node2 44071 1727204754.47192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204754.49349: done with get_vars() 44071 1727204754.49396: done getting variables 44071 1727204754.49455: in VariableManager get_vars() 44071 1727204754.49475: Calling all_inventory to load vars for managed-node2 44071 1727204754.49478: Calling groups_inventory to load vars for managed-node2 44071 1727204754.49481: Calling all_plugins_inventory to load vars for managed-node2 44071 1727204754.49487: Calling all_plugins_play to load vars for managed-node2 44071 1727204754.49489: Calling groups_plugins_inventory to load vars for managed-node2 44071 1727204754.49492: Calling groups_plugins_play to load vars for managed-node2 44071 1727204754.51071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 44071 1727204754.52404: done with get_vars() 44071 1727204754.52436: done queuing things up, now waiting for results queue to drain 44071 1727204754.52438: results queue empty 44071 1727204754.52439: checking for any_errors_fatal 44071 1727204754.52440: done checking for any_errors_fatal 44071 1727204754.52440: checking for max_fail_percentage 44071 1727204754.52442: done checking for max_fail_percentage 44071 1727204754.52442: checking to see if all hosts have failed and the running result is not ok 44071 1727204754.52443: done checking to see if all hosts have failed 44071 1727204754.52443: getting the remaining hosts for this loop 44071 1727204754.52444: done getting the remaining hosts for this loop 44071 1727204754.52453: getting the next task for host managed-node2 44071 1727204754.52457: done getting next task for host managed-node2 44071 1727204754.52458: ^ task is: None 44071 1727204754.52459: ^ state is: HOST STATE: block=11, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 44071 1727204754.52461: done queuing things up, now waiting for results queue to drain 44071 1727204754.52461: results queue empty 44071 1727204754.52462: checking for any_errors_fatal 44071 1727204754.52462: done checking for any_errors_fatal 44071 1727204754.52463: checking for max_fail_percentage 44071 1727204754.52463: done checking for max_fail_percentage 44071 1727204754.52464: checking to see if all hosts have failed and the running result is not ok 44071 1727204754.52464: done checking to see if all hosts have failed 44071 1727204754.52469: getting the next task for host managed-node2 44071 1727204754.52471: done getting next task for host managed-node2 44071 1727204754.52472: ^ task is: None 44071 1727204754.52473: ^ state is: HOST STATE: block=11, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False PLAY RECAP ********************************************************************* managed-node2 : ok=333 changed=10 unreachable=0 failed=0 skipped=313 rescued=0 ignored=9 Tuesday 24 September 2024 15:05:54 -0400 (0:00:00.161) 0:02:46.842 ***** =============================================================================== fedora.linux_system_roles.network : Check which services are running ---- 2.74s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.73s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.72s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.64s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.64s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.61s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.61s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.57s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.56s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.53s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.53s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.50s 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.50s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.50s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.49s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which packages are installed --- 1.71s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 fedora.linux_system_roles.network : Check which packages are installed --- 1.56s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Gathering Facts --------------------------------------------------------- 1.44s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:6 fedora.linux_system_roles.network : Check which packages are installed --- 1.39s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.34s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 44071 1727204754.52740: RUNNING CLEANUP
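The companion task "Verify DNS and network connectivity" (check_network_dns.yml:24) never runs on this host: the log shows the conditional ansible_facts["distribution"] == "CentOS" evaluating to False, so the task is skipped. Its command body is not visible in this run, so only the gating pattern can be reconstructed; as a hedged sketch with a placeholder body (the != '6' check may be inherited from the surrounding play rather than written on the task itself):

  - name: Verify DNS and network connectivity
    ansible.builtin.shell: "{{ verify_dns_commands }}"   # placeholder; the real command is not shown in this log
    when:
      - ansible_distribution_major_version != '6'
      - ansible_facts["distribution"] == "CentOS"

With that task skipped, the play ends cleanly: the PLAY RECAP above reports failed=0 with nine ignored failures, and the timing summary shows the repeated "Check which services are running" scan dominating the runtime at roughly 2.5-2.7 s per invocation.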